UI Components
Deploy modern AI chat interfaces
Overview
UI components provide modern web interfaces for interacting with your AI models. OnglX Deploy supports OpenWebUI, a feature-rich chat interface that works seamlessly with your API components.
The UI automatically connects to your deployed API endpoints and provides a polished user experience for chat, model management, and user authentication.
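Behind the scenes, OpenWebUI talks to your API component over an OpenAI-compatible interface (the same one labeled api/openai in the status output further down). As a rough sketch of what that traffic looks like, the snippet below sends a chat request directly to the API; the endpoint URL, API key, and exact response shape are illustrative assumptions, not values emitted by onglx-deploy:

# Illustrative only: the kind of OpenAI-compatible request the UI issues
# against your deployed API component. URL and key are placeholders.
import requests

API_BASE = "https://api.example.com/v1"   # your deployed API endpoint
API_KEY = "sk-your-api-key"               # key the UI manages on your behalf

response = requests.post(
    f"{API_BASE}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "anthropic.claude-3-sonnet-20240229-v1:0",
        "messages": [{"role": "user", "content": "Hello!"}],
        "temperature": 0.7,  # one of the parameters the UI lets you tune
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])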
Adding a UI Component
Add a UI component to your deployment:
$ onglx-deploy add ui
✓ Added ui component with OpenWebUI
✓ Configured automatic API discovery
✓ Set up user authentication
✓ Enabled container scaling

Component details:
  Type: ui
  Interface: OpenWebUI
  Runtime: AWS ECS Fargate
  Authentication: Built-in user management
  Storage: EFS for persistent data
Features
- Modern chat UI with dark/light themes
- Conversation history and persistence
- File upload and document chat
- Model switching and configuration
- Built-in user authentication
- Role-based access control
- User registration and profiles
- Session management
- Secure API key management
- Data encryption at rest
- Private conversation storage
- HTTPS and security headers
- Automatic model discovery
- Custom model configuration
- Temperature and parameter tuning
- Model usage analytics
Deployment Architecture
UI components run on AWS ECS Fargate with the following setup:
Compute
- AWS ECS Fargate containers
- Auto-scaling based on demand
- Load balancer with SSL termination
- Health checks and rolling updates
Storage
- Amazon EFS for persistent data
- User conversations and files
- Configuration and preferences
- Automatic backups
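If you want to confirm this infrastructure from the AWS side, a short boto3 sketch can check that the Fargate service and EFS file system are healthy. The cluster, service, and region names below are assumptions for illustration; use the identifiers your deployment actually reports:

# Illustrative sketch: inspecting the provisioned infrastructure directly.
# The cluster and service names are hypothetical, not onglx-deploy output.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")
efs = boto3.client("efs", region_name="us-east-1")

service = ecs.describe_services(
    cluster="onglx-ui-cluster",          # hypothetical cluster name
    services=["onglx-ui-openwebui"],     # hypothetical service name
)["services"][0]
print("Running tasks:", service["runningCount"], "of", service["desiredCount"])

for fs in efs.describe_file_systems()["FileSystems"]:
    print("EFS:", fs["FileSystemId"], fs["LifeCycleState"])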
Configuration
The UI component automatically configures itself to work with your API components:
# Your deploy.yml configuration
components:
  api:
    type: api
    models: ["anthropic.claude-3-sonnet-20240229-v1:0"]

  ui:
    type: ui
    interface: openwebui
    # Automatically discovers API endpoint
    # Sets up user authentication
    # Configures model access
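Model discovery in the UI mirrors whatever your API component serves, so you can verify the configured models by querying the API directly. The sketch below assumes an OpenAI-compatible /v1/models route and uses placeholder endpoint and key values:

# Illustrative check of what "automatic model discovery" will find:
# list the models your API component currently serves.
import requests

API_BASE = "https://api.example.com/v1"   # placeholder endpoint
API_KEY = "sk-your-api-key"               # placeholder key

models = requests.get(
    f"{API_BASE}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
).json()

for model in models.get("data", []):
    print(model["id"])   # e.g. anthropic.claude-3-sonnet-20240229-v1:0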
Usage Examples
After deployment
$ onglx-deploy deploy
✓ Deployed API endpoint: https://api.example.com
✓ Deployed UI interface: https://ui.example.com

Visit https://ui.example.com to:
• Create user accounts
• Start chatting with AI models
• Upload documents for analysis
• Manage conversation history
Accessing your interface
- URL: Your UI is available at the endpoint shown after deployment.
- First-time setup: Create an admin account on your first visit.
- Model access: Every model configured on your API component is automatically available in the UI.
Monitoring & Logs
Monitor your UI component with built-in tools:
# Check UI component status
$ onglx-deploy status

Components:
[✓] ui/openwebui: Healthy
[✓] api/openai: Healthy

# View UI component logs
$ onglx-deploy logs ui
[ui] Starting OpenWebUI server
[ui] Connected to API endpoint
[ui] User authentication enabled
[ui] Ready to serve requests on port 8080
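For anything the CLI output does not cover, the container logs also land in CloudWatch Logs, which is the usual arrangement for Fargate tasks. A rough boto3 sketch, with a hypothetical log group name, looks like this:

# Illustrative sketch for digging deeper than "onglx-deploy logs ui".
# The log group name is an assumption; check your stack for the real one.
import boto3

logs = boto3.client("logs", region_name="us-east-1")

events = logs.filter_log_events(
    logGroupName="/ecs/onglx-ui-openwebui",  # hypothetical log group
    limit=20,
)
for event in events["events"]:
    print(event["message"].rstrip())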
🎉 Ready to Use!
- Deploy your UI: onglx-deploy deploy
- Access your interface at the provided URL
- Create user accounts and start chatting
- If anything goes wrong, see the troubleshooting guide