AI Inference Guide
Deploy AI APIs and interfaces with OnglX Deploy
Overview
OnglX Deploy simplifies deploying AI inference APIs and user interfaces backed by AWS Bedrock. You can deploy OpenAI-compatible APIs and modern web interfaces such as OpenWebUI with just a few commands.
The platform handles all the infrastructure complexity, letting you focus on building great AI experiences.
Available Components
API Components
Deploy OpenAI-compatible inference APIs that connect to AWS Bedrock models.
- OpenAI API compatibility
- AWS Bedrock integration
- Automatic scaling
- Built-in authentication
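Because the API is OpenAI-compatible, clients send the standard chat-completions request body. A minimal sketch of that shape, using only the standard library (the model ID shown is an illustrative Bedrock identifier, not a value confirmed by this guide):

```python
import json

def build_chat_payload(model, user_message, max_tokens=256):
    """Return the JSON body for a POST to /v1/chat/completions."""
    return {
        "model": model,  # Bedrock model identifier
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
    }

# Illustrative model ID; check your Bedrock region's model list.
payload = build_chat_payload("anthropic.claude-3-haiku-20240307-v1:0", "Hello!")
print(json.dumps(payload, indent=2))
```

Any existing OpenAI client library should accept the same payload unchanged once pointed at the deployed endpoint.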
UI Components
Deploy modern chat interfaces and AI playgrounds for your users.
- OpenWebUI interface
- Chat playground
- Model management
- User authentication
Quick Start
Deploy both API and UI components in minutes:
Terminal - OnglX Deploy

```shell
# Initialize your project
$ onglx-deploy init --host aws --region us-east-1

# Add API component
$ onglx-deploy add api
✓ Added api component with OpenAI compatibility

# Add UI component
$ onglx-deploy add ui
✓ Added ui component with OpenWebUI

# Deploy everything
$ onglx-deploy deploy
✓ Deployed API endpoint: https://api.example.com
✓ Deployed UI interface: https://ui.example.com
```
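Once deployed, the endpoint behaves like any OpenAI-compatible server. A stdlib-only sketch that builds (but does not send) an authenticated request against the deployed endpoint; the base URL is the placeholder from the transcript above, and the API key and model ID are illustrative:

```python
import json
import urllib.request

API_BASE = "https://api.example.com"  # endpoint printed by `onglx-deploy deploy`
API_KEY = "YOUR_API_KEY"              # placeholder; substitute your real key

def build_request(prompt):
    """Build an authenticated chat-completions request (not yet sent)."""
    body = json.dumps({
        "model": "anthropic.claude-3-haiku-20240307-v1:0",  # illustrative ID
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{API_BASE}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Hello!")
print(req.full_url)
# To actually send it: urllib.request.urlopen(req)
```

Separating request construction from sending keeps the sketch runnable without network access; in practice you would pass the request straight to `urllib.request.urlopen` or use an OpenAI SDK with a custom base URL.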
Model Support
OnglX Deploy supports AWS Bedrock foundation models, including:
Anthropic
- Claude 3 Haiku
- Claude 3 Sonnet
- Claude 3 Opus
Amazon
- Titan Text
- Titan Embeddings
Meta
- Llama 2
- Code Llama
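Clients address these models through Bedrock model identifiers rather than the friendly names above. A sketch of such a mapping, assuming Bedrock's published naming scheme; the exact IDs and per-region availability should be verified against the AWS Bedrock console, and some models (e.g. Code Llama) are omitted here because their identifiers vary:

```python
# Assumed friendly-name -> Bedrock model ID mapping (illustrative subset).
# Verify exact IDs and availability in your AWS region before use.
BEDROCK_MODEL_IDS = {
    "Claude 3 Haiku": "anthropic.claude-3-haiku-20240307-v1:0",
    "Claude 3 Sonnet": "anthropic.claude-3-sonnet-20240229-v1:0",
    "Claude 3 Opus": "anthropic.claude-3-opus-20240229-v1:0",
    "Titan Text": "amazon.titan-text-express-v1",
    "Titan Embeddings": "amazon.titan-embed-text-v1",
    "Llama 2": "meta.llama2-70b-chat-v1",
}

def model_id(name):
    """Look up the Bedrock ID for a friendly model name."""
    return BEDROCK_MODEL_IDS[name]

print(model_id("Claude 3 Haiku"))
```

Whatever ID you resolve here is the value that goes in the `model` field of an OpenAI-style request.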