# Quick Start
Deploy your first AI API in 5 minutes
## Before You Begin
Make sure you have:
- OnglX Deploy CLI installed
- AWS credentials configured
- AWS Bedrock model access enabled
## Step-by-Step Tutorial
Create a new OnglX Deploy project and configure your AWS settings.
```
$ onglx-deploy init --host aws --region us-east-1
✓ Initialized deployment: my-ai-project
✓ Provider: aws
✓ Region: us-east-1
✓ Config: deploy.yml

Your project is ready! Next, add some components.
```
This creates a `deploy.yml` configuration file in your current directory.
Add an OpenAI-compatible API that connects to AWS Bedrock models.
```
$ onglx-deploy add api
✓ Added api component with OpenAI compatibility
✓ Configured AWS Bedrock access
✓ Set up automatic scaling

Component details:
  Type: api
  Runtime: AWS Lambda
  Models: All available Bedrock models
  Authentication: API key based
```
Pro tip: You can add multiple components to the same deployment.
Check what resources will be created and estimated costs before deploying.
```
$ onglx-deploy plan
Deployment Plan for aws in us-east-1
====================================
Resources to be created:
+ AWS Lambda Function (api-handler)
+ API Gateway REST API (api-gateway)
+ IAM Role (lambda-execution-role)
+ CloudWatch Log Group (api-logs)

Estimated monthly cost: $5.25
• Lambda compute: $3.50
• API Gateway: $1.50
• CloudWatch logs: $0.25

Run 'onglx-deploy deploy' to proceed.
```
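The plan's estimate is simply the sum of its line items; a quick sanity check with the amounts shown above (actual costs will vary with traffic):

```python
# Line items from the plan output above, in USD per month.
costs = {"lambda_compute": 3.50, "api_gateway": 1.50, "cloudwatch_logs": 0.25}

estimated_total = round(sum(costs.values()), 2)
print(estimated_total)  # 5.25
```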
Deploy your AI infrastructure to AWS. This typically takes 2-3 minutes.
```
$ onglx-deploy deploy
Deploying to aws in us-east-1
=============================
✓ Creating Lambda function
✓ Setting up API Gateway
✓ Configuring Bedrock access
✓ Setting up CloudWatch logging
✓ Creating IAM roles and policies

Deployment completed successfully!

Endpoints:
  API: https://abc123xyz.execute-api.us-east-1.amazonaws.com/
  API Key: sk-onglx-abc123xyz...

Save this key securely - it won't be shown again.
```
Important: Save your API key securely. You'll need it to make requests to your API.
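One common way to keep the key out of your source code is to read it from an environment variable. A minimal sketch — `ONGLX_API_KEY` is a hypothetical variable name, not something the CLI sets for you:

```python
import os

def load_api_key(var: str = "ONGLX_API_KEY") -> str:
    """Read the API key from the environment.

    ONGLX_API_KEY is a hypothetical variable name; use whatever
    fits your secret-management setup.
    """
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set {var} before calling the API")
    return key
```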
Make your first API call to verify everything is working correctly.
```bash
curl -X POST https://your-api-endpoint.amazonaws.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-onglx-abc123xyz..." \
  -d '{
    "model": "anthropic.claude-3-sonnet-20240229-v1:0",
    "messages": [
      {"role": "user", "content": "Hello! How are you?"}
    ]
  }'
```
Expected response:
```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1640995200,
  "model": "anthropic.claude-3-sonnet-20240229-v1:0",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! I'm doing well, thank you for asking. How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 20,
    "total_tokens": 32
  }
}
```
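Because the body follows the standard OpenAI `chat.completion` shape, plain JSON parsing is enough to pull out the reply text and token usage. A small sketch using an abbreviated version of the fields shown above:

```python
import json

# Abbreviated chat.completion body, matching the fields shown above.
SAMPLE = """{
  "choices": [
    {"index": 0,
     "message": {"role": "assistant", "content": "Hello!"},
     "finish_reason": "stop"}
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 20, "total_tokens": 32}
}"""

def extract_reply(body: str):
    """Return (assistant text, total tokens) from a chat.completion body."""
    data = json.loads(body)
    return data["choices"][0]["message"]["content"], data["usage"]["total_tokens"]

text, tokens = extract_reply(SAMPLE)  # ("Hello!", 32)
```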
## Optional: Add a UI Component
Want a chat interface? Add OpenWebUI to your deployment:
```
$ onglx-deploy add ui
✓ Added ui component with OpenWebUI
✓ Configured automatic API discovery
✓ Set up user authentication

$ onglx-deploy deploy
✓ Deployed UI interface: https://ui.your-domain.com

Visit your UI to create user accounts and start chatting!
```
## Managing Your Deployment
```
$ onglx-deploy status
Components:
[✓] api/openai: Healthy
[✓] ui/openwebui: Healthy

Endpoints:
  API: https://api.example.com
  UI: https://ui.example.com
```
```
$ onglx-deploy logs --follow
[api] 2024-01-01T12:00:00Z INFO Request received
[api] 2024-01-01T12:00:01Z INFO Response sent
[ui] 2024-01-01T12:00:02Z INFO User logged in
```
## Using Your API
### With OpenAI Python SDK
```python
import openai

client = openai.OpenAI(
    api_key="sk-onglx-abc123xyz...",
    base_url="https://your-api-endpoint.amazonaws.com"
)

response = client.chat.completions.create(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[
        {"role": "user", "content": "Write a haiku about AI"}
    ]
)

print(response.choices[0].message.content)
```
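The OpenAI SDK also supports streaming (`stream=True`), in which case the client yields chunks whose `choices[0].delta.content` carries incremental text. A sketch of accumulating those deltas — the `SimpleNamespace` chunks below simulate a streamed response so the helper runs without a live endpoint:

```python
from types import SimpleNamespace

def collect_stream(chunks) -> str:
    """Join the incremental delta.content fields of streamed chunks."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk's content is typically None
            parts.append(delta)
    return "".join(parts)

# Simulated chunks standing in for
# client.chat.completions.create(..., stream=True)
fake_chunks = [
    SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=c))])
    for c in ("Hel", "lo!", None)
]
print(collect_stream(fake_chunks))  # Hello!
```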
## Available Models
Your API supports all AWS Bedrock foundation models:
### Anthropic Claude 3
- `claude-3-haiku-*`
- `claude-3-sonnet-*`
- `claude-3-opus-*`

### Amazon Titan
- `titan-text-express-v1`
- `titan-embed-text-v1`

### Meta Llama
- `llama2-70b-chat-v1`
- `code-llama-*`
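The wildcard suffixes above are shell-style patterns, so a client can gate model choices with Python's `fnmatch`. An illustrative helper — the family list is copied from above, and vendor prefixes like `anthropic.` in full Bedrock model IDs are assumed and stripped:

```python
from fnmatch import fnmatch

# Model families listed above; '*' patterns cover dated variants.
FAMILIES = [
    "claude-3-haiku-*", "claude-3-sonnet-*", "claude-3-opus-*",
    "titan-text-express-v1", "titan-embed-text-v1",
    "llama2-70b-chat-v1", "code-llama-*",
]

def is_supported(model_id: str) -> bool:
    """True if model_id falls into one of the families above.

    Full Bedrock IDs carry a vendor prefix (e.g. 'anthropic.'),
    which is stripped before matching.
    """
    name = model_id.split(".", 1)[-1]
    return any(fnmatch(name, pattern) for pattern in FAMILIES)
```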
## Clean Up (Optional)
When you're done experimenting, you can destroy your deployment to avoid charges:
```
$ onglx-deploy destroy
⚠️ This will destroy all resources created by this deployment.
This action cannot be undone!
Do you really want to destroy? Type 'yes' to confirm: yes

Destroying infrastructure in aws (us-east-1)
✓ Infrastructure destroyed successfully
```
## 🎉 Congratulations!
You've successfully deployed your first AI API with OnglX Deploy! Your API is now running on AWS with OpenAI compatibility and automatic scaling.
What you've accomplished:
- Deployed an AWS Lambda-based API
- Set up AWS Bedrock integration
- Created OpenAI-compatible endpoints
- Configured automatic scaling
- Established secure authentication
Next steps:
- Explore advanced features
- Learn more CLI commands
- Read the troubleshooting guide
- Build amazing AI applications!