Quick Start

Deploy your first AI API in 5 minutes


Before You Begin

Make sure you have:

  • The onglx-deploy CLI installed
  • An AWS account with credentials configured locally
  • Access to AWS Bedrock models in your target region

Step-by-Step Tutorial

Step 1: Initialize Your Project

Create a new OnglX Deploy project and configure your AWS settings.

Terminal - OnglX Deploy
$ onglx-deploy init --host aws --region us-east-1
✓ Initialized deployment: my-ai-project
✓ Provider: aws
✓ Region: us-east-1
✓ Config: deploy.yml
Your project is ready! Next, add some components.

This creates a deploy.yml configuration file in your current directory.

Step 2: Add an API Component

Add an OpenAI-compatible API that connects to AWS Bedrock models.

Terminal - OnglX Deploy
$ onglx-deploy add api
✓ Added api component with OpenAI compatibility
✓ Configured AWS Bedrock access
✓ Set up automatic scaling
Component details:
Type: api
Runtime: AWS Lambda
Models: All available Bedrock models
Authentication: API key based

Pro tip: You can add multiple components to the same deployment.

Step 3: Review the Deployment Plan

Check what resources will be created and estimated costs before deploying.

Terminal - OnglX Deploy
$ onglx-deploy plan
Deployment Plan for aws in us-east-1
===================================
Resources to be created:
+ AWS Lambda Function (api-handler)
+ API Gateway REST API (api-gateway)
+ IAM Role (lambda-execution-role)
+ CloudWatch Log Group (api-logs)
Estimated monthly cost: $5.25
• Lambda compute: $3.50
• API Gateway: $1.50
• CloudWatch logs: $0.25
Run 'onglx-deploy deploy' to proceed.
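The total at the bottom of the plan is just the sum of the per-service line items, which is easy to sanity-check. The figures below are the illustrative values from the example plan above, not real AWS pricing:

```python
# Sanity-check the example plan's monthly estimate by summing its line items.
# These figures come from the sample plan output, not real AWS pricing.
line_items = {
    "Lambda compute": 3.50,
    "API Gateway": 1.50,
    "CloudWatch logs": 0.25,
}

estimated_monthly_cost = round(sum(line_items.values()), 2)
print(f"Estimated monthly cost: ${estimated_monthly_cost:.2f}")  # $5.25
```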
Step 4: Deploy to AWS

Deploy your AI infrastructure to AWS. This typically takes 2-3 minutes.

Terminal - OnglX Deploy
$ onglx-deploy deploy
Deploying to aws in us-east-1
============================
✓ Creating Lambda function
✓ Setting up API Gateway
✓ Configuring Bedrock access
✓ Setting up CloudWatch logging
✓ Creating IAM roles and policies
Deployment completed successfully!
Endpoints:
API: https://abc123xyz.execute-api.us-east-1.amazonaws.com/
API Key: sk-onglx-abc123xyz...
Save this key securely - it won't be shown again.

Important: Save your API key securely. You'll need it to make requests to your API.
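One way to keep the key out of your source code is to export it once in your shell (e.g. export ONGLX_API_KEY="sk-onglx-...") and read it from the environment. A minimal sketch; the ONGLX_API_KEY variable name is my own convention, not something the CLI sets for you:

```python
import os

# ONGLX_API_KEY is an illustrative variable name, not set by the CLI.
# The setdefault line exists only so this sketch runs standalone;
# in real use, export the key in your shell instead of hard-coding it.
os.environ.setdefault("ONGLX_API_KEY", "sk-onglx-abc123xyz...")

api_key = os.environ["ONGLX_API_KEY"]
if not api_key.startswith("sk-onglx-"):
    raise ValueError("ONGLX_API_KEY does not look like an OnglX Deploy key")

# Log only a short prefix, never the full key.
print("Loaded key:", api_key[:9] + "...")
```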

Step 5: Test Your API

Make your first API call to verify everything is working correctly.

curl -X POST https://your-api-endpoint.amazonaws.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-onglx-abc123xyz..." \
  -d '{
    "model": "anthropic.claude-3-sonnet-20240229-v1:0",
    "messages": [
      {"role": "user", "content": "Hello! How are you?"}
    ]
  }'

Expected response:

{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1640995200,
  "model": "anthropic.claude-3-sonnet-20240229-v1:0",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! I'm doing well, thank you for asking. How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 20,
    "total_tokens": 32
  }
}
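In your own code you will usually want just the reply text and the token counts out of that JSON. A stdlib-only sketch, using the example response above as input:

```python
import json

# The example response from the step above, as a raw JSON string.
raw = """{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1640995200,
  "model": "anthropic.claude-3-sonnet-20240229-v1:0",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant",
                  "content": "Hello! I'm doing well, thank you for asking. How can I help you today?"},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 12, "completion_tokens": 20, "total_tokens": 32}
}"""

response = json.loads(raw)
content = response["choices"][0]["message"]["content"]
total_tokens = response["usage"]["total_tokens"]

print(content)
print("tokens used:", total_tokens)  # tokens used: 32
```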

Optional: Add a UI Component

Want a chat interface? Add OpenWebUI to your deployment:

Terminal - OnglX Deploy
$ onglx-deploy add ui
✓ Added ui component with OpenWebUI
✓ Configured automatic API discovery
✓ Set up user authentication
$ onglx-deploy deploy
✓ Deployed UI interface: https://ui.your-domain.com
Visit your UI to create user accounts and start chatting!

Managing Your Deployment

Monitor Status

Terminal - OnglX Deploy
$ onglx-deploy status
Components:
[✓] api/openai: Healthy
[✓] ui/openwebui: Healthy
Endpoints:
API: https://api.example.com
UI: https://ui.example.com

View Logs

Terminal - OnglX Deploy
$ onglx-deploy logs --follow
[api] 2024-01-01T12:00:00Z INFO Request received
[api] 2024-01-01T12:00:01Z INFO Response sent
[ui] 2024-01-01T12:00:02Z INFO User logged in

Using Your API

With OpenAI Python SDK

import openai

client = openai.OpenAI(
    api_key="sk-onglx-abc123xyz...",
    base_url="https://your-api-endpoint.amazonaws.com/v1"  # note the /v1 suffix
)

response = client.chat.completions.create(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[
        {"role": "user", "content": "Write a haiku about AI"}
    ]
)

print(response.choices[0].message.content)
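The call above is a single turn. Chat completions are stateless, so for a multi-turn conversation you resend the full message history on each request, appending the assistant's reply before the next user turn. A sketch of that bookkeeping (no network call; the resulting list is what you pass as messages= to client.chat.completions.create):

```python
# Maintain a running message history for a multi-turn conversation.
# The API is stateless: every request must carry the whole history.
history = []

def add_user_turn(history, text):
    history.append({"role": "user", "content": text})
    return history

def add_assistant_turn(history, text):
    # Append what the model returned, i.e. response.choices[0].message.content
    history.append({"role": "assistant", "content": text})
    return history

add_user_turn(history, "Write a haiku about AI")
add_assistant_turn(history, "Silicon minds hum...")  # illustrative reply, not real output
add_user_turn(history, "Now one about the ocean")

# `history` is the messages= payload for the next create() call.
print(len(history), "messages in history")
```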

Available Models

Your API supports all AWS Bedrock foundation models available in your region, including:

Anthropic Claude 3

  • claude-3-haiku-*
  • claude-3-sonnet-*
  • claude-3-opus-*

Amazon Titan

  • titan-text-express-v1
  • titan-embed-text-v1

Meta Llama

  • llama2-70b-chat-v1
  • code-llama-*
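Bedrock model IDs follow a provider.model-version pattern (e.g. anthropic.claude-3-sonnet-20240229-v1:0). If the full IDs feel unwieldy, one option is a small alias map in your client code. The aliases below are my own shorthand, and only the Claude Sonnet ID is taken from the examples above; verify the other IDs against the Bedrock console for your region:

```python
# Illustrative shorthand -> Bedrock model ID map; verify IDs for your region.
MODEL_ALIASES = {
    "claude-sonnet": "anthropic.claude-3-sonnet-20240229-v1:0",  # from the examples above
    "titan-express": "amazon.titan-text-express-v1",             # assumed ID, check console
    "llama2-chat": "meta.llama2-70b-chat-v1",                    # assumed ID, check console
}

def resolve_model(name: str) -> str:
    """Return the full Bedrock model ID for an alias, or the name unchanged."""
    return MODEL_ALIASES.get(name, name)

print(resolve_model("claude-sonnet"))
print(resolve_model("anthropic.claude-3-opus-20240229-v1:0"))  # full IDs pass through
```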

Clean Up (Optional)

When you're done experimenting, you can destroy your deployment to avoid charges:

Terminal - OnglX Deploy
$ onglx-deploy destroy
⚠️ This will destroy all resources created by this deployment.
This action cannot be undone!
Do you really want to destroy? Type 'yes' to confirm: yes
Destroying infrastructure in aws (us-east-1)
✓ Infrastructure destroyed successfully

🎉 Congratulations!

You've successfully deployed your first AI API with OnglX Deploy! Your API is now running on AWS with OpenAI compatibility and automatic scaling.

What you've accomplished:

  • Deployed an AWS Lambda-based API
  • Set up AWS Bedrock integration
  • Created OpenAI-compatible endpoints
  • Configured automatic scaling
  • Established secure authentication

Next steps: