Deploy experimental OpenAI-compatible AI APIs to AWS Bedrock with Claude models. onglx-deploy is an early-development tool with limited functionality: AWS Bedrock is the only supported host today, while GCP Vertex AI (Gemini) and additional domains are planned for future releases.
```bash
# Deploy Basic AI API to AWS Bedrock (Experimental)

$ ./onglx-deploy init --host aws
✓ Created: deploy.yml configuration
✓ Selected: AWS Bedrock (Claude models)
✓ Region: us-east-1

$ ./onglx-deploy add inference --component api --type openai
✓ Added: OpenAI-compatible API component

$ ./onglx-deploy deploy
✓ Deploying to AWS using Terraform...
✓ Creating Lambda function
✓ Setting up API Gateway
✓ Configuring Bedrock access
🚀 Basic AI API deployed to AWS

Check status: ./onglx-deploy status
```
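Once the deployment finishes, the API Gateway endpoint can be called like any OpenAI-compatible chat completions API. The sketch below is illustrative only: the endpoint URL, the bearer-token auth header, and the `claude-3-5-sonnet` model identifier are assumptions, so substitute the values reported by `./onglx-deploy status` for your deployment.

```bash
# Illustrative request against the deployed OpenAI-compatible endpoint.
# The URL, auth scheme, and model name are assumptions; use the values
# printed by `./onglx-deploy status` for your deployment.
curl "https://<your-api-gateway-url>/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ONGLX_API_KEY" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "Say hello from Bedrock."}]
  }'
```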
Currently focused on AI-powered applications with OpenAI-compatible APIs. Future versions will support full-stack applications with compute, storage, and database domains.
Start free and upgrade for advanced features. Pricing is ZAR-optimized for the South African market, with SaaS Platform and Enterprise tiers also available.
Enterprise Features: Self-hosted deployment, advanced security, priority support.
Coming Soon
Build the experimental CLI from source and test basic AWS Bedrock deployment. Early development software with limited functionality.
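A minimal build-and-smoke-test sketch is shown below. The repository URL and the Go toolchain are assumptions (adjust to the project's actual build instructions); the CLI commands mirror the walkthrough above.

```bash
# Sketch of a source build and smoke test. The repository URL and the Go
# toolchain are assumptions; the CLI commands mirror the walkthrough above.
git clone https://github.com/onglx/onglx-deploy.git
cd onglx-deploy
go build -o onglx-deploy .

# Run the basic AWS Bedrock flow end to end.
./onglx-deploy init --host aws
./onglx-deploy add inference --component api --type openai
./onglx-deploy deploy
./onglx-deploy status
```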