A CLI tool for deploying OpenAI-compatible APIs to AWS Bedrock with Claude models. Simple, focused, and production-ready infrastructure automation.
A comprehensive set of commands for managing AI inference deployments on AWS Bedrock.
Deploy AI inference APIs to AWS Bedrock in four simple steps using our CLI tool.
```shell
# Initialize new project
$ onglx-deploy init --host aws --region us-east-1
✓ Initialized deployment: my-project
✓ Provider: aws
✓ Region: us-east-1
✓ Config: deploy.yml

# Add API component
$ onglx-deploy add inference --component api --type openai
✓ Added api/openai component
✓ Model: anthropic.claude-3-5-sonnet-20241022-v2:0

# Review deployment plan
$ onglx-deploy plan --show-costs
📋 Deployment Plan
📊 Plan Summary: Additions: 3, Changes: 0
💰 Total Monthly: $9.00 USD

# Deploy to AWS
$ onglx-deploy deploy
✓ Deployment completed successfully!
Endpoints:
  API: https://abc123.execute-api.us-east-1.amazonaws.com/
```
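Because the deployed API is OpenAI-compatible, standard chat-completions requests should work against the endpoint shown in the deploy output. A minimal sketch of such a request body, assuming the conventional `/v1/chat/completions` route (a convention of OpenAI-compatible APIs, not something this page confirms):

```python
import json

# Hypothetical client-side sketch. ENDPOINT reuses the example URL from the
# deploy output above; the /v1/chat/completions path and the body shape follow
# the OpenAI chat-completions convention that the tool claims compatibility with.
ENDPOINT = "https://abc123.execute-api.us-east-1.amazonaws.com/v1/chat/completions"

def chat_request_body(prompt: str) -> str:
    """Serialize a minimal chat-completions request for the deployed model."""
    return json.dumps({
        "model": "anthropic.claude-3-5-sonnet-20241022-v2:0",
        "messages": [{"role": "user", "content": prompt}],
    })

body = chat_request_body("Hello from the deployed API!")
```

Any HTTP client (or an off-the-shelf OpenAI SDK pointed at the endpoint's base URL) could then POST this body; authentication requirements, if any, are not stated here.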
Currently focused on AI-powered applications with OpenAI-compatible APIs. Future versions will support full-stack applications with compute, storage, and database domains.
Start free and upgrade for advanced features, with pricing optimized for South Africa (ZAR). SaaS Platform and Enterprise tiers are also available.
Enterprise Features: Self-hosted deployment, advanced security, priority support.
Coming Soon
Build the experimental CLI from source and try a basic AWS Bedrock deployment. This is early-development software with limited functionality.