# Getting Started with OnglX Deploy
Deploy production-ready AI inference APIs to your own AWS account in under three minutes: OpenAI-compatible endpoints powered by AWS Bedrock.
## Quick Start
### ⚡ Install CLI

Install with npm, Homebrew, or a direct download:

```shell
npm install -g @onglx/deploy-cli
```
### 🚀 Deploy Your First API

Initialize and deploy in minutes:

```shell
onglx-deploy init && onglx-deploy deploy
```
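Once the deploy step completes, you can exercise the endpoint with any OpenAI-compatible client. A minimal first-call sketch with `curl`, assuming the standard OpenAI chat-completions wire format; `API_URL` is a placeholder for your own deployment's URL, and the Bedrock model ID shown is only an example:

```shell
# Placeholder: substitute the endpoint URL from your own deployment.
API_URL="https://your-api-id.execute-api.us-east-1.amazonaws.com"

# Chat-completions request body in the OpenAI wire format.
# The model ID is an example Bedrock identifier; list your own available models.
BODY=$(cat <<'JSON'
{
  "model": "anthropic.claude-3-haiku-20240307-v1:0",
  "messages": [{"role": "user", "content": "Hello!"}]
}
JSON
)

# Sanity-check the payload is valid JSON before sending.
echo "$BODY" | python3 -m json.tool > /dev/null && echo "payload ok"

# Send the request (uncomment once API_URL points at your deployment):
# curl -s "$API_URL/v1/chat/completions" \
#   -H "Content-Type: application/json" \
#   -d "$BODY"
```

Because the endpoint speaks the OpenAI protocol, existing OpenAI SDKs should also work by overriding their base URL to point at `API_URL`.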
## Choose Your Path
### 🚀 New to OnglX

Install the OnglX Deploy CLI and get started with AI deployments.

- 30-second installation
- Multiple installation methods
- Cloud configuration guide
### 🤖 AI Developer

Deploy OpenAI-compatible APIs backed by AWS Bedrock foundation models.

- Chat completions API
- Multiple Bedrock models
- Open WebUI interface
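The Open WebUI item above can be wired up by pointing a local Open WebUI container at the deployed endpoint. A sketch, assuming Docker is installed and that the API is mounted under `/v1`; `API_URL` is a placeholder, and Open WebUI's documented `OPENAI_API_BASE_URL`/`OPENAI_API_KEY` environment variables are used to select the backend:

```shell
# Placeholder: substitute the endpoint URL from your own deployment.
API_URL="https://your-api-id.execute-api.us-east-1.amazonaws.com"

# Launch Open WebUI against the OpenAI-compatible endpoint.
# Guarded so the sketch is a no-op on machines without Docker.
if command -v docker > /dev/null; then
  docker run -d -p 3000:8080 \
    -e OPENAI_API_BASE_URL="$API_URL/v1" \
    -e OPENAI_API_KEY="unused" \
    --name open-webui \
    ghcr.io/open-webui/open-webui:main
fi
```

Open WebUI then serves a chat interface on `http://localhost:3000`, backed by the Bedrock models behind your endpoint.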
### ☁️ DevOps Engineer

Infrastructure as code with OpenTofu and AWS best practices.

- OpenTofu modules
- Concurrent deployments
- Safe rollback on SIGINT
## Get Started

If you're new to OnglX Deploy, start here to learn the essentials and make your first API call.