Overview

Get started with OnglX Deploy

CLI Tool · AWS Bedrock · Open Source

What is OnglX Deploy?

OnglX Deploy is a CLI-first, open-core platform that lets developers deploy AI inference APIs and modern web apps to their own cloud accounts with a Vercel-like developer experience.

With just a few commands, it deploys OpenAI-compatible inference APIs backed by AWS Bedrock, along with modern chat interfaces such as OpenWebUI.

Deploy to your own AWS account, maintain full control of your data, and get enterprise-grade infrastructure without the complexity.

Key Features

Deploy to Your Cloud

Deploy directly to your own AWS account. No vendor lock-in, full data control.

  • Your AWS account, your data
  • No third-party API dependencies
  • Enterprise-grade security by default

Simple CLI Interface

Deploy complex AI infrastructure with simple commands. No YAML, no Kubernetes.

  • One-command deployments
  • Automatic infrastructure management
  • Built-in rollback and recovery

AWS Bedrock Native

Purpose-built for AWS Bedrock with OpenAI API compatibility.

  • All Bedrock foundation models
  • OpenAI-compatible endpoints
  • Automatic model discovery

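Because the deployed endpoint speaks the standard OpenAI chat-completions protocol, any OpenAI client or plain HTTP request works unchanged. A minimal sketch of the request shape, using a hypothetical base URL, a placeholder bearer token, and an example Bedrock model ID (check the models enabled in your own account):

```python
import json
import urllib.request

# Hypothetical endpoint printed by `onglx-deploy deploy`; substitute your own.
BASE_URL = "https://api.example.com"

# Standard OpenAI chat-completions payload; the model field takes a
# Bedrock foundation-model ID (example ID shown).
payload = {
    "model": "anthropic.claude-3-haiku-20240307-v1:0",
    "messages": [{"role": "user", "content": "Hello from OnglX Deploy!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",  # placeholder; auth depends on your deployment
    },
    method="POST",
)
print(req.full_url)  # https://api.example.com/v1/chat/completions
# To actually send the request: urllib.request.urlopen(req)
```

The request is constructed but not sent here; point `BASE_URL` at the URL your deployment prints and call `urlopen` to test it live.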
Production Ready

Auto-scaling, monitoring, and security built-in from day one.

  • Auto-scaling infrastructure
  • CloudWatch monitoring
  • Security best practices

How It Works

1. Initialize Your Project

Set up a new deployment configuration in seconds.

$ onglx-deploy init --host aws --region us-east-1
✓ Initialized deployment: my-ai-project

2. Add Components

Add AI inference APIs and user interfaces with simple commands.

$ onglx-deploy add api
$ onglx-deploy add ui
✓ Added OpenAI-compatible API
✓ Added OpenWebUI interface

3. Deploy to AWS

Deploy your complete AI platform with a single command.

$ onglx-deploy deploy
✓ Deployed API: https://api.example.com
✓ Deployed UI: https://ui.example.com

Use Cases

💼 Enterprise AI Applications

Deploy private AI inference APIs for internal tools, customer support, content generation, and business process automation while keeping data in your own AWS account.

🚀 AI-Powered Startups

Rapidly prototype and deploy AI features without managing infrastructure complexity. Scale from prototype to production with the same simple CLI commands.

🔬 Research & Development

Experiment with different AI models, build custom interfaces, and share AI tools with team members through secure, self-hosted deployments.

🎯 Ready to Get Started?

Prerequisites:

  • AWS account with Bedrock access
  • AWS CLI configured
  • Basic command line familiarity
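The prerequisites above can be sanity-checked before running `onglx-deploy init`. A minimal sketch, assuming only that the standard AWS CLI may be on the PATH (the `check_aws_cli` helper is illustrative, not part of OnglX Deploy):

```python
import json
import shutil
import subprocess

def check_aws_cli() -> bool:
    """Return True if the AWS CLI is installed and credentials resolve."""
    if shutil.which("aws") is None:
        print("AWS CLI not found - install it, then run `aws configure`.")
        return False
    result = subprocess.run(
        ["aws", "sts", "get-caller-identity", "--output", "json"],
        capture_output=True, text=True, timeout=30,
    )
    if result.returncode != 0:
        print("AWS CLI found, but credentials are not configured.")
        return False
    identity = json.loads(result.stdout)
    print(f"Authenticated as {identity['Arn']}")
    return True

ok = check_aws_cli()
# To confirm Bedrock access in your target region, run:
#   aws bedrock list-foundation-models --region us-east-1
```

`aws sts get-caller-identity` verifies that credentials resolve to a real identity; the Bedrock model-listing command additionally confirms that your account has Bedrock access in the chosen region.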