OnglX Deploy CLI Documentation
Deploy AI inference APIs to AWS Bedrock using our command-line tool. Simple, focused, and production-ready infrastructure automation.
Quick Start Example
```shell
# Initialize and deploy an AI API in 4 commands
$ onglx-deploy init --host aws --region us-east-1
$ onglx-deploy add inference --component api --type openai
$ onglx-deploy plan --show-costs
$ onglx-deploy deploy
```
CLI Commands
init
Initialize new deployment
Set up cloud credentials and create the deploy.yml configuration file
add
Add components
Add inference API or UI components to your deployment
plan
Preview changes
Show the deployment plan, with cost estimates and resource targeting
deploy
Deploy infrastructure
Deploy with automatic rollback and signal handling
status
Check status
Show deployment status and resource health
logs
View logs
View component logs with filtering and following
destroy
Remove resources
Safely remove all deployed resources with confirmation
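Taken together, the commands above cover the full deployment lifecycle. A typical day-2 workflow might look like the sketch below; the subcommand names come from this reference, but any flags shown (such as `--follow`) are illustrative assumptions rather than confirmed CLI options:

```shell
# Check deployment status and resource health
$ onglx-deploy status

# Stream logs for a component
# (--follow is an assumed flag name for log following)
$ onglx-deploy logs --follow

# Tear down all deployed resources; the CLI prompts for confirmation
$ onglx-deploy destroy
```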
More Commands
Additional utilities
Shell completion, help, and configuration management
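As a rough sketch, the utility commands might be invoked as follows. The exact completion subcommand syntax is an assumption; consult the CLI's built-in help for the authoritative list:

```shell
# Show top-level help and the full command list
$ onglx-deploy --help

# Generate shell completion for your shell
# (subcommand and shell argument are assumed, not confirmed)
$ onglx-deploy completion bash
```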