CLI Reference - Actual Command Guide

Available Commands

OnglX Deploy currently implements these basic commands:

Implemented Commands

Basic Project Commands

onglx-deploy init
Initialize a new project with a deploy.yml configuration

Creates a deploy.yml configuration file in the current directory.

BASH
# Initialize with AWS
onglx-deploy init --host aws

# Initialize with GCP (beta)
onglx-deploy init --host gcp

# Initialize for multi-cloud (defaults to AWS)
onglx-deploy init --host multi

Options:

  • --host HOST - Cloud provider (aws, gcp, multi) - REQUIRED

Basic Workflow:

  1. onglx-deploy init --host aws - Initialize project
  2. onglx-deploy add inference --component api --type openai - Add components
  3. onglx-deploy plan - Review deployment plan
  4. onglx-deploy deploy - Deploy resources

onglx-deploy add inference
Add inference components to deploy.yml

Adds AI inference API or UI components to the deployment configuration.

BASH
# Add OpenAI-compatible API
onglx-deploy add inference --component api --type openai --model anthropic.claude-3-5-sonnet

# Add OpenWebUI interface
onglx-deploy add inference --component ui --type openwebui --size small

Options:

  • --component - Component type (api or ui) - REQUIRED
  • --type - Implementation type (openai or openwebui) - REQUIRED
  • --model - Model name for API components
  • --size - Instance size for UI components (small, medium, large)
  • --region - Override region for this component

onglx-deploy plan
Show deployment plan without making changes

Performs a dry run using Terraform to show what resources would be created.

BASH
# Show what would be deployed
onglx-deploy plan

Shows:

  • Resources to be created, modified, or destroyed
  • The Terraform execution plan

No actual changes are made.

onglx-deploy deploy
Deploy infrastructure to cloud provider

Applies the deployment configuration using Terraform to create the cloud resources.

BASH
# Deploy with confirmation prompt
onglx-deploy deploy

# Deploy without prompts
onglx-deploy deploy --auto-approve

Options:

  • --auto-approve - Skip interactive approval
  • --max-parallel - Maximum parallel operations (default: 4)

onglx-deploy status
Show deployment status and resource health

Displays the current status of deployed resources and their health checks.

BASH
# Show deployment status
onglx-deploy status

# Verbose status information
onglx-deploy status --verbose

Shows:

  • Component health status
  • Last deployment information
  • Active endpoints
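
Scripted health checks can poll the status output until components come up. A minimal sketch, assuming `onglx-deploy` is on PATH and that healthy components print a line containing the word `healthy` (a hypothetical marker, not confirmed by this guide; adjust the grep to the real status output):

```shell
#!/usr/bin/env bash
# wait_healthy: poll `onglx-deploy status` until the output reports healthy,
# or give up after five minutes. The "healthy" string is a hypothetical
# marker -- match it to the actual status format in your environment.
wait_healthy() {
  local deadline=$((SECONDS + 300))           # give up after 5 minutes
  until onglx-deploy status 2>/dev/null | grep -q healthy; do
    (( SECONDS >= deadline )) && return 1     # timed out
    sleep 10
  done
}

# Only poll when the CLI is actually installed.
if command -v onglx-deploy >/dev/null 2>&1; then
  wait_healthy && echo "all components healthy"
else
  echo "onglx-deploy not found; skipping health poll" >&2
fi
```

A loop like this is useful after `onglx-deploy deploy` in automation, where the next step should not run until endpoints are live.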

onglx-deploy logs
View logs from deployed components

Displays logs from Lambda functions and other deployed components.

BASH
# View recent logs (last 100 lines)
onglx-deploy logs

# Follow logs in real-time
onglx-deploy logs --follow --tail 50

# Show logs from the last hour
onglx-deploy logs --since 1h

# Filter by component
onglx-deploy logs api

Options:

  • --tail - Number of lines to show (default: 100)
  • --follow, -f - Follow log output in real-time
  • --since - Show logs since duration (e.g., 1h, 30m)
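
For quick triage, the log stream can be piped through standard tools. A sketch, assuming logs go to stdout and that the flags combine as documented above; the error keywords are illustrative, not part of the CLI:

```shell
#!/usr/bin/env bash
# tail_errors: pull the last hour of logs and surface only error-like lines.
# The grep pattern is illustrative -- adjust to your components' log format.
tail_errors() {
  onglx-deploy logs --since 1h --tail 500 | grep -iE 'error|exception|fail'
}

# Only query logs when the CLI is actually installed.
if command -v onglx-deploy >/dev/null 2>&1; then
  tail_errors
else
  echo "onglx-deploy not found; skipping log triage" >&2
fi
```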

onglx-deploy destroy
Destroy all deployed infrastructure

Removes all resources created by the deployment. This action cannot be undone.

BASH
# Destroy with confirmation
onglx-deploy destroy

# Destroy without prompts
onglx-deploy destroy --auto-approve

Options:

  • --auto-approve - Skip interactive approval

Warning: This permanently deletes all cloud resources!

What's Not Implemented Yet

Project Management Commands (Not Implemented):

  • projects list/create/remove/info
  • get projects/staging
  • describe project/inference
  • remove inference/database
  • commit (staging workflow)

Multi-Cloud Commands (Not Implemented):

  • cloud compare/deploy/recommend/status/optimize
  • All multi-cloud functionality

Advanced Features (Not Implemented):

  • cost and cost analysis
  • doctor (diagnostics)
  • auth (authentication)
  • examples/tutorial/tier/domains
  • validate/clean/fix utilities

Basic Example Workflow

BASH
# 1. Initialize project
onglx-deploy init --host aws

# 2. Add inference API component
onglx-deploy add inference --component api --type openai

# 3. Review deployment plan
onglx-deploy plan

# 4. Deploy to AWS
onglx-deploy deploy

# 5. Check status
onglx-deploy status

# 6. View logs
onglx-deploy logs --follow
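
The workflow above can also be wrapped in a single non-interactive script for CI pipelines. A minimal sketch using only the commands documented in this guide; the pre-flight check is an addition for safety, not part of the CLI:

```shell
#!/usr/bin/env bash
# ci_deploy: plan, deploy, and verify in one non-interactive run.
set -euo pipefail

ci_deploy() {
  onglx-deploy plan                    # dry run first so failures surface early
  onglx-deploy deploy --auto-approve   # non-interactive apply for CI
  onglx-deploy status --verbose        # confirm resources came up
}

# Pre-flight: skip cleanly where the CLI is not installed.
if command -v onglx-deploy >/dev/null 2>&1; then
  ci_deploy
else
  echo "onglx-deploy not found; skipping deploy" >&2
fi
```

Because `deploy --auto-approve` skips the confirmation prompt, run this only in pipelines where the plan output has already been reviewed.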