AI for builders

Your Unix terminal speaks English now.

echo "what you want" | onnai

No Python. No man pages. Just | and go.

Install in 10 seconds

curl -sSL onnai.ai/dropin | sh


Trial: $10 of credits to start. Plans from $9/mo.

Quick Start

# Charge your Onnai with $10 of credits
onnai --dropin you@email.com

# Use onnai
onnai "explain this error message"

# Pipe anything
git diff | onnai "explain these changes"
ps aux | onnai "explain like I am ten"
ls -al | onnai "roast my directory"

# Add specialized Mimics
onnai --add Coder
cat buggy.py | Coder "find the bug"

# Chat mode - try a Mimic
onnai -m Volcanara
> hi

# Out of credits? --addfunds or subscribe
onnai --addfunds

Why Onnai?

The name is the contract. You call Coder. We handle which model, which prompt, which version. Next month we upgrade what's behind it. Your scripts get smarter. You change nothing.

Compose like Unix. Chain Mimics together:

curl -sL news.ycombinator.com | onnai "top 5 stories" | onnai "explain like I'm 10"

Your files, your machine. Context lives in .context. Conversations in .chat. Edit them, search them, version them. No cloud lock-in.
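Because .context and .chat are ordinary files, everyday tools already work on them. A quick sketch (the file contents below are illustrative; the actual formats may differ):

```shell
# .context and .chat are plain text on your machine
echo "Python 3.11, FastAPI. Prefer small diffs." > .context

# Search them like any other file
grep -n "FastAPI" .context

# Version them alongside your code (no-op outside a git repo)
git add .context .chat 2>/dev/null || true
```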

Bring your own keys. Set OPENAI_API_KEY or ANTHROPIC_API_KEY and go. Same interface, your infrastructure. Or run local with onnai -p mlx.

Mimics

No prompt engineering. The name is the interface.

Coder

Reviews, refactors, fixes bugs, writes tests.

onnai --add Coder

Sys

Unix commands, scripts, system optimization.

onnai --add Sys

Writer

Docs, emails, READMEs. Matches your voice.

onnai --add Writer

Reviewer

Code review. Catches what you missed.

onnai --add Reviewer

Or make your own: drop a .persona file in any directory.
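The .persona schema isn't documented here, so treat this as a sketch assuming it holds a plain-text description of the Mimic's role (the name SQLTutor and the wording are hypothetical):

```shell
# Hypothetical .persona contents -- the real schema may differ
cat > .persona <<'EOF'
You are SQLTutor. Explain queries step by step and
suggest indexes when you see sequential scans.
EOF
```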

Examples

Code

# Explain unfamiliar code
cat legacy.py | onnai "what does this do"

# Generate tests
cat api.py | Coder "write pytest tests" > test_api.py

# Fix errors
python script.py 2>&1 | onnai "fix this"

Git

# Explain changes
git diff | onnai "summarize"

# Write commit message
git diff --cached | onnai "commit message"

# Code review
git show HEAD | Reviewer

System

# Diagnose issues
ps aux | onnai "anything suspicious?"

# Explain logs
tail -100 /var/log/syslog | onnai "any errors?"

# Get the command you need
onnai "find files larger than 1GB modified this week"

Clipboard workflow

# Copy → transform → paste
pbpaste | onnai "fix grammar" | pbcopy
pbpaste | onnai "make concise" | pbcopy
pbpaste | onnai "to bullet points" | pbcopy

For Developers

Git hooks

# .git/hooks/pre-commit
git diff --cached | Reviewer || exit 1

Log monitoring

tail -f app.log | grep ERROR | onnai "diagnose"

Documentation

find . -name "*.py" | xargs cat | onnai "generate API docs" > API.md

Project context

# .context file in project root
echo "Python 3.11, FastAPI, PostgreSQL. Follow PEP 8." > .context

# AI now knows your stack
cat endpoint.py | Coder "add error handling"

Compare providers

# Same prompt, different models; no accounts needed
onnai -m openai/gpt-5.1 "explain quicksort" | onnai "grade 1-10, explain your reasoning"
onnai -m anthropic/claude-opus-4-5-2025110 "explain quicksort" | onnai "grade 1-10, explain your reasoning"
onnai -m cerebras/zai-glm-4.6 "explain quicksort" | onnai "grade 1-10, explain your reasoning"
onnai -p ollama -m minimax-m2.1:cloud "explain quicksort" | onnai "grade 1-10, explain your reasoning"

Technical

Dual-process design: onnai CLI + onnaid daemon.

onnai (CLI)

Resolves symlinks via argv[0]

coder → onnai

Walks tree for .context / .persona

Unix socket to daemon
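The argv[0] dispatch above can be demonstrated with a stand-in script. This is not onnai's actual source, just a sketch of the technique: one binary behaves differently depending on the name it was invoked through.

```shell
# Stand-in "binary": behavior depends on argv[0]
cat > onnai-demo <<'EOF'
#!/bin/sh
case "$(basename "$0")" in
  coder) echo "dispatching to Coder persona" ;;
  *)     echo "default onnai behavior" ;;
esac
EOF
chmod +x onnai-demo

ln -s onnai-demo coder   # symlink named after the Mimic
./coder                  # prints: dispatching to Coder persona
./onnai-demo             # prints: default onnai behavior
```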

onnaid (Daemon)

Event-driven C, non-blocking

kqueue / epoll

Persistent HTTP/2 pool

Zero cold starts

BYOK Providers

Set your API key and go. Traffic never touches onnai.ai.

Cloud

OpenAI · Anthropic · Groq · Cerebras · Together · Mistral · X.AI

Local

MLX · Ollama

No keys required

# Use any provider
onnai -p anthropic "your prompt"
onnai -p mlx -m llama3 "your prompt"

# List available models
onnai -p cerebras -l

Plans

Bring your own keys, or subscribe for managed AI: any provider, unified billing.

Creator

$9/mo
  • Pick any Mimic
  • Create your own Mimic
  • Top off when out of Mimic credits
  • Hobbyist/Enthusiast Mimic Usage Included
  • BYOK Provider Abstraction enables model swap
  • BYOK Inference never touches onnai.ai
  • Community support

Max

$119+/mo
  • Everything in Creator
  • Always-on Onnai Machine with SSH access
  • Server-based onnai -p mlx
  • Scale Compute/Storage/Inference (Pay As You Go) for your API
  • Priority support

Custom

$300+/mo
  • Usage analytics, admin, logging
  • On-Premise / Private
  • Forward-deployed engineers
  • Seminars & Training
Contact Sales

All plans include unlimited BYOK usage. Cancel anytime.

Need More?

Private cloud deployment, custom model fine-tuning, or dedicated support? Let's talk.