OpenCode
Open-source terminal coding agent, running on Omnizen models.
1. Get an API key
Go to your dashboard → API keys, click Create API key, and copy the sk-… token. You only see it once — save it somewhere safe.
2. Install OpenCode
Easiest path on macOS, Linux (glibc), or Windows:
```
npm install -g opencode-ai
```
See the official OpenCode docs for native installers and shell completions.
Heads-up for Alpine / musl users: OpenCode ships a glibc binary. Use Ubuntu, Debian, or any glibc-based image instead.
3. Configure Omnizen as a provider
Create or edit ~/.config/opencode/config.json (Windows: %APPDATA%\opencode\config.json):
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omnizen": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omnizen",
      "options": {
        "baseURL": "https://api.omnizen.ai/v1",
        "apiKey": "sk-…paste-your-omnizen-key-here"
      },
      "models": {
        "deepseek-flash": { "name": "DeepSeek Flash" },
        "deepseek-pro": { "name": "DeepSeek Pro (reasoning)" },
        "kimi-k2.6": { "name": "Kimi K2.6 (262K context)" },
        "claude-3-5-haiku-20241022": { "name": "Claude 3.5 Haiku alias" },
        "claude-3-5-sonnet-20241022": { "name": "Claude 3.5 Sonnet alias" }
      }
    }
  }
}
```
Add or remove models as you like; anything in your Omnizen catalog works. The provider is OpenAI-compatible, so OpenCode talks to https://api.omnizen.ai/v1/chat/completions with your key as a Bearer token.
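Because the provider is OpenAI-compatible, the request OpenCode ultimately issues is an ordinary chat-completions POST. Here is a minimal offline sketch of that wire format, assuming the standard OpenAI request shape; it only builds the payload and headers, sends nothing, and uses a placeholder key:

```python
import json

# Placeholder key for illustration; substitute your real sk-… token.
API_KEY = "sk-example"
BASE_URL = "https://api.omnizen.ai/v1"

# Bearer-token auth header, as OpenCode's @ai-sdk/openai-compatible
# provider sends it.
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# Chat-completions body; "model" is any id you listed in config.json.
payload = {
    "model": "deepseek-flash",
    "messages": [{"role": "user", "content": "what is 2+2?"}],
    "stream": True,  # OpenCode streams tokens as they are generated
}

body = json.dumps(payload)
print(f"POST {BASE_URL}/chat/completions")
```

Swapping `model` for `kimi-k2.6` or any other id from the `models` block changes nothing else about the request; that is the whole point of the OpenAI-compatible surface.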
4. Smoke-test it
```
opencode run --model omnizen/deepseek-flash "what is 2+2?"
```
You should get back something like:
```
> build · deepseek-flash
4
```
Try Kimi too:
```
opencode run --model omnizen/kimi-k2.6 "what is 7*8? reply with only the number"
```
5. Open the full TUI
```
opencode
```
Inside, hit /models to switch between any models you defined in the config. Tokens stream live as the model generates them, and every call is also logged on your dashboard.
Tips
- Reasoning models need bigger budgets. Models like `kimi-k2.6` internally consume tokens to "think" before replying; set `--max-tokens` high enough (e.g. 1500+) or the visible reply may be empty.
- Per-project config. Drop a project-local `opencode.json` at your repo root to scope a single codebase to a dedicated Omnizen key. The file format is identical.
- Streaming + tool calling work. No special flags: OpenCode's agent flows (file edits, shell exec, multi-step plans) all run through Omnizen the same way they would against OpenAI.
- Same key, multiple tools. The exact same `sk-…` token works for Cline, Codex CLI, Aider, Continue, Cursor, and any OpenAI-compatible client.
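The per-project tip above might look like this: a trimmed `opencode.json` at the repo root, using the same schema as the global file (the key and model list here are illustrative placeholders, not prescribed values):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "omnizen": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Omnizen",
      "options": {
        "baseURL": "https://api.omnizen.ai/v1",
        "apiKey": "sk-…project-scoped-key-here"
      },
      "models": {
        "deepseek-flash": { "name": "DeepSeek Flash" }
      }
    }
  }
}
```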