```
zai-coding-plan/glm-4.5 zai-coding-plan/glm-4.5-air zai-coding-plan/glm-4.5v zai-coding-plan/glm-4.6 opencode/big-pickle opencode/grok-code anthropic/claude-opus-4-0 anthropic/claude-3-5-sonnet-20241022 anthropic/claude-opus-4-1 anthropic/claude-haiku-4-5 anthropic/claude-3-5-sonnet-20240620 anthropic/claude-3-5-haiku-latest anthropic/claude-3-opus-20240229 anthropic/claude-sonnet-4-5 anthropic/claude-sonnet-4-5-20250929 anthropic/claude-sonnet-4-20250514 anthropic/claude-opus-4-20250514 anthropic/claude-3-5-haiku-20241022 anthropic/claude-3-haiku-20240307 anthropic/claude-3-7-sonnet-20250219 anthropic/claude-3-7-sonnet-latest anthropic/claude-sonnet-4-0 anthropic/claude-opus-4-1-20250805 anthropic/claude-3-sonnet-20240229 anthropic/claude-haiku-4-5-20251001 openai/gpt-4.1-nano openai/text-embedding-3-small openai/gpt-4 openai/o1-pro openai/gpt-4o-2024-05-13 openai/gpt-4o-2024-08-06 openai/gpt-4.1-mini openai/o3-deep-research openai/gpt-3.5-turbo openai/text-embedding-3-large openai/gpt-4-turbo openai/o1-preview openai/o3-mini openai/codex-mini-latest openai/gpt-5-nano openai/gpt-5-codex openai/gpt-4o openai/gpt-4.1 openai/o4-mini openai/o1 openai/gpt-5-mini openai/o1-mini openai/text-embedding-ada-002 openai/o3-pro openai/gpt-4o-2024-11-20 openai/o3 openai/o4-mini-deep-research openai/gpt-4o-mini openai/gpt-5 openai/gpt-5-pro to list available models
Or check your config (opencode.json) provider/model names
```

2) Dot vs dash (punctuation normalization)

```
$ bun run ./src/index.ts run --model anthropic/claude-haiku-4.5 "hi"
Error: Model not found: anthropic/claude-haiku-4.5
Did you mean: anthropic/claude-haiku-4-5, anthropic/claude-haiku-4-5-20251001
Try: zai-coding-plan/glm-4.5-flash zai-coding-plan/glm-4.5 zai-coding-plan/glm-4.5-air zai-coding-plan/glm-4.5v zai-coding-plan/glm-4.6 opencode/big-pickle opencode/grok-code anthropic/claude-opus-4-0 anthropic/claude-3-5-sonnet-20241022 anthropic/claude-opus-4-1 anthropic/claude-haiku-4-5 anthropic/claude-3-5-sonnet-20240620 anthropic/claude-3-5-haiku-latest anthropic/claude-3-opus-20240229 anthropic/claude-sonnet-4-5 anthropic/claude-sonnet-4-5-20250929 anthropic/claude-sonnet-4-20250514 anthropic/claude-opus-4-20250514 anthropic/claude-3-5-haiku-20241022 anthropic/claude-3-haiku-20240307 anthropic/claude-3-7-sonnet-20250219 anthropic/claude-3-7-sonnet-latest anthropic/claude-sonnet-4-0 anthropic/claude-opus-4-1-20250805 anthropic/claude-3-sonnet-20240229 anthropic/claude-haiku-4-5-20251001 openai/gpt-4.1-nano openai/text-embedding-3-small openai/gpt-4 openai/o1-pro openai/gpt-4o-2024-05-13 openai/gpt-4o-2024-08-06 openai/gpt-4.1-mini openai/o3-deep-research openai/gpt-3.5-turbo openai/text-embedding-3-large openai/gpt-4-turbo openai/o1-preview openai/o3-mini openai/codex-mini-latest openai/gpt-5-nano openai/gpt-5-codex openai/gpt-4o openai/gpt-4.1 openai/o4-mini openai/o1 openai/gpt-5-mini openai/o1-mini openai/text-embedding-ada-002 openai/o3-pro openai/gpt-4o-2024-11-20 openai/o3 openai/o4-mini-deep-research openai/gpt-4o-mini openai/gpt-5 openai/gpt-5-pro to list available models
Or check your config (opencode.json) provider/model names
```

3) Missing provider (model-only input)

```
$ bun run ./src/index.ts run --model big-pickle "hi"
Error: Model not found: big-pickle/
Did you mean: opencode/big-pickle
```

4) Correct model after suggestion

```
$ bun run ./src/index.ts run --model opencode/big-pickle "hi"
Hi! How can I help you with your opencode project today?
```

Notes
- Suggestions are hints only; behavior is unchanged (no auto-selection).
- This runs locally as part of the CLI error path; performance impact is negligible (small in-memory scans).
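The dot-vs-dash and missing-provider cases above boil down to a normalize-then-rank pass over the known model list. Below is a minimal TypeScript sketch of that idea; the function names (`suggestModels`, `normalize`) and the edit-distance threshold are illustrative assumptions, not opencode's actual implementation.

```typescript
// Normalize punctuation so "claude-haiku-4.5" and "claude-haiku-4-5" compare equal.
function normalize(id: string): string {
  return id.toLowerCase().replace(/[._]/g, "-")
}

// Classic Levenshtein edit distance via dynamic programming.
function editDistance(a: string, b: string): number {
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0)),
  )
  for (let i = 1; i <= a.length; i++)
    for (let j = 1; j <= b.length; j++)
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1, // deletion
        dp[i][j - 1] + 1, // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1), // substitution
      )
  return dp[a.length][b.length]
}

// Rank known models by distance to the (normalized) input; keep close matches only.
function suggestModels(input: string, known: string[], max = 3): string[] {
  const n = normalize(input)
  return known
    .map((m) => ({ m, d: editDistance(n, normalize(m)) }))
    .filter(({ d }) => d <= 3) // illustrative cutoff for "close enough"
    .sort((x, y) => x.d - y.d)
    .slice(0, max)
    .map(({ m }) => m)
}

// The dot variant normalizes to dashes and matches the real id exactly.
suggestModels("anthropic/claude-haiku-4.5", [
  "anthropic/claude-haiku-4-5",
  "opencode/big-pickle",
])
// → ["anthropic/claude-haiku-4-5"]
```

Because the whole pass is a linear scan over an in-memory list of a few dozen ids, running it only on the error path costs effectively nothing, which matches the performance note above.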
README.md
The AI coding agent built for the terminal.
Installation
```bash
# YOLO
curl -fsSL https://opencode.ai/install | bash

# Package managers
npm i -g opencode-ai@latest                              # or bun/pnpm/yarn
scoop bucket add extras; scoop install extras/opencode   # Windows
choco install opencode                                   # Windows
brew install opencode                                    # macOS and Linux
paru -S opencode-bin                                     # Arch Linux
```
> [!TIP]
> Remove versions older than 0.1.x before installing.
Installation Directory
The install script respects the following priority order for the installation path:
1. `$OPENCODE_INSTALL_DIR` - Custom installation directory
2. `$XDG_BIN_DIR` - XDG Base Directory Specification compliant path
3. `$HOME/bin` - Standard user binary directory (if it exists or can be created)
4. `$HOME/.opencode/bin` - Default fallback
```bash
# Examples
OPENCODE_INSTALL_DIR=/usr/local/bin curl -fsSL https://opencode.ai/install | bash
XDG_BIN_DIR=$HOME/.local/bin curl -fsSL https://opencode.ai/install | bash
```
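The priority order above can be read as a simple first-match cascade. This TypeScript sketch mirrors the bash install script's logic; the function name `resolveInstallDir` and the `homeBinUsable` flag (standing in for the "exists or can be created" filesystem check) are illustrative, not the script's actual code.

```typescript
// First-match resolution of the install directory, mirroring the
// documented priority: env override > XDG path > ~/bin > fallback.
function resolveInstallDir(
  env: Record<string, string | undefined>,
  homeBinUsable = true, // stands in for "exists or can be created"
): string {
  if (env.OPENCODE_INSTALL_DIR) return env.OPENCODE_INSTALL_DIR
  if (env.XDG_BIN_DIR) return env.XDG_BIN_DIR
  if (homeBinUsable) return `${env.HOME}/bin`
  return `${env.HOME}/.opencode/bin`
}

// XDG_BIN_DIR wins when OPENCODE_INSTALL_DIR is unset.
resolveInstallDir({ HOME: "/home/me", XDG_BIN_DIR: "/home/me/.local/bin" })
// → "/home/me/.local/bin"
```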
Documentation
For more info on how to configure OpenCode, head over to our docs.
Contributing
If you're interested in contributing to OpenCode, please read our contributing docs before submitting a pull request.
FAQ
How is this different than Claude Code?
It's very similar to Claude Code in terms of capability. Here are the key differences:
- 100% open source
- Not coupled to any provider. Although Anthropic is recommended, OpenCode can be used with OpenAI, Google, or even local models. As models evolve, the gaps between them will close and pricing will drop, so being provider-agnostic is important.
- Out of the box LSP support
- A focus on TUI. OpenCode is built by neovim users and the creators of terminal.shop; we are going to push the limits of what's possible in the terminal.
- A client/server architecture. This allows, for example, OpenCode to run on your computer while you drive it remotely from a mobile app. The TUI frontend is just one of the possible clients.
What's the other repo?
The other confusingly named repo has no relation to this one. You can read the story behind it here.
