A multi-provider terminal coding agent.
Fork of OpenAI Codex CLI with Anthropic Claude support, multi-provider auth, extended thinking display, and more.
OpenAI Codex CLI is a great terminal coding agent — but it only works with OpenAI models. We forked it to build Orbit, an AI-native development environment, and needed the terminal agent to work with Anthropic Claude as a first-class provider.
After months of working with Codex as our engine layer, we've decided to build our own from scratch. We tried Codex, OpenCode, and similar projects — they all have fundamental gaps in provider abstraction, approval pipelines, and extensibility that would require a full rewrite to fill properly. Rather than keep patching, we're starting fresh.
Orbit is still shipping. We're open-sourcing this fork because we're no longer using it as our engine, and the work we did here — Claude support, multi-provider auth, thinking display — shouldn't go to waste. Take it, use it, build on it.
Looking for Orbit itself? Download the desktop app.
These are the features we built from scratch on top of the upstream Codex codebase:
Not a wrapper or proxy — a proper provider integration with its own crate (`codex-rs/anthropic/`), model catalog, and API client. Claude models are resolved, displayed, and selectable in the TUI just like OpenAI models.
Switch between OpenAI and Anthropic from inside the TUI. No restart needed.
- `/auth` command opens a provider picker with inline auth flows
- `/model` command shows models from the active provider
- Supports API key entry, OAuth login, and device code flows
- Auth state persisted per-provider — switch back and forth without re-authenticating
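Conceptually, the per-provider persistence is a keyed store: switching providers just looks up the saved state instead of re-running an auth flow. The sketch below uses illustrative names; the real implementation lives in `codex-rs/login/` and stores credentials encrypted via the `secrets` crate.

```rust
use std::collections::HashMap;

// Illustrative auth state; real storage is encrypted, not in-memory.
#[derive(Debug, Clone, PartialEq)]
enum AuthState {
    ApiKey(String),
    OAuthToken { access: String, refresh: String },
}

#[derive(Default)]
struct AuthStore {
    by_provider: HashMap<String, AuthState>,
}

impl AuthStore {
    fn set(&mut self, provider: &str, state: AuthState) {
        self.by_provider.insert(provider.to_string(), state);
    }

    // Switching back to a provider finds its saved state; no re-auth needed.
    fn get(&self, provider: &str) -> Option<&AuthState> {
        self.by_provider.get(provider)
    }
}
```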
A catalog-driven architecture that mirrors what Codex has for GPT models, but for Claude:
- Model IDs resolved and validated against the Anthropic catalog
- Context window sizes, capability flags, and pricing pulled from metadata
- Provider-specific reasoning level labels — "Max" for Claude, "Extra High" for OpenAI
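A catalog entry bundles the model ID, context window, capability flags, and reasoning label in one place, so resolution and display both read from metadata. This is a sketch with hypothetical field names and an illustrative entry, not the actual types in `codex-rs/anthropic/`.

```rust
// Hypothetical catalog entry; fields mirror the metadata described above.
#[derive(Debug, Clone)]
pub struct ModelEntry {
    pub id: &'static str,
    pub context_window: u32,           // in tokens
    pub supports_thinking: bool,       // capability flag
    pub reasoning_label: &'static str, // provider-specific label
}

pub const CATALOG: &[ModelEntry] = &[ModelEntry {
    id: "claude-sonnet-4",
    context_window: 200_000,
    supports_thinking: true,
    reasoning_label: "Max",
}];

/// Resolve and validate a user-supplied model ID against the catalog.
pub fn resolve(id: &str) -> Option<&'static ModelEntry> {
    CATALOG.iter().find(|m| m.id == id)
}
```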
Live streaming of Claude's thinking tokens directly in the TUI:
- Thinking content renders in italic magenta, visually distinct from response text
- Summary vs. raw state fully separated in the protocol
- Replay and finalization handle all edge cases (interrupts, tool calls mid-thought)
- Requests `reasoning.content` from the Responses API so thinking tokens actually reach the TUI
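The visual treatment is simple in principle. As a rough sketch using raw ANSI escape codes (the actual TUI applies the equivalent Ratatui styles):

```rust
// Render thinking content in italic magenta so it reads as distinct
// from plain response text. \x1b[3m is italic, \x1b[35m is magenta.
fn style_thinking(content: &str) -> String {
    format!("\x1b[3;35m{}\x1b[0m", content)
}
```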
When spawning sub-agents, you're prompted to choose the model and reasoning level rather than blindly inheriting the parent's configuration.
`request_user_input` works in all collaboration modes, not just the default. Sub-agents can ask clarifying questions regardless of the approval pipeline configuration.
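The change amounts to dropping a mode check: upstream only allowed user-input requests in the default mode, while this fork allows them everywhere. Mode names below are hypothetical, for illustration only.

```rust
// Hypothetical collaboration modes; the real enum lives in the engine layer.
#[derive(Debug, Clone, Copy, PartialEq)]
enum CollaborationMode {
    Default,
    AutoApprove,
    FullAccess,
}

// Upstream behavior was roughly: matches!(mode, CollaborationMode::Default).
// The fork permits clarifying questions in every mode.
fn can_request_user_input(mode: CollaborationMode) -> bool {
    let _ = mode; // the mode no longer restricts user-input requests
    true
}
```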
Replaced `.codex` with `.orbit` for config directories and environment variables (`ORBIT_HOME`). The legacy `.codex` fallback was cleanly removed.
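The expected resolution order: `ORBIT_HOME` wins if set, otherwise `~/.orbit`, with no legacy `.codex` fallback. Helper names in this sketch are illustrative.

```rust
use std::env;
use std::path::PathBuf;

/// Illustrative config-dir resolution: ORBIT_HOME overrides, else ~/.orbit.
fn orbit_home() -> PathBuf {
    resolve_orbit_home(
        env::var_os("ORBIT_HOME").map(PathBuf::from),
        env::var_os("HOME").map(PathBuf::from),
    )
}

// Split out for testability: takes the env values as plain arguments.
fn resolve_orbit_home(override_dir: Option<PathBuf>, home: Option<PathBuf>) -> PathBuf {
    override_dir.unwrap_or_else(|| home.unwrap_or_default().join(".orbit"))
}
```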
We merged 292 commits from upstream Codex (0.118.0), resolving conflicts in 684 files across a 6-phase sync. The approach was add-only — we never removed our fork's logic during the merge.
All the original Codex features are here too:
- Rich Terminal UI — Ratatui-based TUI with syntax highlighting, tool call visualization, and responsive layout
- Sandboxed Execution — Platform-specific sandboxes (Seatbelt on macOS, Landlock/seccomp on Linux)
- MCP Support — Model Context Protocol server for IDE integrations
- App Server — JSON-RPC WebSocket API for programmatic access
- Session Management — Persistent sessions with SQLite-backed conversation history
- Hooks System — Lifecycle hooks for customizing agent behavior
- Multi-Agent — Spawn and manage multiple agent instances
- Skills Framework — Extensible skill system for custom agent behaviors
- Python & TypeScript SDKs — Programmatic access from your language of choice
- File Operations — Read, write, search, and patch files with approval workflows
- Git Integration — Built-in git operations and diff handling
```shell
# Clone
git clone https://github.com/Recusive/Orbit-Code.git
cd Orbit-Code

# Install dependencies
cd codex-rs && cargo fetch

# Run from source
just codex

# Run tests
just test

# Format & lint
just fmt
just fix
```

- Rust 1.93+ (pinned in `codex-rs/rust-toolchain.toml`)
- Node.js 22+ and pnpm 10+ (for the npm wrapper and TypeScript SDK)
- `just` command runner
- `cargo-nextest` (recommended for faster tests)
```
┌─────────────────────────────────────────────────────────────┐
│                       CONSUMER LAYER                        │
│  tui               Terminal UI (Ratatui)                    │
│  tui_app_server    TUI variant for IDE integration          │
│  cli               Binary entry point                       │
│  app-server        JSON-RPC WebSocket API                   │
│  mcp-server        MCP protocol server                      │
└───────────────────────────┬─────────────────────────────────┘
                            │
┌───────────────────────────▼─────────────────────────────────┐
│                        ENGINE LAYER                         │
│  core                 Agent loop, tool execution, config    │
│  anthropic            Anthropic API client (our addition)   │
│  app-server-protocol  JSON-RPC types (v1 + v2)              │
└───────────────────────────┬─────────────────────────────────┘
                            │
┌───────────────────────────▼─────────────────────────────────┐
│                      FOUNDATION LAYER                       │
│  protocol   Op, EventMsg, SandboxPolicy, etc.               │
│  config     TOML config parsing and layer merging           │
│  hooks      Lifecycle hook execution engine                 │
│  login      OAuth/auth login flows (multi-provider)         │
│  secrets    Encrypted secrets (keyring backend)             │
│  utils/*    ~20 utility crates                              │
└─────────────────────────────────────────────────────────────┘
```
67+ Rust crates. The workspace is in `codex-rs/`; all `just` commands run from there.
```
Orbit-Code/
├── codex-rs/            # Primary Rust codebase (67+ crates)
│   ├── anthropic/       # ★ Anthropic Claude API client
│   ├── login/           # ★ Multi-provider auth (OAuth, API key, device code)
│   ├── core/            # Agent engine, tools, config
│   ├── tui/             # Terminal UI (Ratatui)
│   ├── protocol/        # Message types and prompts
│   ├── cli/             # Binary entry point
│   ├── app-server/      # JSON-RPC WebSocket API
│   ├── mcp-server/      # MCP protocol server
│   ├── hooks/           # Lifecycle hook system
│   ├── config/          # TOML config system
│   ├── state/           # SQLite session persistence
│   └── utils/           # ~20 utility crates
├── sdk/
│   ├── python/          # Python SDK
│   └── typescript/      # TypeScript SDK
├── shell-tool-mcp/      # Shell tool MCP server
├── codex-cli/           # npm package wrapper
├── docs/                # Documentation
└── scripts/             # Build & install scripts
```
Items marked with ★ are our additions to the upstream codebase.
| Command | Description |
|---|---|
| `just codex` | Run Orbit Code from source |
| `just test` | Run all Rust tests (nextest) |
| `just fmt` | Format Rust code |
| `just fix` | Run clippy fixes |
| `just mcp-server-run` | Run the MCP server |
| `just write-config-schema` | Regenerate config schema |
| `just write-app-server-schema` | Regenerate app-server schema |
This is a fork of openai/codex. We track upstream and periodically sync (last sync: 0.118.0, 292 commits merged). Our changes are additive — we don't remove upstream functionality.
If you're looking for the OpenAI-only version, use the upstream repo directly.
We welcome contributions. This is now a community project.
- Bug fixes and improvements — PRs welcome
- New provider integrations — The multi-provider architecture is designed for this
- Hooks and plugins — The hooks system is partially built; help us finish it
Please see CONTRIBUTING.md for development guidelines.
Apache License 2.0. See LICENSE for details.
Upstream Codex code is also Apache 2.0 licensed.
Built by Recursive Labs, the team behind Orbit.
We moved to a custom engine. This fork is now open source — use it, extend it, make it yours.