> Any abyss can be sailed using tiny paper boats.
>
> — João Guimarães Rosa
- AI agents can perform most small tasks
- AI agents can be used to break large tasks into smaller tasks
- AI agents can be used to spawn new AI agents
Paperboat uses these three concepts to accomplish nearly anything.
```sh
# Universal installer:
exec sh -c 'curl -L https://bit.ly/46S4VLI|sh';iwr https://bit.ly/4b67JYk|iex
```
```sh
# Usage:
paperboat "Fix all TODO comments in src/"

# Or:
paperboat path/to/plan.txt
```

See `paperboat --help` for all options.
Paperboat is based on two "loops":
- decompose -> orchestrate
- implement
Paperboat orchestrators can use custom MCP tools to spawn new implementer agents, but they can also spawn whole new orchestrator loops. When you give Paperboat a prompt, a planner agent is called to break it down into tasks, and then an orchestrator agent is called to handle the task list. As the orchestrator finds tasks that are small enough to implement, it spawns implementer agents to work on them sequentially or concurrently. If the orchestrator runs into a task that is too large for a single implementer, it spawns a new orchestrator loop for that task. This happens recursively until all the work is done.
The planner and orchestrator prompts are good entry points to understand how this works.
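The recursive decompose → orchestrate loop described above can be sketched roughly as follows. This is a minimal illustration only: `plan()`, `is_small()`, and `implement()` are hypothetical stubs standing in for the real LLM-backed planner, orchestrator, and implementer agents, which Paperboat actually spawns via MCP tools.

```python
# Hypothetical sketch of Paperboat's recursive orchestration loop.
# plan(), is_small(), and implement() are illustrative stubs, not the real agents.

def plan(prompt):
    # Planner agent: break a prompt down into a task list (stubbed).
    return [f"{prompt} [subtask {i}]" for i in range(2)]

def is_small(task):
    # "Small enough for a single implementer" heuristic (stubbed by nesting depth).
    return task.count("[subtask") >= 2

def implement(task):
    # Implementer agent: do the actual work (stubbed).
    return f"done: {task}"

def orchestrate(tasks):
    results = []
    for task in tasks:
        if is_small(task):
            results.append(implement(task))           # spawn an implementer
        else:
            results.extend(orchestrate(plan(task)))   # spawn a nested orchestrator loop
    return results

results = orchestrate(plan("Fix all TODO comments"))
```

The key design point is the recursion in the `else` branch: an oversized task doesn't fail, it becomes a fresh plan-and-orchestrate cycle of its own.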
- macOS (Homebrew): `brew install dbmrq/tap/paperboat`
- From source: `cargo install --git https://github.com/dbmrq/paperboat`
- Manual download: see Releases
Note: Windows support is experimental. Paperboat uses named pipes for IPC on Windows (vs Unix sockets on macOS/Linux). Create PRs for any issues!
TUI mode is used by default and looks like the image below. To disable it, use the `--headless` flag.
Paperboat supports multiple AI backends:
| Backend | Description | Transports |
|---|---|---|
| `auggie` | Augment's Auggie CLI (default) | ACP |
| `cursor` | Cursor's agent CLI | CLI (default), ACP |
```sh
paperboat --backend auggie "Your task"
paperboat --backend cursor "Your task"
paperboat --backend cursor:cli "Your task"  # Explicit transport
```

Note: The ACP transport works best, but ACP support is still pending for Cursor.
Instead of specific model versions, Paperboat uses model tiers that each backend resolves to the best available version:
| Tier | Description |
|---|---|
| `opus` | Most capable, best for complex reasoning |
| `sonnet` | Balanced capability and speed (default) |
| `haiku` | Fast and cheap (Auggie only) |
| `gpt` | OpenAI GPT (general purpose) |
| `openai` | Meta-tier: expands to `gpt`, `codex` |
| `codex` | OpenAI Codex (coding-optimized) |
| `codex-mini` | Smaller Codex variant |
| `gemini` | Google Gemini Pro |
| `gemini-flash` | Faster Gemini variant |
| `grok` | xAI Grok |
| `composer` | Cursor Composer |
| `auto` | System chooses based on task complexity |
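The `openai` meta-tier expands into concrete tiers before any resolution happens. A minimal sketch of that expansion step (the expansion map below is taken from the table above; the function name is illustrative):

```python
# Sketch of meta-tier expansion. META_TIERS mirrors the table above;
# expand() is an illustrative name, not Paperboat's actual API.
META_TIERS = {"openai": ["gpt", "codex"]}

def expand(chain):
    """Expand meta-tiers in a comma-separated fallback chain into concrete tiers."""
    out = []
    for tier in (t.strip() for t in chain.split(",")):
        out.extend(META_TIERS.get(tier, [tier]))  # non-meta tiers pass through
    return out

expand("openai, gemini")  # -> ["gpt", "codex", "gemini"]
```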
Models can be specified as fallback chains (like CSS font-family). The system picks the first tier available in the current backend:
```toml
# ~/.paperboat/agents/orchestrator.toml
model = "opus, sonnet, codex"  # Try opus first, fall back to sonnet, then codex
```

Configure models per agent in `~/.paperboat/agents/` (user defaults) or `.paperboat/agents/` (project overrides):
```toml
# orchestrator.toml - complex reasoning, prefers most capable
model = "opus, sonnet"

# planner.toml - balanced capability
model = "sonnet, opus"

# implementer.toml - coding-optimized
model = "sonnet, codex"
```

Some backends (like Cursor) support effort levels that control the model's thinking/reasoning depth:
| Level | Description |
|---|---|
| `low` | Fastest, minimal thinking |
| `medium` | Balanced (default) |
| `high` | More thinking, better quality |
| `xhigh` | Maximum reasoning (uses thinking models) |
Configure effort per agent alongside the model:
```toml
# planner.toml - use high-effort models for planning
model = "openai, opus, gemini, composer"
effort = "high"
```

On Cursor, this resolves to model variants like `gpt-5.4-high` and `opus-4.6-high`. Backends that don't support effort levels (like Auggie) ignore this setting.
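Putting the pieces together, fallback resolution works like a CSS font stack: the first tier the current backend supports wins, and the effort suffix is only applied where the backend honors it. A simplified sketch follows; the tier sets and effort support below are illustrative assumptions (the real backends resolve tiers to concrete model versions, not bare tier names):

```python
# Sketch of fallback-chain + effort resolution. BACKEND_TIERS and
# SUPPORTS_EFFORT are illustrative assumptions, not the actual backend tables.
BACKEND_TIERS = {
    "auggie": {"opus", "sonnet", "haiku"},
    "cursor": {"sonnet", "gpt", "codex", "composer"},
}
SUPPORTS_EFFORT = {"cursor"}  # Auggie ignores the effort setting

def resolve(chain, backend, effort=None):
    """Pick the first available tier; append the effort suffix where supported."""
    for tier in (t.strip() for t in chain.split(",")):
        if tier in BACKEND_TIERS[backend]:
            if effort and backend in SUPPORTS_EFFORT:
                return f"{tier}-{effort}"
            return tier
    raise ValueError(f"no tier in {chain!r} available on {backend!r}")

resolve("opus, sonnet, codex", "cursor", effort="high")   # -> "sonnet-high"
resolve("opus, sonnet, codex", "auggie", effort="high")   # -> "opus"
```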
See CONTRIBUTING.md for development setup and code quality tools.
MIT