Long-term memory and multi-agent orchestration for Claude Code.
Hive is a Claude Code plugin that gives Claude two superpowers it doesn't have by default:
- Persistent memory across all projects and sessions — stored locally in SQLite, never sent anywhere.
- Specialist agents (Planner, Coder, Debugger, Reviewer) running in parallel, coordinated by an orchestrator that loads relevant memories before dispatching work.
| Problem | Hive's solution |
|---|---|
| Claude forgets everything between sessions | SQLite FTS5 memory, survives restarts |
| One agent does everything mediocrely | Specialists: plan → code → debug → review |
| Memory systems need Chroma, LangChain, etc. | Zero dependencies: just sqlite3 (built into macOS) |
| Compression costs tokens over time | claude-haiku compresses old memories cheaply |
| Memory leaks across unrelated projects | Per-project scoping + global fallback |
```bash
git clone https://github.com/avinashcheby/Hive-Claude-Code-Skill
cd Hive-Claude-Code-Skill
bash install.sh
```

`install.sh`:

- Copies scripts to `~/.hive/scripts/`
- Copies skills to `~/.claude/skills/hive*/` (auto-discovered by Claude Code)
- Initializes the SQLite database at `~/.hive/memory.db`
- Cleans up any previous plugin-registry install
Then restart Claude Code. The /hive skills will be available automatically — no flags needed.
Requirements: macOS (sqlite3 built-in) or Linux with sqlite3 + FTS5. No Python, Node, or npm.
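Before installing, you can confirm your `sqlite3` build includes FTS5. This preflight is an illustrative check, not part of `install.sh`:

```shell
# Preflight: confirm sqlite3 exists and was compiled with FTS5.
# Creating an FTS5 virtual table in an in-memory DB fails fast if not.
if sqlite3 ":memory:" "CREATE VIRTUAL TABLE t USING fts5(x);" 2>/dev/null; then
  echo "sqlite3 with FTS5: OK"
else
  echo "sqlite3 with FTS5: missing" >&2
fi
```

macOS's built-in `sqlite3` and most Linux distro packages ship with FTS5 enabled.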
The main command. Give it any task and Hive will:
- Load relevant memories from past sessions
- Route to appropriate specialist agents (in parallel)
- Synthesize the results
- Save learnings back to memory
```
/hive build a REST API for user authentication with JWT
/hive refactor the database layer to use connection pooling
/hive the login endpoint returns 500 when the email has a + sign
```
Browse and manage your memory store.
```
/hive-memory search postgres
/hive-memory list error
/hive-memory delete 42
```
Show memory counts, recent sessions, DB size, and compression history.
Compress old, low-importance memories using claude-haiku. Run this after large sessions to keep the DB lean.
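As a sketch of what "old, low-importance, never accessed" could mean as a selection query (the table and column names here are assumptions for illustration, not Hive's actual schema):

```shell
# Illustrative eligibility query for compression candidates.
# Schema is hypothetical; shown against a throwaway in-memory DB.
sqlite3 ":memory:" <<'SQL'
CREATE TABLE memories (id INTEGER, importance INTEGER,
                       access_count INTEGER, age_days REAL);
INSERT INTO memories VALUES (1, 2, 0, 45.0);  -- low-value and stale: eligible
INSERT INTO memories VALUES (2, 8, 3,  5.0);  -- important and in use: kept
SELECT id FROM memories
WHERE importance <= 3 AND access_count = 0 AND age_days > 30;
SQL
```

Only row 1 is selected; the thresholds correspond to the env vars described below.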
Memories are stored in `~/.hive/memory.db` (SQLite with FTS5 full-text search).
Five memory types:
| Type | Example |
|---|---|
| `fact` | "Project uses PostgreSQL 15 with pgvector" |
| `decision` | "Chose Redis over Memcached because TTL semantics are simpler" |
| `pattern` | "All API handlers return (Response, StatusCode), never panic" |
| `error` | "SQLite 'database is locked' when WAL mode is off" |
| `preference` | "Prefers table-driven tests, dislikes mocks" |
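FTS5 is what makes `/hive-memory search` fast full-text search rather than a `LIKE` scan. A minimal sketch of the kind of query involved (virtual-table and column names are assumptions, not Hive's actual schema):

```shell
# Minimal FTS5 search sketch against a throwaway in-memory DB.
# Prefix queries like 'postgres*' match the case-folded token "postgresql".
sqlite3 ":memory:" <<'SQL'
CREATE VIRTUAL TABLE memories_fts USING fts5(content);
INSERT INTO memories_fts VALUES
  ('Project uses PostgreSQL 15 with pgvector'),
  ('Prefers table-driven tests, dislikes mocks');
SELECT content FROM memories_fts
WHERE memories_fts MATCH 'postgres*';
SQL
```

Only the PostgreSQL row matches, even though the stored text differs in case from the query.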
Ranking formula:
```
score = (importance/10) × (1 / (1 + days_since_access)) × (1 + access_count)
```
High importance + recent + frequently used = top of recall results.
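The formula above can be expressed directly in SQL. In this sketch the table and column names are assumptions for illustration, not Hive's actual schema:

```shell
# Illustrative SQL version of the ranking formula, run against a
# throwaway in-memory DB with a hypothetical schema.
sqlite3 ":memory:" <<'SQL'
CREATE TABLE memories (id INTEGER, content TEXT, importance INTEGER,
                       days_since_access REAL, access_count INTEGER);
INSERT INTO memories VALUES
  (1, 'old, rarely used fact', 3, 60.0, 0),
  (2, 'fresh, frequently used decision', 8, 1.0, 5);
SELECT id, content,
       (importance / 10.0)
       * (1.0 / (1 + days_since_access))
       * (1 + access_count) AS score
FROM memories
ORDER BY score DESC;
SQL
```

Row 2 (important, recent, frequently accessed) scores 2.4 and ranks first; row 1 scores roughly 0.005.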
Compression: Memories with importance ≤ 3, never accessed, older than 30 days are summarized by claude-haiku into a single pattern entry. Tune with env vars:
```bash
export HIVE_COMPRESS_AGE=30            # days threshold
export HIVE_COMPRESS_MAX_IMPORTANCE=3  # max importance to compress
export HIVE_COMPRESS_BATCH=20          # memories per batch
```

Hive has four specialist agents:
| Agent | Role | When used |
|---|---|---|
| Planner | Decomposes task → steps, files, risks | Multi-step tasks, architecture |
| Coder | Implements code + tests | Any implementation work |
| Debugger | Root cause analysis + minimal fix | Error messages, test failures |
| Reviewer | Security, correctness, convention | Code touching APIs, auth, storage |
The orchestrator (/hive) decides which agents to launch based on the task, launches them in parallel, then synthesizes the results. It never launches agents whose output it won't use.
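As a toy illustration of task-based routing, a keyword dispatcher might look like the sketch below. This is purely hypothetical: the real orchestrator reasons about the task rather than matching keywords, and `route_agents` is not a function in this repo.

```shell
# Toy keyword-based routing sketch (hypothetical; Hive's orchestrator
# decides agents by reasoning about the task, not by pattern matching).
route_agents() {
  local task="$1" agents=""
  case "$task" in *error*|*fails*|*500*)          agents="debugger" ;; esac
  case "$task" in *build*|*implement*|*refactor*) agents="$agents planner coder" ;; esac
  case "$task" in *auth*|*API*|*storage*)         agents="$agents reviewer" ;; esac
  echo "${agents# }"   # trim the leading space, if any
}

route_agents "build a REST API for user authentication with JWT"
# → planner coder reviewer
```

An error report such as "the login endpoint returns 500" would route to the Debugger alone.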
| Env var | Default | Description |
|---|---|---|
| `HIVE_DB` | `~/.hive/memory.db` | Override DB location |
| `HIVE_HOME` | `~/.hive` | Override DB directory |
| `HIVE_COMPRESS_AGE` | `30` | Days before compression eligibility |
| `HIVE_COMPRESS_MAX_IMPORTANCE` | `3` | Max importance level to compress |
| `HIVE_COMPRESS_BATCH` | `20` | Memories per compression batch |
```bash
bash tests/run_all.sh
```

Tests use isolated temp DBs and do not make network calls.
See CLAUDE.md for contributor conventions.
Key rules:
- No Python/Node dependencies — bash + sqlite3 only
- Bash 3.2 compatible (macOS built-in)
- `set -euo pipefail` on every script
- Max 50 lines per function, 500 lines per file
- Every script in `scripts/` gets a test in `tests/`
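A minimal skeleton that satisfies those rules might look like this (the script name and function are hypothetical examples, not files in this repo):

```shell
#!/usr/bin/env bash
# Hypothetical scripts/example.sh following the conventions above:
# bash 3.2 compatible, strict mode, small single-purpose functions.
set -euo pipefail

greet() {   # keep each function well under the 50-line limit
  printf 'hello from hive\n'
}

greet
```

A matching `tests/test_example.sh` would run the script in a temp directory and assert on its output.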
MIT