A collaborative context board for sharing projects between ChatGPT, Claude, coding agents, and humans.
Dump is a visual canvas for collecting links, notes, checklists, and project briefs. Use it as a shared project context layer: one board your team can read, your AI tools can consume, and your future self can actually find again.
Live at dump.page
- Share a project between ChatGPT and Claude without rebuilding the same context in every chat
- Create a reusable project brief for Claude Code, Codex, Cursor, or other coding agents
- Replace scattered Slack threads and bookmark folders with one shared research board
- Keep multimodal context outside any single LLM vendor or chat history
- Expose board content through MCP and `llms.txt` for agent-friendly workflows
- Freeform canvas with drag-and-drop
- Link cards with automatic metadata extraction (title, favicon, preview image)
- Text notes with rich text editing
- Checklists
- Board sharing (private, shared via link, or public)
- Real-time collaboration via Convex
- MCP server for ChatGPT, Claude, Codex, Cursor, and other AI tool integrations
- `llms.txt` endpoints for shared and public boards
- Local-first mode (use without an account)
Different people find Dump through different problems. These are all valid ways to describe the product:
- Shared project context for ChatGPT and Claude
- AI project handoff board
- MCP-ready context layer for agents
- Research hub for modern product and engineering teams
- Human-and-AI-readable whiteboard for links, notes, and decisions
- Framework: Next.js (App Router) + React 19
- Backend: Convex (serverless database + functions)
- Canvas: React Flow (@xyflow/react)
- Auth: Firebase (Google OAuth)
- Styling: TailwindCSS 4 + shadcn/ui
- Language: TypeScript
1. Clone the repo:

   ```bash
   git clone https://github.com/Vochsel/dump.page.git
   cd dump
   ```

2. Install dependencies:

   ```bash
   bun install
   ```

3. Copy the environment template and fill in your values:

   ```bash
   cp .env.example .env.local
   ```

4. Start the Convex dev server:

   ```bash
   bun run deploy:convex:dev
   ```

5. Start the development server:

   ```bash
   bun run dev
   ```
See .env.example for the full list. Key variables:
| Variable | Description |
|---|---|
| `NEXT_PUBLIC_CONVEX_URL` | Your Convex deployment URL |
| `NEXT_PUBLIC_FIREBASE_*` | Firebase project config values |
| `FIREBASE_PROJECT_ID` | Firebase project ID (used by Convex auth) |
| `SCRAPINGBEE_API_KEY` | Optional; used for link metadata scraping |
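As an illustration of how these variables are consumed (this helper is a sketch, not part of the repo), a startup check can fail fast when required values are unset:

```typescript
// Sketch: fail fast if required env vars are missing.
// Variable names come from the table above; this helper is
// illustrative and not part of the Dump codebase.
const requiredEnv = ["NEXT_PUBLIC_CONVEX_URL", "FIREBASE_PROJECT_ID"];

function missingEnv(
  env: Record<string, string | undefined> = process.env
): string[] {
  return requiredEnv.filter((name) => !env[name]);
}

const missing = missingEnv();
if (missing.length > 0) {
  console.error(`Missing env vars: ${missing.join(", ")}`);
}
```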
| Command | Description |
|---|---|
| `bun run dev` | Start dev server (uses .env.local) |
| `bun run build` | Production build |
| `bun run lint` | Run ESLint |
| `bun run test` | Run tests |
| `bun run deploy:convex:dev` | Start Convex dev server |
| `bun run deploy:convex` | Deploy Convex to production |
Dump exposes an MCP server at https://www.dump.page/api/mcp so AI tools can read and write to your boards.
If you want the short version: Dump is a practical way to give ChatGPT, Claude, Claude Code, Codex, and similar tools the same project context instead of pasting links and notes into every conversation.
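Normally your AI tool performs the MCP handshake for you once the server is configured. Purely as a sketch of what happens on the wire (assuming the endpoint speaks MCP's Streamable HTTP transport, i.e. JSON-RPC 2.0 over POST), an `initialize` request looks roughly like:

```typescript
// Illustrative only: a raw JSON-RPC 2.0 "initialize" request, the first
// message an MCP client sends. Clients like Claude or Cursor do this
// automatically; you never need to send it by hand.
const initializeRequest = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2025-03-26",
    capabilities: {},
    clientInfo: { name: "example-client", version: "0.1.0" },
  },
};

async function handshake(): Promise<Response> {
  return fetch("https://www.dump.page/api/mcp", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Streamable HTTP servers may reply with JSON or an SSE stream.
      Accept: "application/json, text/event-stream",
    },
    body: JSON.stringify(initializeRequest),
  });
}
```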
Shared and public boards can also be installed with vercel-labs/skills via npx skills add.
```bash
# Public board
npx skills add https://www.dump.page/b/<board-slug>

# Board shared via link
npx skills add https://www.dump.page/s/<share-token>/b/<board-slug>
```

Dump serves a hosted SKILL.md for each board through the well-known skills endpoints. The installed skill is a snapshot of the board at install/update time. For live reads, search, and write access, use the Dump MCP server instead.
Claude Code
```bash
claude mcp add dump-mcp --transport http https://www.dump.page/api/mcp
```

Claude Desktop / Claude.ai
Go to Settings → Integrations → Add More and paste:
https://www.dump.page/api/mcp
ChatGPT
Go to Settings → Connectors → Add connector and paste:
https://www.dump.page/api/mcp
Cursor
Go to Cursor Settings → MCP → Add new MCP Server. Use url type with:
https://www.dump.page/api/mcp
Or add to your .cursor/mcp.json:

```json
{
  "mcpServers": {
    "dump-mcp": {
      "url": "https://www.dump.page/api/mcp"
    }
  }
}
```

VS Code
Add to your VS Code settings (JSON):

```json
{
  "mcp": {
    "servers": {
      "dump-mcp": {
        "type": "http",
        "url": "https://www.dump.page/api/mcp"
      }
    }
  }
}
```

Windsurf
Add to your ~/.codeium/windsurf/mcp_config.json:

```json
{
  "mcpServers": {
    "dump-mcp": {
      "serverUrl": "https://www.dump.page/api/mcp"
    }
  }
}
```

Codex
```bash
codex mcp add dump-mcp --transport http https://www.dump.page/api/mcp
```

Or add to ~/.codex/config.toml:

```toml
[mcp_servers.dump-mcp]
type = "http"
url = "https://www.dump.page/api/mcp"
```

- Blog: how to share project context between ChatGPT and Claude
- Blog: the AI project handoff board
- Blog: MCP is better with shared context
See CONTRIBUTING.md for guidelines.