Index your entire codebase, git history, PRs, tickets, and discussions into a knowledge graph, then chat with it in plain English.
Every developer has felt this: you join a project (or return to your own after 6 months) and spend days just figuring out how the code connects, why decisions were made, and what breaks if you change something.
CodeMind fixes that. One command indexes your repo into a queryable knowledge graph; then you just ask it.
$ codemind chat
> Why does the auth middleware skip OPTIONS requests?
> Which files would break if I rename UserService?
> Who wrote the payment module and what PR introduced it?
| Feature | Description |
|---|---|
| 🗺️ Living architecture map | Auto-generated graph of your codebase: files, functions, classes, and their relationships |
| 💬 Chat with your code | Natural-language Q&A powered by a free local LLM (no API keys needed) |
| 🔗 Code + context linked | Connects code to the PRs, Jira tickets, and Slack threads that explain why it exists |
| 🔮 Impact predictor | Ask "what breaks if I change X?" before touching anything |
| 👥 Onboarding paths | Auto-generate "start here" guides for new team members |
| 🚨 Drift detection | Alerts when the codebase diverges from its documented architecture |
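At its core, the impact predictor is a reverse-dependency walk over the knowledge graph. A minimal sketch of the idea in plain Python (the toy dependency map and file names below are illustrative, not CodeMind's actual schema):

```python
from collections import defaultdict, deque

def impacted_by(target: str, imports: dict) -> set:
    """Return everything that (transitively) depends on target."""
    # Invert the import map: dep -> set of modules that import it
    reverse = defaultdict(set)
    for src, deps in imports.items():
        for dep in deps:
            reverse[dep].add(src)
    # Breadth-first walk over reverse edges
    seen, queue = set(), deque([target])
    while queue:
        node = queue.popleft()
        for dependent in reverse[node]:
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

# Toy dependency map: module -> modules it imports
deps = {
    "api/routes.py": ["services/user_service.py"],
    "services/user_service.py": ["db/models.py"],
    "jobs/cleanup.py": ["services/user_service.py"],
    "db/models.py": [],
}
print(sorted(impacted_by("services/user_service.py", deps)))
# → ['api/routes.py', 'jobs/cleanup.py']
```

The real graph adds more edge types (calls, PR references, ticket links), but the "what breaks if I change X?" answer is this traversal.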
- Python 3.9+
- Docker + Docker Compose (runs Neo4j and Qdrant)
- Ollama (free local LLM runtime)
# 1. Clone
git clone https://github.com/vineethwilson15/codemind.git
cd codemind
# 2. Start databases (Neo4j + Qdrant)
docker compose up -d
# 3. Pull the LLM (free, runs locally; no API key needed)
ollama pull codellama
# 4. Install Python dependencies
pip install -r requirements.txt
# 5. Configure environment
cp .env.example .env  # defaults work out of the box

# Index any codebase (yours, an OSS project, anything)
python -m cli index --repo /path/to/your/project
# Start chatting
python -m cli chat

CodeMind is pluggable: connect only what your team uses.
| Category | Supported |
|---|---|
| Git Hosts | GitHub · GitLab · Bitbucket · Azure Repos |
| Issue Trackers | Jira · Azure Boards · Linear · GitHub Issues |
| Chat / Discussions | Slack · Microsoft Teams |
| LLM | Ollama (default, free, local) · OpenAI · Anthropic |
Configure via .env; see .env.example for all options.
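For orientation, a .env for an all-local setup might look like the fragment below. The variable names and default ports here are illustrative assumptions; .env.example is the authoritative list.

```env
# Hypothetical .env sketch – names and ports are assumptions, check .env.example
NEO4J_URI=bolt://localhost:7687
QDRANT_URL=http://localhost:6333
OLLAMA_MODEL=codellama
# Optional integrations – leave blank to stay fully local
GITHUB_TOKEN=
JIRA_API_TOKEN=
```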
Your Codebase + Git History
            │
            ▼
 ┌─────────────────────┐
 │  AST Parser         │  tree-sitter: extracts functions,
 │  (core/ast_parsers) │  classes, imports per language
 └──────────┬──────────┘
            │
            ▼
 ┌───────────────┐       ┌────────────────────┐
 │  Neo4j Graph  │◄─────►│  Qdrant Vectors    │
 │  knowledge    │       │  semantic search   │
 │  graph        │       │  (local embeddings)│
 └───────┬───────┘       └────────────────────┘
         │
         ▼
 ┌───────────────┐
 │  Ollama LLM   │  free, local; codellama by default
 │  (core/chat)  │
 └───────┬───────┘
         │
         ▼
 ┌──────────────────────────┐
 │    CLI · API · VS Code   │
 └──────────────────────────┘
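CodeMind's parser uses tree-sitter so it can handle many languages; the core idea of the parsing stage, walking a syntax tree to pull out functions, classes, and imports as graph nodes, can be sketched with Python's stdlib ast module (a simplified stand-in, not CodeMind's actual parser):

```python
import ast

def extract_symbols(source: str) -> dict:
    """Collect function names, class names, and imported modules from Python source."""
    tree = ast.parse(source)
    symbols = {"functions": [], "classes": [], "imports": []}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            symbols["functions"].append(node.name)
        elif isinstance(node, ast.ClassDef):
            symbols["classes"].append(node.name)
        elif isinstance(node, ast.Import):
            symbols["imports"].extend(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom):
            symbols["imports"].append(node.module or "")
    return symbols

code = """
import os
from pathlib import Path

class UserService:
    def get_user(self, uid):
        return uid
"""
print(extract_symbols(code))
# functions: ['get_user'], classes: ['UserService'], imports: ['os', 'pathlib']
```

Each extracted name becomes a node in the graph, with edges for containment ("UserService defines get_user") and imports.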
By default, everything runs 100% locally: no data leaves your machine. Enabling cloud LLMs (OpenAI, Anthropic) or external integrations (GitHub, Jira, Slack) sends the relevant data to those providers.
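The Qdrant layer in the diagram above stores one embedding per code chunk and answers "which code is most relevant to this question?" by vector similarity. The essence, using tiny hand-made vectors in place of a real embedding model (file names and vectors are toy data):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" – in CodeMind these come from a local embedding model
chunks = {
    "auth/middleware.py": [0.9, 0.1, 0.0],
    "payments/charge.py": [0.1, 0.8, 0.3],
    "utils/dates.py":     [0.0, 0.2, 0.9],
}

def search(query_vec, k=1):
    """Return the k chunks whose embeddings are closest to the query vector."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, chunks[c]), reverse=True)
    return ranked[:k]

print(search([0.85, 0.15, 0.05]))  # a query "near" the auth vector ranks auth/middleware.py first
```

Qdrant does the same ranking at scale with indexed approximate nearest-neighbor search; the retrieved chunks are then handed to the LLM as context.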
- Phase 1 – Core Engine: repo indexer, knowledge graph, local LLM chat, CLI
- Phase 2 – Integrations: GitHub/GitLab, Jira/Azure Boards, Slack/Teams
- Phase 3 – UI & Extension: VS Code extension, web UI with live architecture map
Contributions are welcome and appreciated!
- 🐛 Report a bug
- 💡 Request a feature
- 💬 Ask a question
- 📖 Read CONTRIBUTING.md
# Run unit tests (no Docker needed)
python -m pytest tests/unit/
# Run integration tests (requires docker compose up -d)
python -m pytest tests/integration/

MIT – see LICENSE