ayushptl1810/ConstitutionAgent

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

24 Commits
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

Samvidha: Constitutional Intelligence Agent

Samvidha is a state-of-the-art GraphRAG (Graph Retrieval-Augmented Generation) platform designed for high-precision constitutional legal research. It transforms the Indian Constitution into a dynamic, queryable intelligence network, allowing users to trace the evolution of laws across decades of amendments.


🏗️ System Architecture

Samvidha uses a multi-agent orchestration layer powered by LangGraph, combining the structural precision of a Graph Database with the semantic depth of Vector Search.

The Intelligence Loop (LangGraph)

  1. Intent Classification: Decomposes user queries into constitutional entities (Articles, Amendments, Parts).
  2. Parallel Retrieval:
    • Graph Routing (Neo4j): Executes multi-hop Cypher queries to find cross-references (e.g., "Article 358 suspends Article 19").
    • Semantic Search (Qdrant): Fetches raw text chunks using dense vector embeddings.
  3. Hybrid Reranking: Uses a BGE-Reranker model to score and fuse results from both graph and vector paths.
  4. Temporal Organization: Chronologically sorts modifications to ensure the "Hierarchy of Truth" favors the latest amendments.
  5. Legal Reasoning: A specialized LLM synthesizes the context into a formal legal opinion with strict citation requirements.
  6. Quality Assurance: A critique agent validates the answer against the retrieved facts, triggering a recursive "Refine" loop if gaps are detected.
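The control flow of the loop above can be sketched in plain Python. This is an illustrative stand-in, not the project's actual code: the real system wires these stages as LangGraph nodes, and every function body and name below (including the `MAX_REFINEMENTS` cap) is an assumption for demonstration.

```python
MAX_REFINEMENTS = 2  # assumed cap on the recursive "Refine" loop

def classify_intent(query: str) -> dict:
    """Stub: decompose the query into constitutional entities."""
    return {"articles": ["19", "358"], "amendments": [], "parts": []}

def graph_retrieve(entities: dict) -> list[str]:
    """Stub: multi-hop Cypher results from Neo4j."""
    return ["Article 358 suspends Article 19 during an Emergency."]

def vector_retrieve(query: str) -> list[str]:
    """Stub: dense-vector chunks from Qdrant."""
    return ["Article 19 guarantees freedom of speech..."]

def rerank(query: str, candidates: list[str]) -> list[str]:
    """Stub: cross-encoder (BGE-Reranker) fusion of both paths."""
    return sorted(candidates)  # placeholder ordering

def reason(query: str, context: list[str]) -> str:
    """Stub: LLM synthesis into a cited legal opinion."""
    return f"Opinion based on {len(context)} sources."

def critique(answer: str, context: list[str]) -> bool:
    """Stub: QA agent; True means the answer is grounded."""
    return True

def run_loop(query: str) -> str:
    entities = classify_intent(query)
    for _attempt in range(MAX_REFINEMENTS + 1):
        context = rerank(query, graph_retrieve(entities) + vector_retrieve(query))
        answer = reason(query, context)
        if critique(answer, context):
            return answer
    return answer  # refinement budget spent: return the last draft
```

The key structural point is the critique-gated loop at the end: a failed validation re-enters retrieval rather than returning an ungrounded answer.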

⚡ Real-Time Intelligence Streaming: The entire workflow is served via Server-Sent Events (SSE). The frontend terminal streams execution traces live as agents complete their tasks, giving immediate feedback while the final opinion is still being generated.
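For reference, each trace update travels as a standard SSE frame. The helper below is a minimal sketch of that wire format; the event names and payload fields ("trace", "agent", "status") are illustrative assumptions, not Samvidha's actual schema.

```python
import json

def sse_event(event: str, data: dict) -> str:
    """Serialize one trace update in Server-Sent Events wire format.
    Event name and payload shape are illustrative, not the real API's."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

# As each LangGraph node finishes, the backend can yield one such frame;
# a FastAPI StreamingResponse with media_type="text/event-stream" would
# stream these chunks straight to the frontend terminal.
frame = sse_event("trace", {"agent": "graph_router", "status": "done"})
```

The double newline terminating each frame is what lets the browser's `EventSource` split the stream into discrete events.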


🛠️ Technology Stack

Backend (Python / FastAPI)

  • Graph Database: Neo4j (Local Docker or Neo4j Aura Cloud).
  • Vector Database: Qdrant (Local Docker or Qdrant Cloud).
  • Orchestration: LangGraph (for complex node-based agent workflows).
  • Models:
    • Reasoning: Llama 3.3 70B via Groq (Ultra-fast inference).
    • Reranking: BAAI/bge-reranker-base (Cross-Encoder).
    • Embeddings: HuggingFace/gte-large.

Frontend (React / Vite)

  • 3D Visualization: react-force-graph-3d (Three.js) for interactive constitutional mapping.
  • State Management: Zustand (real-time store for search results and execution traces).
  • Styling: Vanilla CSS with a "Classified Dossier" aesthetic.

📂 Project Structure

```
├── agent/
│   ├── core/              # LangGraph workflow definitions & ingestion logic
│   ├── services/          # Core intelligence modules (Neo4j, Qdrant, Reasoner)
│   ├── utils/             # Configuration & system settings
│   ├── .env.template      # Environment configuration template
│   └── main.py            # FastAPI entry point & Docker lifecycle management
├── frontend/
│   ├── src/
│   │   ├── components/    # 3D Graph, Legal Dossier, System Terminal
│   │   └── store/         # Zustand global state (LexStore)
├── Dockerfile.neo4j       # Custom Neo4j build with data pre-loading
├── Dockerfile.qdrant      # Custom Qdrant build with data pre-loading
├── neo4j_data/            # Persistent graph storage
└── qdrant_storage/        # Persistent vector storage
```

🚀 Getting Started

Prerequisites

  • Python 3.10+
  • Node.js 20+
  • Docker (required for automated database lifecycle)
  • Groq API Key

Backend Setup (Local & Cloud)

  1. Navigate to the agent directory:

     ```bash
     cd agent
     ```

  2. Create and activate a virtual environment:

     ```bash
     python -m venv .venv
     source .venv/bin/activate
     ```

  3. Install dependencies:

     ```bash
     pip install -r requirements.txt
     ```

  4. Configure environment variables: Samvidha supports Hybrid Cloud Mode. Copy the template to get started:

     ```bash
     cp .env.template .env
     ```

     Then fill in your `GROQ_API_KEY`. To use cloud databases, provide `NEO4J_URI` and `QDRANT_ENDPOINT`; otherwise, the system defaults to local Docker.

  5. Launch the API:

     ```bash
     uvicorn main:app --reload
     ```

     Note: The API will automatically start the Neo4j and Qdrant Docker containers on startup.

Deployment to Hugging Face Spaces

The agent is pre-configured for deployment as a Docker Space on Hugging Face:

  1. Create a new Space on Hugging Face and select Docker as the SDK.
  2. Either point the Space's repository structure at the agent/ directory, or push only the agent/ folder to the Space.
  3. Add your secrets (from .env) to the Hugging Face Space Secrets settings:
    • GROQ_API_KEY, NEO4J_URI, NEO4J_PASSWORD, QDRANT_ENDPOINT, etc.
  4. The space will automatically use the provided agent/Dockerfile and run on port 7860.

Frontend Setup

  1. Navigate to the frontend directory:

     ```bash
     cd frontend
     ```

  2. Install dependencies:

     ```bash
     npm install
     ```

  3. Start the development server:

     ```bash
     npm run dev
     ```

☁️ Cloud vs. Local Deployment

Samvidha features a Zero-Config Lifecycle Manager. It automatically detects your deployment environment based on your .env configuration:

  • Local Developer Mode: If no cloud URIs are detected, the API automatically pulls and launches Neo4j and Qdrant Docker containers on startup. It handles the entire lifecycle (start/stop) and persists data to local volumes.
  • Hybrid Cloud Mode: If a NEO4J_URI (Aura) or QDRANT_ENDPOINT (Cloud) is provided, the system suppresses the local Docker lifecycle and establishes high-performance remote connections. This is recommended for production-grade reliability and larger datasets.

🔍 Knowledge Graph Relationships

Unlike traditional RAG systems that treat law as flat text, Samvidha models the Constitution as a Directed Acyclic Graph (DAG) of legal effects:

| Relationship | Description | Data Impact |
| --- | --- | --- |
| AMENDS | A standard modification to text. | Updates property values. |
| OVERRIDES | A "Notwithstanding" clause. | Prioritizes target node in logic. |
| SUSPENDS | Operational pause (e.g., Art 358). | Marks target as "Inactive" in context. |
| SUBSTITUTES | Full textual replacement. | Deprecates previous versions. |
| REPEALS/OMITS | Removal from the Constitution. | Logic-gate for removal from current law. |
| REFERS_TO | Cross-article citation. | Triggers automatic multi-hop retrieval. |
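To make the REFERS_TO multi-hop behavior concrete, here is a toy breadth-first expansion over an in-memory adjacency map. In the deployed system this expansion is a Cypher query against Neo4j; the graph contents and hop limit below are illustrative.

```python
from collections import deque

# Toy adjacency map standing in for the Neo4j graph; edges are
# (relationship, target) pairs. Contents are illustrative only.
GRAPH = {
    "Article 19": [("REFERS_TO", "Article 358")],
    "Article 358": [("SUSPENDS", "Article 19"), ("REFERS_TO", "Article 352")],
    "Article 352": [],
}

def multi_hop(start: str, max_hops: int = 2) -> set:
    """Collect every provision reachable within max_hops edges,
    mimicking the multi-hop retrieval that REFERS_TO triggers."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for _rel, target in GRAPH.get(node, []):
            if target not in seen:
                seen.add(target)
                frontier.append((target, depth + 1))
    return seen
```

Starting from Article 19, two hops pull in both Article 358 (the suspension) and Article 352 (the Emergency provision it cites), which is exactly the cross-reference chain flat-text RAG would miss.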

🖥️ Interactive Visualization

The platform features a proprietary Nexus Discovery Interface:

  • 3D Intelligence Map: A live Three.js visualization of the entire constitutional network. Nodes are color-coded (Red: Amendment, Yellow: Article, Blue: Clause).
  • Temporal Timeline: A scrubbable timeline that allows users to travel back to any year (e.g., 1976) and see exactly which provisions existed at that point in history.
  • System Execution Trace: A real-time terminal readout of the LangGraph agent's thinking process, providing transparency into how the AI is fetching and validating facts.
  • Adaptive Architecture: A fully responsive mobile experience that transforms the Legal Dossier into a Bottom Sheet and optimizes the 3D Graph into a low-power, non-interactive cinematic background on smaller viewports.

🧠 Key Intelligence Services

Graph Planner (graph_planner.py)

Generates 1-4 distinct Cypher queries based on the query type (Discovery Mode vs. Deep Dive). It manages type-safety (integers for amendments, strings for articles) to ensure zero-miss retrieval.
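The type-safety concern is easy to illustrate: a Cypher parameter of the wrong type silently matches nothing. The normalizer below is a hypothetical sketch of that guard, not the planner's real interface; the int-for-amendments, string-for-articles convention is taken from this README.

```python
def normalize_entity(kind: str, value: str):
    """Coerce a Cypher query parameter to the node's property type:
    amendment numbers are stored as integers, article numbers as
    strings (articles like "21A" are not numeric). A mistyped
    parameter would silently match zero nodes, which is the failure
    mode the planner's "zero-miss" guarantee is guarding against.
    (Sketch only; graph_planner.py's actual interface may differ.)"""
    if kind == "amendment":
        return int(value)           # "42"  -> 42
    return str(value).upper()       # "21a" -> "21A"
```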

Legal Reasoner (legal_reasoner.py)

The "Supreme Court Researcher" agent. Enforces a strict "Hierarchy of Truth" where Graph relationships (DELETED, AMENDS) override semantic chunks if contradictions occur.

Hybrid Retrieval (hybrid_retrieval.py)

Fuses graph structural data with keyword search. It applies "temporal boosting" to ensure that the latest amendments are prioritized in the final answer synthesis.
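One way to picture temporal boosting is as a recency bonus added to each hit's relevance score before final ranking. The linear per-year scheme below is an assumed example, not the actual formula in hybrid_retrieval.py.

```python
def temporal_boost(hits: list, weight: float = 0.05) -> list:
    """Re-score retrieval hits so newer amendments outrank older ones
    of similar relevance. Each hit is a dict with "score" and "year";
    the linear year bonus is an illustrative scheme, not the real one."""
    base_year = 1950  # the Constitution comes into force
    for hit in hits:
        hit["score"] += weight * (hit.get("year", base_year) - base_year) / 10
    return sorted(hits, key=lambda h: h["score"], reverse=True)

# A 2021 amendment at similarity 0.88 now outranks a 1951 text at 0.90,
# matching the "Hierarchy of Truth" preference for the latest law.
ranked = temporal_boost([
    {"text": "old", "score": 0.90, "year": 1951},
    {"text": "new", "score": 0.88, "year": 2021},
])
```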


🛡️ Validation & Ingestion

The system includes dedicated scripts for high-fidelity data population:

  • graph_ingestion.py: Uses multi-pass LLM extraction to identify 9 relationship types (AMENDS, OVERRIDES, SUSPENDS, etc.) from legal summaries.
  • validate_ingestion.py: Verifies graph completeness by comparing node counts against expected legal provisions.
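The completeness check can be reduced to a per-label count comparison. This is a hedged sketch of the idea behind validate_ingestion.py; the function name, report format, and the expected figures in the usage example are all placeholders.

```python
def validate_counts(actual: dict, expected: dict) -> list:
    """Compare per-label node counts against the expected number of
    legal provisions, returning human-readable gaps (empty list means
    the graph is complete for the labels checked)."""
    gaps = []
    for label, want in expected.items():
        got = actual.get(label, 0)
        if got < want:
            gaps.append(f"{label}: have {got}, expected {want}")
    return gaps

# Example with placeholder figures, not the Constitution's real counts:
report = validate_counts({"Article": 400}, {"Article": 448})
```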

📜 Legal Disclaimer

Samvidha is an AI-powered intelligence tool designed for research assistance. While it maintains a high degree of precision through GraphRAG, it should be used to augment, not replace, professional legal consultation. Please note that AI-generated answers may be incorrect, partially correct, or missing critical nuances; always verify findings against official Gazette notifications.
