markelca/chat.ai

Terminal AI chat app with live streaming to a Next.js web UI
AI Chat - Terminal-based AI Chat Application

A simple terminal-based AI chat application with streaming responses and support for multiple providers, including Ollama and OpenRouter.

Project Structure

This is a monorepo managed with pnpm workspaces:

  • packages/cli - Terminal application
  • packages/web - Next.js web interface
  • packages/shared - Shared types and storage abstractions

Features

  • πŸš€ Simple REPL interface
  • πŸ”„ Streaming responses (display tokens as they arrive)
  • πŸ’¬ Conversation history within a session
  • πŸ“‹ Session Management - Create, resume, and navigate between multiple chat sessions
  • πŸ”Œ Support for multiple providers (Ollama and OpenRouter)
  • πŸ’Ύ Pluggable storage backends (in-memory or Redis)
  • πŸ–₯️ Pluggable output/view layer (stdout, Redis pub/sub, composite)
  • 🌐 Real-time web streaming - View conversations in a web browser via SSE
  • 🎯 Web UI with session navigation - Browse all sessions with sidebar navigation
  • βš™οΈ JSON-based configuration
  • 🐳 Docker support for Redis and web app
  • 🎨 Colored terminal output

Installation

Prerequisites

  • Node.js 18+ (for native fetch support)
  • pnpm package manager

Setup

  1. Clone the repository or download the source code

  2. Install dependencies for all packages:

pnpm install

  3. Build all packages:

pnpm build

Or build specific packages:

pnpm build:cli
pnpm build:web

Docker Setup

The project includes Docker support for running Redis and the web streaming application.

Starting Services with Docker

# Start all services (Redis + Web app)
docker compose up -d

# Start only Redis
docker compose up -d redis

# Check services are running
docker compose ps

# View logs
docker compose logs -f         # All services
docker compose logs -f redis   # Redis only
docker compose logs -f web     # Web app only

# Stop all services
docker compose down

The Redis container will:

  • Run on port 6379
  • Persist data in a Docker volume named redis-data
  • Use AOF (Append Only File) persistence for durability

The web container will:

  • Run on port 3000
  • Connect to Redis for message streaming
  • Provide SSE endpoint at /api/stream

Configuration

On first run, the application creates a default config file at one of these locations (checked in order):

  • ./config.json (local directory, checked first)
  • ~/.config/ai-chat/config.json (global config)
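
The lookup order above can be sketched as follows. This is a hypothetical helper for illustration, not the CLI's actual implementation; the injectable `exists` predicate is only there to make the logic easy to test:

```typescript
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Sketch of the documented config lookup order: the local ./config.json
// is checked first, then the global ~/.config/ai-chat/config.json.
function resolveConfigPath(
  exists: (p: string) => boolean = fs.existsSync
): string {
  const candidates = [
    path.resolve("./config.json"),
    path.join(os.homedir(), ".config", "ai-chat", "config.json"),
  ];
  // Return the first candidate that exists; fall back to the local path,
  // where a default config would be created on first run.
  return candidates.find(exists) ?? candidates[0];
}
```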

You can also copy config.example.json to config.json and modify it:

{
  "ollama": {
    "baseUrl": "http://localhost:11434",
    "model": "llama2"
  },
  "openrouter": {
    "baseUrl": "https://openrouter.ai/api/v1",
    "model": "anthropic/claude-3.5-sonnet",
    "apiKey": "sk-or-v1-YOUR_API_KEY_HERE"
  },
  "defaults": {
    "provider": "ollama"
  }
}

Configuration Options

Provider Settings

  • ollama.baseUrl: The URL of your Ollama instance (default: http://localhost:11434)
  • ollama.model: The Ollama model to use (e.g., llama2, mistral, codellama)
  • openrouter.baseUrl: The OpenRouter API URL (default: https://openrouter.ai/api/v1)
  • openrouter.model: The OpenRouter model to use (e.g., anthropic/claude-3.5-sonnet)
  • openrouter.apiKey: Your OpenRouter API key (required for OpenRouter)
  • defaults.provider: Default provider to use (ollama or openrouter)

Redis Storage Settings

  • redis.enabled: Enable Redis for persistent conversation history (default: false)
  • redis.host: Redis server hostname (default: localhost)
  • redis.port: Redis server port (default: 6379)
  • redis.password: Redis password (optional)
  • redis.username: Redis username for ACL (optional, Redis 6+)
  • redis.database: Redis database number 0-15 (default: 0)
  • redis.sessionName: Unique name for the conversation session (default: default-session)
    • If set to default-session and no REDIS_SESSION_NAME env var, a unique session ID is auto-generated
    • To use named sessions, either set a custom value in config or use REDIS_SESSION_NAME env var
  • redis.ttl: Time-to-live in seconds for conversation data (optional, no expiration if not set)
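
The session-name precedence described above can be sketched as follows. This is a hypothetical helper, not the actual implementation; in particular, the `session-` prefix for auto-generated IDs is an assumption for illustration:

```typescript
import { randomUUID } from "crypto";

// Sketch of the documented precedence for the Redis session name:
// REDIS_SESSION_NAME env var > a non-default config value >
// an auto-generated unique session ID.
function resolveSessionName(
  configValue: string | undefined,
  envValue: string | undefined
): string {
  if (envValue) return envValue;
  if (configValue && configValue !== "default-session") return configValue;
  // "default-session" (or no value) triggers auto-generation.
  return `session-${randomUUID()}`;
}
```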

Environment Variable Overrides

All Redis settings can be overridden using environment variables:

  • REDIS_ENABLED, REDIS_HOST, REDIS_PORT, REDIS_PASSWORD
  • REDIS_USERNAME, REDIS_DATABASE, REDIS_SESSION_NAME, REDIS_TTL

See .env.example for a complete list with descriptions.
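
The override behavior amounts to a simple merge where environment variables take precedence over the config file. A minimal sketch for a few of the settings (hypothetical helper, not the actual implementation):

```typescript
interface RedisConfig {
  enabled: boolean;
  host: string;
  port: number;
}

// Merge a parsed config with environment variables; any REDIS_* variable
// that is set overrides the corresponding config.json value.
function applyEnvOverrides(
  cfg: RedisConfig,
  env: Record<string, string | undefined>
): RedisConfig {
  return {
    enabled:
      env.REDIS_ENABLED !== undefined ? env.REDIS_ENABLED === "true" : cfg.enabled,
    host: env.REDIS_HOST ?? cfg.host,
    port: env.REDIS_PORT !== undefined ? Number(env.REDIS_PORT) : cfg.port,
  };
}
```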

Usage

Development Mode

Run the CLI in development mode with hot-reload:

pnpm dev:cli

Run the web app:

pnpm dev:web

Production Mode

After building, run the CLI:

cd packages/cli
pnpm start

Or link the CLI globally:

cd packages/cli
pnpm link --global
ai-chat

Command Line Options

# Use a specific provider
ai-chat --provider ollama
ai-chat --provider openrouter

# Use a specific model (overrides config)
ai-chat --provider ollama --model codellama
ai-chat --provider openrouter --model anthropic/claude-3-opus

# Resume or create a named session
ai-chat --session my-project
ai-chat --session frontend-work

# Combine options
ai-chat -p ollama -m llama2 -s research

# Short options
ai-chat -p ollama -m llama2 -s my-session

Session Management

The --session (or -s) flag allows you to create, resume, and manage multiple chat sessions. See the Session Management guide for detailed information.

REPL Commands

Once in the chat:

  • Type your message and press Enter to send
  • /quit or /exit - Exit the application
  • /clear - Clear conversation history
  • /help - Show available commands
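
A session might look like this (the prompt character and the model's replies are illustrative, not the app's exact output):

```
> Hello!
Hi there! How can I help you today?
> /clear
Conversation history cleared.
> /quit
```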

Examples

Using Ollama

Make sure Ollama is running locally:

ollama serve

Then start the chat:

ai-chat --provider ollama

Using OpenRouter

Make sure you have set your API key in the config file, then:

ai-chat --provider openrouter

Using Redis for Persistent Conversations

  1. Start Redis using Docker:

docker compose up -d redis

  2. Enable Redis in your config.json:

{
  "redis": {
    "enabled": true,
    "host": "localhost",
    "port": 6379,
    "sessionName": "my-conversation"
  }
}

  3. Run the CLI:

pnpm dev:cli

Your conversation history will now persist in Redis. Stop and restart the application to see your history restored.

Alternatively, use environment variables without modifying config:

REDIS_ENABLED=true REDIS_SESSION_NAME=my-session pnpm dev:cli

Web Streaming

Stream your CLI conversation to a web browser in real-time using Server-Sent Events (SSE).

For setup instructions and configuration details, see the Web Streaming guide.
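
As a sketch of what the browser side involves: an SSE response is a stream of line-oriented events where each "data:" line carries a payload. The parser below and the EventSource usage in the comment are illustrative assumptions; see the Web Streaming guide for the actual event format:

```typescript
// Minimal parser for the "data:" lines of a single SSE event frame.
function parseSSEData(frame: string): string[] {
  return frame
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).trim());
}

// In a browser, the endpoint could be consumed with the standard
// EventSource API, e.g.:
//   const es = new EventSource("http://localhost:3000/api/stream");
//   es.onmessage = (e) => console.log(e.data); // append tokens as they arrive
```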

License

ISC
