Run advanced AI models directly in your browser. Private, secure, and purely local.
FeelyAI is a powerful, privacy-focused AI chat application that runs entirely in your browser using WebLLM. No server costs, no data sent to external servers, and 100% client-side execution.
Try Live Demo
Experience FeelyAI in action at feelyai.com
- All AI processing happens directly in your browser using WebGPU
- No data sent to external servers
- No API keys required
- Complete privacy and security
Choose from several optimized language models:
- Llama 3.2 3B - Fast and efficient general-purpose model
- Hermes 3 (Llama 3.1 8B) - Advanced reasoning and tool use
- TinyLlama 1.1B - Ultra-lightweight for quick responses
- Qwen2.5-Coder-3B - Specialized for coding tasks
- Connect to MCP servers to extend AI capabilities
- Add custom tools and functions
- Support for custom headers and authentication
- Enable/disable servers on the fly
- Real-time tool discovery and schema introspection
FeelyAI includes powerful built-in tools:
- evalCode - Execute JavaScript code directly in the browser (with safe eval mode)
- listTools - Discover available tools with search and regex filtering
- getToolSchema - Inspect tool schemas before use
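As a rough sketch (not FeelyAI's actual implementation), the search/regex filtering that listTools performs might look like this; the `filterTools` name and `{ name }` tool shape are assumptions for illustration:

```javascript
// Hypothetical sketch of listTools-style filtering; the real tool's
// behavior may differ. `tools` is an array of objects with a `name`.
function filterTools(tools, pattern, useRegex = false) {
  if (!pattern) return tools; // no filter: return everything
  const matches = useRegex
    ? (name) => new RegExp(pattern, "i").test(name)
    : (name) => name.toLowerCase().includes(pattern.toLowerCase());
  return tools.filter((tool) => matches(tool.name));
}

const tools = [{ name: "evalCode" }, { name: "listTools" }, { name: "getToolSchema" }];
filterTools(tools, "tool");      // substring search
filterTools(tools, "^eval", true); // regex search
```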
- Schema Discovery Pattern - AI automatically discovers and learns tool schemas
- Validation & Error Handling - Automatic validation with helpful error messages
- Permission System - Granular control over tool execution
- Allow once
- Allow for session
- Always allow
- Safe Eval Mode - Run code in isolated Web Workers for security
- Projects & Chats - Organize conversations into projects
- Encrypted Projects - Password-protect sensitive projects with client-side AES-GCM encryption
- All chats, messages, and system prompts encrypted locally
- Auto-lock on page reload for security
- Passwords hashed with bcryptjs (never stored in plain text)
- Unique encryption keys per project using PBKDF2
- Persistent Storage - All chats saved locally using IndexedDB
- Auto-titling - Chats are automatically named from the first message
- Message History - Full conversation history with tool calls and outputs
- Markdown Support - Rich text rendering with syntax highlighting
- Per-Project System Prompts - Each project can have its own custom system instructions
- Edit system instructions to customize AI behavior
- Template variables for dynamic tool injection ({{listTools}}, {{tool_names}})
- Reset to defaults anytime
- Persistent across sessions
- Encrypted along with project data for password-protected projects
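A minimal sketch of how the {{...}} template substitution described above could work; this is illustrative, not FeelyAI's actual code, and `renderSystemPrompt` is a hypothetical name:

```javascript
// Hypothetical template expansion for system prompts. Unknown
// placeholders are left intact rather than erased.
function renderSystemPrompt(template, vars) {
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) =>
    name in vars ? vars[name] : match
  );
}

renderSystemPrompt(
  "You can call these tools: {{tool_names}}.",
  { tool_names: "evalCode, listTools, getToolSchema" }
);
```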
- Clean, dark-themed interface
- Real-time progress indicators during model loading
- Collapsible tool outputs for better readability
- Responsive design
- Smooth animations and transitions
- Switch between models on the fly
- Automatic model download and caching
- Progress tracking during model initialization
- Modern browser with WebGPU support (Chrome 113+, Edge 113+)
- Sufficient RAM (4GB+ recommended)
- GPU with WebGPU support
1. Clone the repository:

```bash
git clone https://github.com/hasmcp/feelyai.git
cd feelyai
```

2. Install dependencies:

```bash
npm install
```

3. Run the development server:

```bash
npm run dev
```

4. Open your browser and navigate to http://localhost:5173
Build for production:

```bash
npm run build
npm run preview
```

To connect an MCP server:

1. Click the "Add Server" button in the sidebar
2. Enter the MCP server URL (e.g., http://localhost:3000/mcp)
3. Optionally add a name and custom headers
4. Click "Add Server"
The AI will automatically discover available tools from connected servers.
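Under the hood, MCP tool discovery is a JSON-RPC 2.0 exchange: the client sends a `tools/list` request and later invokes tools with `tools/call`. FeelyAI delegates this to @modelcontextprotocol/sdk; the sketch below (with hypothetical helper names) only illustrates the wire format:

```javascript
// Shape of the JSON-RPC messages an MCP client exchanges with a server.
// Helper names are illustrative, not part of any SDK.
function buildListToolsRequest(id) {
  return { jsonrpc: "2.0", id, method: "tools/list", params: {} };
}

// Calling a discovered tool uses tools/call with the tool name and arguments.
function buildCallToolRequest(id, name, args) {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

JSON.stringify(buildListToolsRequest(1));
```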
When enabled (recommended), JavaScript code execution happens in an isolated Web Worker, preventing access to:
- DOM manipulation
- Local storage
- Cookies
- Parent window context
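FeelyAI's safe eval mode gets this isolation for free from Web Workers, which simply have no DOM, localStorage, or parent-window access. As a much-simplified illustration of the idea (not the app's mechanism), one can shadow dangerous globals so evaluated code cannot reach them:

```javascript
// Simplified illustration only: the real safe eval mode runs code in a
// separate Web Worker. Here we shadow a few browser globals to show
// how evaluated code can be cut off from them.
function safeEvalSketch(code) {
  const fn = new Function(
    "window", "document", "localStorage",
    `"use strict"; return (${code});`
  );
  return fn(undefined, undefined, undefined); // globals shadowed with undefined
}

safeEvalSketch("1 + 2");           // plain expressions still work
safeEvalSketch("typeof document"); // the DOM is out of reach
```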
Control which tools the AI can execute:
- Allow Once - Single execution approval
- Allow for Session - Approve for current session
- Always Allow - Trust tool permanently
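The three permission levels above can be modeled roughly as follows; this is a hypothetical sketch, and the names (`resolvePermission`, the grant sets) are not FeelyAI's actual identifiers:

```javascript
// Hypothetical model of the three permission levels.
const sessionGrants = new Set();   // cleared when the page reloads
const permanentGrants = new Set(); // persisted across sessions

function resolvePermission(toolName, userChoice) {
  // Previously granted tools skip the prompt entirely.
  if (permanentGrants.has(toolName) || sessionGrants.has(toolName)) return true;
  switch (userChoice) {
    case "always":  permanentGrants.add(toolName); return true;
    case "session": sessionGrants.add(toolName);   return true;
    case "once":    return true;  // approve this single call only
    default:        return false; // denied
  }
}

resolvePermission("evalCode", "session"); // granted and remembered
resolvePermission("evalCode", "deny");    // still allowed: session grant applies
```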
- Frontend Framework: Vue 3 (Composition API)
- Build Tool: Vite
- AI Engine: @mlc-ai/web-llm
- MCP Client: @modelcontextprotocol/sdk
- Styling: TailwindCSS 4
- Icons: Lucide Vue
- Markdown: markdown-it
- Storage: IndexedDB (via idb)
- Validation: Ajv (JSON Schema)
- Encryption: bcryptjs + Web Crypto API (AES-GCM)
```
src/
├── components/
│   ├── AddServerModal.vue     # MCP server configuration
│   ├── ProjectSidebar.vue     # Project/chat navigation
│   └── McpIcon.vue            # MCP branding
├── composables/
│   └── useChat.js             # Main chat logic & state
├── services/
│   ├── McpClient.js           # MCP protocol client
│   ├── MessageStore.js        # IndexedDB persistence
│   └── EncryptionService.js   # Client-side encryption
├── workers/
│   ├── llm.worker.js          # WebLLM worker
│   └── eval.worker.js         # Safe code execution
└── App.vue                    # Main application
```
Contributions are welcome! Please feel free to submit a Pull Request.
HasMCP - No-code/low-code MCP server framework with built-in auth, real-time logs, and telemetry
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
- WebLLM - Browser-based LLM inference
- Model Context Protocol - Tool integration standard
- MLC AI - Machine Learning Compilation
Usage of LLM models is subject to their respective license agreements. Please review the license terms for each model you use. FeelyAI is provided as-is, and AI-generated content may contain errors or inaccuracies.
Powered by WebLLM • No Server Costs • 100% Client-Side