A native macOS interface for your local Ollama instance: run AI models on your own hardware.
OwnAI brings a clean, native macOS interface to Ollama instances running on your machine or local network:
- 🔌 Local Connection Management: Connect to any Ollama instance (local or on your network)
- 🤖 Model Selection: Choose from your installed Ollama models
- 💬 Chat Interface: Natural conversation UI with streaming responses
- 📊 Stats Monitoring: Track generation speed and token usage in verbose mode
- 📝 Formatting Support: Code snippets, terminal output, and more
- 💾 Session Management: Save and load conversations
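The stats shown in verbose mode correspond to fields Ollama reports in the final chunk of a streaming response: `eval_count` (generated tokens) and `eval_duration` (in nanoseconds). A rough sketch of the tokens-per-second arithmetic, using illustrative values (this is not OwnAI's actual code):

```shell
# Ollama's final streaming chunk includes eval_count (generated tokens)
# and eval_duration (nanoseconds). Sample values for illustration:
eval_count=120
eval_duration=4000000000   # 4 seconds, in nanoseconds

# tokens/sec = eval_count / (eval_duration / 1e9)
tokens_per_sec=$(( eval_count * 1000000000 / eval_duration ))
echo "${tokens_per_sec} tokens/sec"   # prints "30 tokens/sec"
```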
## Requirements

- macOS 13.0 or later
- Ollama installed and running on your Mac or accessible on your network
- Xcode 15+ (for building from source)
## Installation

- Download the latest release from the Releases page
- Move OwnAI to your Applications folder
- Launch and enjoy!
## Building from Source

- Clone this repository:

      git clone https://github.com/arvindjuneja/OwnAI.git
      cd OwnAI

- Open the project in Xcode:

      open ownai/ownai.xcodeproj

- Build the project (⌘+B) and run (⌘+R)
## Usage

- Start Ollama: Ensure your Ollama server is running (default: http://localhost:11434)
- Launch OwnAI: Open the app
- Configure Connection: Click the settings icon and enter your Ollama server details
- Select a Model: Choose from your available models
- Start Chatting: Begin your conversation with the selected model
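Before configuring the connection, you can confirm that an Ollama server is answering. A minimal check against Ollama's `/api/tags` route (which lists installed models), assuming the default endpoint; adjust the host if your server runs elsewhere on the network:

```shell
# Query /api/tags on the default Ollama endpoint; a successful
# response means the server is up and lists the installed models.
endpoint="http://localhost:11434"
if curl -sf "${endpoint}/api/tags" > /dev/null 2>&1; then
  status="reachable"
else
  status="not reachable"
fi
echo "Ollama at ${endpoint}: ${status}"
```

If the server is not reachable, start Ollama and retry before entering the connection details in OwnAI.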
## Roadmap

OwnAI is under active development. Upcoming features include:
- Onboarding splash screen to clarify app dependencies (Ollama server)
- Local network scanning for Ollama instances
- System prompt customization
- Model management (pull/remove models directly from UI)
- Conversation search and organization
- Advanced formatting options for responses
- Global keyboard shortcuts
View our complete development roadmap for more details on planned features and progress.
## Contributing

Contributions are welcome! Feel free to:
- Report bugs
- Suggest features
- Submit pull requests
Please check our contribution guidelines before getting started.
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- Ollama for making local LLMs accessible
- All contributors and supporters of the project
Made with ❤️ by Arvind Juneja

