🧠 OwnAI

Platform: macOS | Swift 5.9 | SwiftUI 3.0 | License: MIT

A native macOS interface for your local Ollama instance: run AI models on your own hardware through a clean, polished app.

[Screenshot: OwnAI main interface]

[Screenshot: OwnAI model selection]

✨ Features

OwnAI brings a clean, native macOS interface to Ollama instances running on your machine or local network:

  • 🔌 Local Connection Management: Connect to any Ollama instance (local or on your network)
  • 🤖 Model Selection: Choose from your installed Ollama models
  • 💬 Chat Interface: Natural conversation UI with streaming responses (see the sketch after this list)
  • 📊 Stats Monitoring: Track generation speed and token usage in verbose mode
  • 📝 Formatting Support: Code snippets, terminal output, and more
  • 💾 Session Management: Save and load conversations
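
Under the hood, streaming relies on Ollama's standard HTTP API, which returns one JSON chunk per line as the model generates text. The following is a minimal, illustrative Swift sketch of that flow, not OwnAI's actual source: the `/api/generate` endpoint, the `response`/`done` fields, and the default `http://localhost:11434` address come from Ollama's documented API, while the model name is just a placeholder.

```swift
import Foundation

/// Request body for Ollama's /api/generate endpoint.
struct GenerateRequest: Encodable {
    let model: String
    let prompt: String
    let stream: Bool
}

/// One line of Ollama's newline-delimited streaming output.
struct GenerateChunk: Decodable {
    let response: String   // the next fragment of the reply
    let done: Bool         // true on the final chunk
}

/// Streams a completion from a local Ollama server and prints tokens as they arrive.
func streamCompletion(prompt: String,
                      model: String = "llama3",   // placeholder model name
                      host: URL = URL(string: "http://localhost:11434")!) async throws {
    var request = URLRequest(url: host.appendingPathComponent("api/generate"))
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        GenerateRequest(model: model, prompt: prompt, stream: true))

    let (bytes, _) = try await URLSession.shared.bytes(for: request)
    for try await line in bytes.lines {            // one JSON object per line
        let chunk = try JSONDecoder().decode(GenerateChunk.self, from: Data(line.utf8))
        print(chunk.response, terminator: "")      // a real app would append this to the chat view
        if chunk.done { break }
    }
}
```

Because each chunk carries only a small fragment of the reply, a client can render the response as it is generated and report timing and token statistics once the final `done` chunk arrives.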

🚀 Getting Started

Prerequisites

  • macOS 13.0 or later
  • Ollama installed and running on your Mac or accessible on your network
  • Xcode 15+ (for building from source)

Installation

Option 1: Download Release

  1. Download the latest release from the Releases page
  2. Move OwnAI to your Applications folder
  3. Launch and enjoy!

Option 2: Build from Source

  1. Clone this repository
    git clone https://github.com/arvindjuneja/OwnAI.git
    cd OwnAI
  2. Open the project in Xcode
    open ownai/ownai.xcodeproj
  3. Build the project (⌘+B) and run (⌘+R)

🔧 Usage

  1. Start Ollama: Ensure your Ollama server is running (default: http://localhost:11434)
  2. Launch OwnAI: Open the app
  3. Configure Connection: Click the settings icon and enter your Ollama server details
  4. Select a Model: Choose from your available models (see the model-listing sketch below)
  5. Start Chatting: Begin your conversation with the selected model
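
Steps 3 and 4 map onto a single call in Ollama's API: `GET /api/tags` returns the models installed on the server, which is how a client can both verify the connection and populate a model picker. A minimal Swift sketch, again illustrative rather than OwnAI's actual code, assuming the default server address:

```swift
import Foundation

/// Response shape of GET /api/tags: the models installed on the Ollama server.
struct TagsResponse: Decodable {
    struct Model: Decodable { let name: String }
    let models: [Model]
}

/// Returns the names of the available models, e.g. ["llama3:latest", "mistral:latest"].
func listModels(host: URL = URL(string: "http://localhost:11434")!) async throws -> [String] {
    let url = host.appendingPathComponent("api/tags")
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(TagsResponse.self, from: data).models.map(\.name)
}
```

If this call fails, the server is unreachable or not running, which is the first thing to check when the connection settings don't work.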

🛣️ Roadmap

OwnAI is under active development. Upcoming features include:

  • Onboarding splash screen to clarify app dependencies (Ollama server)
  • Local network scanning for Ollama instances
  • System prompt customization
  • Model management (pull/remove models directly from UI)
  • Conversation search and organization
  • Advanced formatting options for responses
  • Global keyboard shortcuts

View our complete development roadmap for more details on planned features and progress.

🤝 Contributing

Contributions are welcome! Feel free to:

  • Report bugs
  • Suggest features
  • Submit pull requests

Please check our contribution guidelines before getting started.

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Ollama for making local LLMs accessible
  • All contributors and supporters of the project

Made with ❤️ by Arvind Juneja
