
AIMU - AI Model Utilities

A Python package containing easy to use tools for working with various language models and AI services. AIMU is specifically designed for running models locally, using Ollama or Hugging Face Transformers. However, it can also be used with cloud models (OpenAI, Anthropic, Google, etc.) with aisuite support (in development).

Features

  • Model Clients: Support for multiple AI model providers, including Ollama, Hugging Face Transformers, and aisuite (OpenAI, Anthropic, Google, etc.).

  • MCP Tools: Model Context Protocol (MCP) client for enhancing AI capabilities. Provides a simple(r) interface for FastMCP 2.0.

  • Chat Conversation Storage/Management: Chat conversation history management using TinyDB.

  • Prompt Storage/Management: Prompt catalog for storing and versioning prompts using SQLAlchemy.

Components

In addition to the AIMU package in the 'aimu' directory, the AIMU code repository includes:

  • Jupyter notebooks demonstrating key AIMU features.

  • An example chat client, built with Streamlit, using the AIMU model client, MCP tools support, and chat conversation management.

  • A full suite of Pytest tests.

Installation

AIMU can be installed with Ollama support, Hugging Face Transformers support, and/or aisuite (cloud models) support.

For all features, run:

pip install aimu[all]

Alternatively, for Ollama-only support:

pip install aimu[ollama]

For Hugging Face Transformers model support:

pip install aimu[hf]

For aisuite models (e.g. OpenAI):

pip install aimu[aisuite]

For accessing gated models via Hugging Face, you'll need a Hugging Face Hub access token. Once you have a token, you can store it locally with:

hf auth login
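Alternatively (this is standard Hugging Face Hub behavior, not AIMU-specific), the token can be supplied via the HF_TOKEN environment variable, which the Hugging Face libraries read automatically:

```shell
# Make the token available to the current shell session
export HF_TOKEN=<your access token>
```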

Development

Once you've cloned the repository, run the following command to install all model dependencies:

pip install -e '.[all]'

Additionally, run the following command to install development (testing, linting) and notebook dependencies:

pip install -e '.[dev,notebooks]'

Alternatively, if you have uv installed, you can get all model and development dependencies with:

uv sync --all-extras

Using Pytest, tests can be run for a specific model client and/or model, using optional arguments:

pytest tests/test_models.py --client=ollama --model=GPT_OSS_20B

Usage

Text Generation

from aimu.models import OllamaClient as ModelClient  # or HuggingFaceClient, or AisuiteClient

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
response = model_client.generate("What is the capital of France?", {"temperature": 0.7})

Chat

from aimu.models import OllamaClient as ModelClient

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
response = model_client.chat("What is the capital of France?")

Chat UI (Streamlit)

cd streamlit
streamlit run chatbot_example.py

MCP Tool Usage

from aimu.tools import MCPClient

mcp_client = MCPClient({
    "mcpServers": {
        "mytools": {"command": "python", "args": ["tools.py"]},
    }
})

mcp_client.call_tool("mytool", {"input": "hello world!"})
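The config above points the client at a local tools.py MCP server. The repository's actual tools.py isn't shown here, but as an illustrative sketch, a minimal FastMCP 2.0 server exposing a tool named "mytool" (the tool's name and behavior are assumptions for this example) could look like:

```python
# tools.py -- minimal FastMCP 2.0 server sketch (illustrative; the tool's
# name and behavior are assumptions, not taken from the AIMU repository)
from fastmcp import FastMCP

mcp = FastMCP("mytools")


@mcp.tool
def mytool(input: str) -> str:
    """Echo the input back in upper case."""
    return input.upper()


if __name__ == "__main__":
    mcp.run()  # serves over stdio, matching the "command"/"args" client config above
```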

MCP Tool Usage with ModelClient

from aimu.models import OllamaClient as ModelClient
from aimu.tools import MCPClient

mcp_client = MCPClient({
    "mcpServers": {
        "mytools": {"command": "python", "args": ["tools.py"]},
    }
})

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
model_client.mcp_client = mcp_client

model_client.chat("use my tool please")

Chat Conversation Storage/Management

from aimu.models import OllamaClient as ModelClient
from aimu.memory import ConversationManager

chat_manager = ConversationManager("conversations.json", use_last_conversation=True)  # loads the last saved conversation

model_client = ModelClient(ModelClient.MODEL_LLAMA_3_1_8B)
model_client.messages = chat_manager.messages

model_client.chat("What is the capital of France?")

chat_manager.update_conversation(model_client.messages)  # store the updated conversation

Prompt Storage/Management

from aimu.prompts import PromptCatalog, Prompt

prompt_catalog = PromptCatalog("prompts.db")

prompt = Prompt("You are a helpful assistant", model_id="llama3.1:8b", version=1)
prompt_catalog.store_prompt(prompt)

License

This project is licensed under the Apache 2.0 license.
