LangTools

LangTools is a Swift framework for working with Large Language Models (LLMs) and AI services. It provides a unified interface for multiple AI providers and includes tools for building agents and specialized AI assistants.

Features

  • 🤖 Multiple LLM Support: OpenAI, Anthropic, X.AI, Google Gemini, and Ollama
  • 🔧 Unified Interface: Common protocols for working with different AI providers
  • 🤝 Extensible Agent System: Build specialized AI assistants with tools and delegation
  • 📝 Streaming Support: Handle streaming responses from AI models
  • 🛠️ Tool Integration: Add custom capabilities to your AI interactions
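To illustrate the streaming feature, here is a hedged sketch of consuming a streamed response. The `stream:` request parameter, the `stream(request:)` method, and the shape of the event type are assumptions for illustration, not the confirmed LangTools API; see the module READMEs for the real interface.

```swift
import Anthropic

// Hypothetical sketch: parameter and method names below are assumptions.
let anthropic = Anthropic(apiKey: "your-key")

let request = Anthropic.MessageRequest(
    model: .claude35Sonnet_latest,
    messages: [anthropic.userMessage("Tell me a short story.")],
    stream: true // assumed flag enabling a streamed response
)

// Assumed: streaming yields an AsyncSequence of partial events.
for try await event in anthropic.stream(request: request) {
    if let text = event.delta?.text {
        print(text, terminator: "") // print tokens as they arrive
    }
}
```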

Installation

Swift Package Manager

Add LangTools to your Package.swift:

dependencies: [
    .package(url: "https://github.com/rchatham/langtools.swift.git", from: "0.2.0")
]

Then add the products you use (for example LangTools, OpenAI, or Anthropic) to your target's dependencies.

Basic Usage

// Initialize a language model service
let anthropic = Anthropic(apiKey: "your-key")

// Create a simple chat request
let messages = [
    anthropic.systemMessage("You are a helpful assistant."),
    anthropic.userMessage("Hello!")
]

// Get a response
let response = try await anthropic.perform(request: 
    Anthropic.MessageRequest(
        model: .claude35Sonnet_latest,
        messages: messages
    )
)

print(response.message?.content.text ?? "No response")

Quick Start with LangToolchain

Use LangToolchain to manage multiple AI providers with automatic request routing:

import LangTools
import OpenAI
import Anthropic

// Create and configure the toolchain
var langToolchain = LangToolchain()
langToolchain.register(OpenAI(apiKey: "your-openai-key"))
langToolchain.register(Anthropic(apiKey: "your-anthropic-key"))

// Requests are automatically routed to the correct provider
let openAIRequest = OpenAI.ChatCompletionRequest(
    model: .gpt4,
    messages: [OpenAI.Message(role: .user, content: "Hello from OpenAI!")]
)
let openAIResponse = try await langToolchain.perform(request: openAIRequest)

let anthropicRequest = Anthropic.MessageRequest(
    model: .claude35Sonnet_latest,
    messages: [Anthropic.Message(role: .user, content: "Hello from Anthropic!")]
)
let anthropicResponse = try await langToolchain.perform(request: anthropicRequest)

Quick Start with Agents

struct SimpleAgent: Agent {
    let langTool: Anthropic
    let model: Anthropic.Model
    
    let name = "simpleAgent"
    let description = "A simple agent that responds to user queries"
    let instructions = "You are a simple agent that responds to user queries."
    
    var delegateAgents: [any Agent] = []
    var tools: [any LangToolsTool]? = nil
}

// Use your agent
let agent = SimpleAgent(
    langTool: Anthropic(apiKey: "your-key"), 
    model: .claude35Sonnet_latest
)
let context = AgentContext(messages: [
    LangToolsMessageImpl(role: .user, string: "Hello!")
])
let response = try await agent.execute(context: context)
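Agents can also be given tools. The sketch below is a hypothetical example of wiring a tool into an agent: the `Tool` initializer, its schema type, and the callback signature are assumptions based only on the `tools: [any LangToolsTool]?` property shown above, so check the module documentation for the actual types.

```swift
// Hypothetical sketch: the Tool initializer and schema types below are
// assumptions, not the confirmed LangTools API.
let weatherTool = Tool(
    name: "get_weather",
    description: "Returns the current weather for a city",
    tool_schema: .init(
        properties: ["city": .init(type: "string", description: "City name")],
        required: ["city"]
    ),
    callback: { args in
        guard let city = args["city"] as? String else { return nil }
        return "Sunny in \(city)" // stub; replace with a real weather lookup
    }
)

struct WeatherAgent: Agent {
    let langTool: Anthropic
    let model: Anthropic.Model

    let name = "weatherAgent"
    let description = "Answers weather questions using a tool"
    let instructions = "Use the get_weather tool to answer weather questions."

    var delegateAgents: [any Agent] = []
    var tools: [any LangToolsTool]? = [weatherTool]
}
```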

Documentation

See the individual module README files for detailed documentation and examples.

Contributing

Contributions are welcome! See our Contributing Guide for details.

License

This project is licensed under the MIT License - see the LICENSE file for details.
