A Go framework and application for building AI agents with LLM integration, tool execution, and multi-agent orchestration.
Two ways to use agent-forge:
- Go Library - Import `github.com/thinktwiceco/agent-forge` into your Go projects to build custom AI agents
- localforge Application - Pre-built server with web UI for local agent orchestration
Library:
- Simple agent creation with fluent API
- Extensible tool system with custom tool support
- Multi-agent teams with delegation pattern
- Real-time streaming responses
- Multiple LLM providers (OpenAI, DeepSeek, TogetherAI)
- Built-in tools (filesystem, git, web, postgres, vector DB)
- Plugin system for extending functionality
- Conversation persistence and history management
Application:
- HTTP/SSE API server
- Web-based chat interface
- Pre-configured agent workflows
- Multi-agent orchestration
- Conversation management
- File uploads and knowledge integration
- Go 1.21 or later
- golangci-lint for linting (install: `brew install golangci-lint` or see docs)
git clone https://github.com/thinktwiceco/agent-forge
cd agent-forge
go mod download
./scripts/test.sh --unit
./scripts/lint.sh

CI runs on all PRs and pushes to main. See .github/workflows/ci.yml for details.
Releases follow semantic versioning. See docs/RELEASE_PIPELINE.md for the complete release process.
go get github.com/thinktwiceco/agent-forge

src/
├── agents/     # Agent creation and execution
├── builder/    # Config-driven agent builder
├── core/       # Core interfaces (Tool, Plugin, SubAgent)
├── llms/       # LLM provider integrations
├── tools/      # Built-in tools (fs, git, web, postgres, api, vector)
├── plugins/    # Plugin system and built-in plugins
├── history/    # Conversation history management
└── telemetry/  # Observability (tool exec, tokens, truncation)
See docs/FILE_STRUCTURE.md for details.
Three approaches:
| Method | Description |
|---|---|
| `agents.NewAgent(config)` | Direct instantiation with `AgentConfig` |
| `agents.NewBuilder(llm, name)` | Fluent API builder |
| `builder.NewAgentBuilderFromConfig(path)` | YAML config-driven |
Example (fluent API):
agent, err := agents.NewBuilder(llm, "my-agent").
WithSystemPrompt("You are a helpful assistant.").
WithTools(tool1, tool2).
WithPersistence("json").
Build()

Example (YAML config):
import "github.com/thinktwiceco/agent-forge/src/builder"
agentBuilder, err := builder.NewAgentBuilderFromConfig("config.yaml")
agent, err := agentBuilder.Build()

See docs/AGENT_BUILDER.md for comprehensive documentation.
| Tool | Factory Function | Description |
|---|---|---|
| File System | `fs.NewFsTool(root)` | File operations within root directory |
| Git | `git.NewGitTool(repoRoot)` | Git operations (add, commit, push, etc.) |
| Web | `web.NewWebTool(workingDir)` | Web automation with browser sessions; headless by default |
| Postgres | `postgres.NewPostgresTool(url, mode, tables, schemas)` | Database operations with whitelisting |
| API | `api.NewApiTool(name, endpoints, authHook)` | HTTP API calls with auth support |
| Vector | `vector.NewVectorTool(db, embeddings)` | Semantic search and indexing |
| Subagent | Constructor | Description |
|---|---|---|
| Reasoning | `agents.ReasoningAgent(llm)` | Analyzes questions, finds ambiguities |
| OS | `agents.OsAgent(llm, root)` | File system and OS tasks |
| Git | `agents.GitAgent(llm, workingDir)` | Git repository operations |
| Coding | `agents.CodingAgent(llm, root)` | Code generation and analysis |
| Web | `agents.WebAgent(llm, workingDir)` | Web navigation and automation |
| Vision | `agents.VisionAgent(llm, workingDir)` | Loads images and answers visual questions (requires vision-capable model, e.g. gpt-4o) |
Add subagents with:
agent.AddSystemAgent(agents.ReasoningAgent(llm))
agent.AddSystemAgent(agents.WebAgent(llm, workingDir))

import "github.com/thinktwiceco/agent-forge/src/core"
tool := &core.Tool{
Name: "calculate",
Description: "Performs calculations",
Parameters: []core.Parameter{
{Name: "expression", Type: "string", Required: true},
},
Handler: func(ctx map[string]any, args map[string]any) llms.ToolReturn {
result := evaluate(args["expression"].(string))
return core.NewSuccessResponse(result)
},
}
agent.AddTools([]llms.Tool{tool})

Plugins extend agents with tools, hooks, and system prompts. Available plugins: logger, todo, vault, procedures, knowledge.
agent:
plugins:
- "logger"
- "todo"

See src/plugins/README.md for creating custom plugins.
All agent responses stream in real-time:
responseCh := agent.ChatStream("What is the capital of France?", "")
for chunk := range responseCh.Start() {
switch chunk.Type {
case llms.TypeContent:
fmt.Print(chunk.Content)
case llms.TypeToolExecuting:
fmt.Printf("Executing: %s\n", chunk.ToolExecuting.Name)
case llms.TypeToolResult:
fmt.Printf("Result: %s\n", chunk.ToolResults[0].Result)
}
}

Three builder patterns for different use cases:
| Builder | Use Case | Example |
|---|---|---|
| `agents.NewBuilder()` | Fluent API in code | `agents.NewBuilder(llm, "name").WithTools(...).Build()` |
| `builder.NewAgentBuilder()` | Programmatic config | `b := builder.NewAgentBuilder("name", "json"); b.SetModel(...); b.Build()` |
| `builder.NewAgentBuilderFromConfig()` | YAML-driven config | `builder.NewAgentBuilderFromConfig("config.yaml")` |
See docs/AGENT_BUILDER.md for full builder documentation.
package main
import (
"context"
"fmt"
"github.com/thinktwiceco/agent-forge/src/agents"
"github.com/thinktwiceco/agent-forge/src/llms"
)
func main() {
ctx := context.Background()
// Create LLM engine
multiModelLLM, _ := llms.NewOpenAILLMBuilder("togetherai").
SetModel(llms.TOGETHERAI_Llama3170BInstructTurbo).
SetCtx(ctx).
Build()
llm := multiModelLLM.MainModel()
// Create agent
agent := agents.NewAgent(&agents.AgentConfig{
LLMEngine: llm,
AgentName: "Assistant",
SystemPrompt: "You are a helpful AI assistant.",
MainAgent: true,
})
// Chat with streaming
responseCh := agent.ChatStream("Hello! How can you help me?", "")
for chunk := range responseCh.Start() {
if chunk.Content != "" {
fmt.Print(chunk.Content)
}
}
}

package main
import (
"context"
"fmt"
"github.com/thinktwiceco/agent-forge/src/agents"
"github.com/thinktwiceco/agent-forge/src/core"
"github.com/thinktwiceco/agent-forge/src/llms"
"github.com/thinktwiceco/agent-forge/src/tools/fs"
)
func main() {
ctx := context.Background()
// Initialize LLM
multiModelLLM, _ := llms.NewOpenAILLMBuilder("togetherai").
SetModel(llms.TOGETHERAI_Llama3170BInstructTurbo).
SetCtx(ctx).
Build()
llm := multiModelLLM.MainModel()
// Create custom tool
calcTool := &core.Tool{
Name: "calculate",
Description: "Performs calculations",
Parameters: []core.Parameter{
{Name: "expression", Type: "string", Required: true},
},
Handler: func(ctx map[string]any, args map[string]any) llms.ToolReturn {
return core.NewSuccessResponse("Result: 42")
},
}
// Create agent with tools and subagents
agent, _ := agents.NewBuilder(llm, "MathAssistant").
WithSystemPrompt("You are a helpful math assistant.").
WithTools(calcTool, fs.NewFsTool("/tmp")).
WithPersistence("json").
AsMainAgent().
Build()
agent.AddSystemAgent(agents.ReasoningAgent(llm))
// Chat with streaming
responseCh := agent.ChatStream("What is 15 multiplied by 23?", "")
for chunk := range responseCh.Start() {
if chunk.Content != "" {
fmt.Print(chunk.Content)
}
}
}

localforge is a standalone server application that provides:
- HTTP/SSE API server for agent interactions
- Web-based chat interface
- Pre-configured agent workflows
- Conversation persistence and history
- Multi-agent orchestration
- File uploads and knowledge integration
- Chat interface - Real-time streaming, conversation history, active tasks
- Settings - Agent identity, sub-agents, plugins, and API keys
- Knowledge graph - Node types, filters, and visualization
curl -fsSL https://raw.githubusercontent.com/thinktwiceco/agent-forge/main/scripts/install-release.sh | bash -s -- ./my-agent
cd my-agent
# Edit config.yaml (model, system_prompt, tools, subagents)
# Add API keys to .env
./start.sh

This creates:
- bin/localforge - The executable
- config.yaml - Agent configuration
- .env - API keys and secrets
- data/ - Conversation history
- procedures/ - Multi-phase workflows
- start.sh - Convenience launcher
git clone https://github.com/thinktwiceco/agent-forge
cd agent-forge/cmd/localforge
go build -o localforge src/*.go
./localforge -config config.yaml -port 8080

Create a config.yaml file:
agent:
name: "My Agent"
model: "togetherai::moonshotai/Kimi-K2.5"
system_prompt: |
You are a helpful assistant.
working_dir: "${AGENT_WORKING_DIR}"
persistence: "json"
tools:
- name: fs
- name: web
subagents:
reasoning: "deepseek::deepseek-reasoner"
plugins:
- "todo"
- "vault"

Create a .env file:
AF_OPENAI_API_KEY=your-key
AF_DEEPSEEK_API_KEY=your-key
AF_TOGETHERAI_API_KEY=your-key
AGENT_WORKING_DIR=/path/to/working/dir

See src/builder/README.md for full configuration options.
Navigate to http://localhost:8080 after starting the server.
Features:
- Real-time chat with streaming responses
- Conversation history and management
- Settings page (/settings) - Agent config, providers, tools
- Knowledge page (/knowledge) - Knowledge graph visualization
- File uploads
- Todo list management
Chat:
# Start new conversation
POST /api/chat
{"message": "Hello!"}
# Resume conversation
POST /api/chat?conversationId={uuid}
{"message": "Continue..."}
POST /api/chat/stop # Stop in-progress chat
GET /api/chat/push              # SSE push for real-time updates

Conversations:
GET /api/conversations # List conversations
GET /api/conversations/:id # Get conversation
DELETE /api/conversations/:id # Delete conversation
PUT /api/conversations/:id/title  # Rename conversation

Configuration:
GET /api/config # Get agent configuration
PUT /api/config # Update agent config
PUT /api/config/tools/:name # Update tool config
PUT /api/config/plugins # Update plugins list
PUT /api/config/subagents # Update subagents
GET /api/config/providers # Get push providers (Instagram, Telegram)
PUT /api/config/providers # Update provider settings
POST /api/agent/reload          # Reload agent from config

Other:
GET /api/todos                                  # Get todos
POST /api/upload                                # Upload files
GET /api/fs/list, GET /api/fs/read              # FS visualization
GET /api/knowledge/graph, GET /api/knowledge/stats, GET /api/knowledge/node/:id  # Knowledge graph
POST /api/webhooks/:provider                    # Webhook receiver (Instagram, Telegram)
POST /api/webhooks/:provider/sync               # Webhook sync
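As a sketch, the chat endpoint above can be called from any HTTP client. Below is a minimal Go example using only the standard library; the URL, port, and JSON payload shape are taken from the endpoint listing above, while the simple status-line handling is an assumption (the server streams via SSE, so a full client would also read `/api/chat/push`):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// newChatRequest builds a POST /api/chat request. Pass conversationID ""
// to start a new conversation, or an existing UUID to resume one.
func newChatRequest(baseURL, conversationID, message string) (*http.Request, error) {
	body, err := json.Marshal(map[string]string{"message": message})
	if err != nil {
		return nil, err
	}
	url := baseURL + "/api/chat"
	if conversationID != "" {
		url += "?conversationId=" + conversationID
	}
	req, err := http.NewRequest(http.MethodPost, url, bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	req, err := newChatRequest("http://localhost:8080", "", "Hello!")
	if err != nil {
		fmt.Println("build request:", err)
		return
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println("send request:", err) // e.g. server not running
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```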
working_dir/
├── data/
│   └── conversations/  # Conversation history (JSON)
├── repos/              # Git tool cloned repos
├── web/                # Web tool saved content
├── vault/              # Vault plugin encrypted secrets
└── procedures/         # Procedures plugin manifests
| Field | Type | Required | Description | Default |
|---|---|---|---|---|
| `name` | string | Yes | Agent name | - |
| `model` | string | Yes | LLM model (`provider::model`) | - |
| `system_prompt` | string | No | System prompt | - |
| `working_dir` | string | No | Working directory for tools/plugins | - |
| `persistence` | string | No | Conversation persistence | `""` (none) |
| `tools` | array | No | List of tools to enable | `[]` |
| `subagents` | map | No | Map of subagent name to model | `{}` |
| `plugins` | array | No | List of plugin identifiers | `[]` |
Model Format: provider::model-name
Supported Providers and Models:
| Provider | Identifier | Environment Variable | Models |
|---|---|---|---|
| OpenAI | `openai` | `AF_OPENAI_API_KEY` | gpt-5, gpt-5.1, gpt-5.2 |
| DeepSeek | `deepseek` | `AF_DEEPSEEK_API_KEY` | deepseek-chat, deepseek-reasoner |
| TogetherAI | `togetherai` | `AF_TOGETHERAI_API_KEY` | meta-llama/Llama-3.2-3B-Instruct-Turbo, meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo, Qwen/Qwen2.5-7B-Instruct-Turbo, Qwen/Qwen3-Coder-480B-A35B-Instruct-FP8, openai/gpt-oss-120b, zai-org/GLM-4.7, moonshotai/Kimi-K2.5 |
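The `provider::model-name` format above is straightforward to split in code. A hypothetical helper (illustrative only, not part of the agent-forge API) might look like:

```go
package main

import (
	"fmt"
	"strings"
)

// splitModelSpec separates a "provider::model-name" spec into its parts.
// Hypothetical helper shown for illustration; not an agent-forge function.
func splitModelSpec(spec string) (provider, model string, err error) {
	parts := strings.SplitN(spec, "::", 2)
	if len(parts) != 2 || parts[0] == "" || parts[1] == "" {
		return "", "", fmt.Errorf("invalid model spec %q, want provider::model", spec)
	}
	return parts[0], parts[1], nil
}

func main() {
	p, m, err := splitModelSpec("togetherai::moonshotai/Kimi-K2.5")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(p, m) // togetherai moonshotai/Kimi-K2.5
}
```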
| Tool | Identifier | Required Parameters | Optional Parameters |
|---|---|---|---|
| File System | `fs` | `root` (string) | - |
| Git | `git` | `root` (string) | - |
| Web | `web` | `root` (string) | - |
| Postgres | `postgres` | `postgresURL` (string), `mode` (string), `allowedTables` (array) | `allowedSchemas` (array) |
| API | `api` | `endpoints` (array) | `onApiCallHook` (string) |
| Vector | `vector` | (requires `vector-storage` section) | - |
Example:
tools:
- name: fs
root: "/path/to/sandbox"
- name: postgres
postgresURL: "postgresql://user:pass@host:5432/db"
mode: "read" # or "write"
allowedTables: ["users", "products"]
allowedSchemas: ["public"]

| Subagent | Identifier | Requirements | Description |
|---|---|---|---|
| Reasoning | `reasoning` | Model spec | Analyzes questions, finds ambiguities |
| OS | `os` | Model spec, working_dir | File system and OS operations |
| Git | `git` | Model spec, working_dir | Git repository operations |
| Web | `web` | Model spec, working_dir | Web navigation and automation |
| Coding | `coding` | Model spec, working_dir | Code generation and analysis |
| Vision | `vision` | Model spec (vision-capable, e.g. gpt-4o), working_dir | Loads images and answers visual questions |
Example:
subagents:
reasoning: "deepseek::deepseek-reasoner"
web: "deepseek::deepseek-chat"
git: "deepseek::deepseek-chat"

| Plugin | Identifier | Description | Configuration |
|---|---|---|---|
| Logger | `logger` | Formatted output with colors | None |
| Todo | `todo` | Task management | None |
| Vault | `vault` | Encrypted secret storage | Requires VAULT_MASTER_KEY env var |
| Procedures | `procedures` | Multi-phase workflows | Auto-scans procedures/ directory |
| Knowledge | `knowledge` | Knowledge graph integration | None |
Example:
plugins:
- "logger"
- "todo"
- "vault"

Required when using the vector tool.

| Field | Type | Required | Description | Default |
|---|---|---|---|---|
| `vector_db` | string | Yes | Database type | - |
| `embedding_model` | string | Yes | Embedding model | - |
| `sqlite.db_path` | string | If using SQLite | Path to SQLite DB file | - |
| `sqlite.vector_dim` | integer | No | Vector dimensions | 1536 |
| `milvus.host` | string | If using Milvus | Milvus server host | localhost |
| `milvus.port` | integer | If using Milvus | Milvus server port | 19530 |
| `milvus.vector_dim` | integer | No | Vector dimensions | 1536 |
Example:
vector-storage:
vector_db: "sqlite"
embedding_model: "openai::text-embedding-3-small"
sqlite:
db_path: "./vector.db"
vector_dim: 1536

| Variable | Description | Required | Default |
|---|---|---|---|
| `AF_OPENAI_API_KEY` | OpenAI API key | If using OpenAI | - |
| `AF_DEEPSEEK_API_KEY` | DeepSeek API key | If using DeepSeek | - |
| `AF_TOGETHERAI_API_KEY` | TogetherAI API key | If using TogetherAI | - |
| `AF_LOG_LEVEL` | Log level | No | INFO |
| `AF_LOG_FILE` | Log file path | No | - |
| `VAULT_MASTER_KEY` | Vault encryption key (base64-encoded 32 bytes) | If using vault plugin | - |
Generate VAULT_MASTER_KEY with: openssl rand -base64 32
Contributions are welcome! Please feel free to submit a Pull Request.
[Add your license here]






