super-ollama

super-ollama is a terminal-native local LLM tool built on the same inference stack as Ollama. It runs models in-process, with no HTTP server needed for day-to-day use: you talk to the model directly through the super-ollama CLI.

Features (today)

  • super-ollama ask — one-shot prompt (args or stdin)
  • super-ollama chat — interactive chat (streaming)
  • super-ollama config — show config path and default model
  • ollama — unchanged upstream-style binary for ollama serve, ollama pull, ollama run, etc.

The default model is gemma3:1b unless you set default_model in the config file or pass --model.
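
For example (a sketch; the stdin form assumes ask falls back to reading the prompt from stdin when no argument is given, as the feature list describes):

# one-shot prompt from an argument
./bin/super-ollama ask "Summarize goroutines in one sentence."

# one-shot prompt from stdin
echo "Summarize goroutines in one sentence." | ./bin/super-ollama ask

# print the config path and default model
./bin/super-ollama config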

Quick start

make build-super-ollama
# or: go build -o bin/super-ollama ./cmd/super-ollama

./bin/super-ollama ask "Explain what a goroutine is in one paragraph." --model gemma3:1b
./bin/super-ollama chat --model gemma3:1b

Pull models with the ollama CLI (built from this repo with go build -o ollama .). Note that ollama serve is not required for super-ollama ask / chat.
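
For example, building the upstream-style binary and pulling the default model:

go build -o ollama .
./ollama pull gemma3:1b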

Configuration

Location                                      Used when
$XDG_CONFIG_HOME/super-ollama/config.toml     XDG_CONFIG_HOME is set
~/.super-ollama/config.toml                   otherwise

Example:

default_model = "gemma3:1b"
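
To confirm which file is in effect, write the config and let super-ollama report it back (a sketch for the non-XDG case; the exact output format of config is not specified here):

mkdir -p ~/.super-ollama
printf 'default_model = "gemma3:1b"\n' > ~/.super-ollama/config.toml
./bin/super-ollama config   # should report this path and gemma3:1b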

Logging

By default, super-ollama logs at WARN on stderr, so scheduler noise stays quiet. For full detail:

OLLAMA_DEBUG=1 ./bin/super-ollama ask "hello"
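
Because logs go to stderr, the answer on stdout stays clean and diagnostics can be captured separately:

OLLAMA_DEBUG=1 ./bin/super-ollama ask "hello" >answer.txt 2>debug.log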

Build and test

go build -o ollama .
make build-super-ollama
go test ./...

Documentation

  • GitHub Wiki — roadmap and setup (enable the Wiki under repo Settings if the link is empty)
  • docs/wiki/ — Markdown source; publish to the GitHub Wiki with ./scripts/publish-wiki.sh after you create the first wiki page in the browser (GitHub only provisions the wiki git remote after that), as shown below
  • super-ollama-agent-plan.md — full phased product plan

# one-time: open the wiki in the browser, add a Home page, save
gh browse -w -R Kritarth-Dandapat/ollama-custom
./scripts/publish-wiki.sh

License

MIT (inherits upstream licensing for vendored and submodule portions; see the repository files for details).
