Phase 6: ~/.super-ollama/ layout, --no-profile flag, install.sh, build matrix#11

Draft
Copilot wants to merge 3 commits into main from copilot/phase-6-profile-prompts-packaging

Conversation


Copilot AI commented Apr 11, 2026

Implements the full Phase 6 scope: file-based profile/prompt configuration under ~/.super-ollama/, profile injection into every inference call with an opt-out flag, a cross-platform install script, and a CI build matrix.

internal/config

  • Extended Config struct with all config.toml fields: EmbeddingModel, DBPath, TodoPath, CaptureIntervalMinutes, CaptureEnabled
  • Added Dir() — resolves ~/.super-ollama/ (honours XDG_CONFIG_HOME)
  • Added LoadProfile() — reads profile.md; silent no-op when absent
  • Added LoadPrompt(name) — reads prompts/<name>.md; silent no-op when absent
  • Added Init() — creates the directory skeleton with default files; idempotent

internal/engine

  • Generate and StreamGenerate now accept a system string parameter, forwarded to api.GenerateRequest.System
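The signature change can be illustrated with a minimal sketch. The struct below mirrors only the relevant fields of ollama's `api.GenerateRequest` (the real struct has many more), and the function body is an assumption about the forwarding, not the actual engine code:

```go
package main

import "fmt"

// GenerateRequest mirrors only the fields of ollama's
// api.GenerateRequest that matter here.
type GenerateRequest struct {
	Model  string
	Prompt string
	System string
}

// Generate sketches the new signature: the caller-supplied system
// string is forwarded verbatim into the request's System field.
// An empty string means no system prompt, preserving old behaviour.
func Generate(model, prompt, system string) GenerateRequest {
	return GenerateRequest{Model: model, Prompt: prompt, System: system}
}

func main() {
	req := Generate("llama3:8b", "hello", "You are concise.")
	fmt.Println(req.System)
}
```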

cmd/super-ollama

  • Added --no-profile persistent flag (suppresses profile.md injection on all commands)
  • ask: injects profile.md + prompts/ask.md as the system prompt
  • chat: prepends profile as a system message before the conversation loop
  • Added config init subcommand; config show now displays embedding_model, directory path, and profile status
```shell
super-ollama --no-profile ask "summarise this"   # skip profile injection
super-ollama --model llama3:8b ask "hello"       # override model per-call
super-ollama config init                         # scaffold ~/.super-ollama/
```
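How `ask` combines `profile.md` with `prompts/ask.md` under the `--no-profile` flag can be sketched as below. The helper name `systemPrompt` and the double-newline separator are illustrative assumptions, not the actual `cmd/super-ollama` code:

```go
package main

import (
	"fmt"
	"strings"
)

// systemPrompt assembles the system prompt for ask: profile.md first
// (unless --no-profile was passed), then prompts/ask.md. Missing
// pieces are simply skipped.
func systemPrompt(profile, askPrompt string, noProfile bool) string {
	var parts []string
	if !noProfile && profile != "" {
		parts = append(parts, profile)
	}
	if askPrompt != "" {
		parts = append(parts, askPrompt)
	}
	return strings.Join(parts, "\n\n")
}

func main() {
	fmt.Println(systemPrompt("I prefer short answers.", "Answer briefly.", false))
	fmt.Println("---")
	// --no-profile suppresses the profile but keeps the command prompt.
	fmt.Println(systemPrompt("I prefer short answers.", "Answer briefly.", true))
}
```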

install.sh

One-liner cross-platform installer: detects OS/arch, downloads the matching release binary, and runs the equivalent of Init() to scaffold ~/.super-ollama/.
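The OS/arch detection step might look roughly like this. The function name, asset naming pattern, and supported architectures are illustrative assumptions, not taken from the actual install.sh:

```shell
#!/bin/sh
set -eu

# Map `uname -m` output to Go's GOARCH naming (assumed convention).
map_arch() {
  case "$1" in
    x86_64)        echo amd64 ;;
    aarch64|arm64) echo arm64 ;;
    *) echo "unsupported arch: $1" >&2; return 1 ;;
  esac
}

OS=$(uname -s | tr '[:upper:]' '[:lower:]')   # linux or darwin
ARCH=$(map_arch "$(uname -m)")

# Hypothetical asset name; the real release URL pattern may differ.
echo "asset: super-ollama_${OS}_${ARCH}"
```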

.github/workflows/build-matrix.yml

Builds super-ollama for linux/{amd64,arm64}, darwin/{amd64,arm64}, and windows/amd64. Separate test-prompt-pipeline job runs internal/config and cmd/config tests on all three OSes.
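The matrix described above would have roughly this shape. This is a hedged YAML fragment, not the actual workflow file: job names, action versions, and step details are assumptions.

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - { goos: linux,   goarch: amd64 }
          - { goos: linux,   goarch: arm64 }
          - { goos: darwin,  goarch: amd64 }
          - { goos: darwin,  goarch: arm64 }
          - { goos: windows, goarch: amd64 }
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-go@v5
      - run: go build ./cmd/super-ollama
        env:
          GOOS: ${{ matrix.goos }}
          GOARCH: ${{ matrix.goarch }}
```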

Copilot AI and others added 2 commits April 11, 2026 14:01
Agent-Logs-Url: https://github.com/Kritarth-Dandapat/super-ollama/sessions/6030480b-e014-4837-9a6c-862db7a47c33

Co-authored-by: Kritarth-Dandapat <141005022+Kritarth-Dandapat@users.noreply.github.com>
Copilot AI changed the title [WIP] Implement profile and prompts packaging in Phase 6 Phase 6: ~/.super-ollama/ layout, --no-profile flag, install.sh, build matrix Apr 11, 2026
Copilot AI requested a review from Kritarth-Dandapat April 11, 2026 14:06


Development

Successfully merging this pull request may close these issues.

Phase 6: profile, prompts, packaging

2 participants