Add repo-memory to daily-backlog-burner, daily-perf-improver, and daily-qa workflows#190

Open
Copilot wants to merge 3 commits into main from copilot/use-repo-memory-tool

Conversation

Contributor

Copilot AI commented Feb 22, 2026

Three daily workflows that iterate over large issue/PR collections had no persistent memory, so each run restarted from scratch, risking duplicate work and preventing steady progress through large backlogs.

Changes

  • daily-backlog-burner — added repo-memory (id: daily-backlog-burner). Phase 2 now reads a backlog cursor on start and writes updated cursor + progress on completion, enabling incremental backlog traversal across runs.

  • daily-perf-improver — added repo-memory (id: daily-perf-improver). Phase 3 goal selection reads prior optimization attempts before picking a new target; writes outcome summary on completion to avoid repeating failed approaches.

  • daily-qa — added repo-memory (id: daily-qa). New step 0 reads QA history (known issues, checked areas) before analysis; new final step persists findings to prevent duplicate issue filing.

  • Docs updated for all three workflows to reflect memory usage.
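The read-before-write pattern described in the list above can be sketched roughly as follows. This is a hedged illustration only: the `attempts.md` file name, its line format, and the sample target string are assumptions for this sketch, not taken from the actual workflow prompts, which drive the agent's behavior in natural language rather than fixed scripts.

```shell
# Hypothetical sketch: daily-perf-improver consults prior attempts before
# selecting a new optimization target. Paths and formats are illustrative.
MEMORY_DIR="/tmp/gh-aw/repo-memory-daily-perf-improver/memory/daily-perf-improver"
ATTEMPTS_LOG="$MEMORY_DIR/attempts.md"
mkdir -p "$MEMORY_DIR"
touch "$ATTEMPTS_LOG"

TARGET="hot-path: JSON parsing in pkg/parser"

# Phase 3: read prior attempts before picking a target.
if grep -qF "$TARGET" "$ATTEMPTS_LOG"; then
  echo "skipping '$TARGET': already attempted in a previous run"
else
  echo "selecting '$TARGET' for optimization"
  # ... attempt the optimization ...
  # On completion, record the outcome so future runs avoid repeating it.
  echo "- $TARGET: outcome recorded, see run logs" >> "$ATTEMPTS_LOG"
fi
```

Because the log is append-only markdown inside the memory tree, it is picked up by the repo-memory commit step and survives to the next scheduled run.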

Follows the same frontmatter pattern and /tmp/gh-aw/repo-memory-{id}/ path convention already established by daily-test-improver and repo-assist.

```yaml
tools:
  repo-memory:
    - id: daily-backlog-burner
      description: "Tracks backlog progress cursor, processed items, and work history across runs"
      file-glob: ["memory/daily-backlog-burner/*.md", "memory/daily-backlog-burner/*.json"]
      max-file-size: 10240
      max-file-count: 4
```
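A minimal sketch of the cursor read/write that Phase 2 performs, under the `/tmp/gh-aw/repo-memory-{id}/` path convention cited above. The `cursor.json` file name and its `last_issue` field are assumptions for illustration; the real workflow expresses these steps as prompt instructions to the agent rather than a fixed script.

```shell
# Hypothetical sketch of the Phase 2 backlog-cursor pattern.
# The memory tree lives under /tmp/gh-aw/repo-memory-{id}/ per convention;
# cursor.json and its "last_issue" field are illustrative assumptions.
MEMORY_DIR="/tmp/gh-aw/repo-memory-daily-backlog-burner/memory/daily-backlog-burner"
CURSOR_FILE="$MEMORY_DIR/cursor.json"
mkdir -p "$MEMORY_DIR"

# Read the last processed issue number, defaulting to 0 on the first run.
LAST=$(sed -n 's/.*"last_issue":[[:space:]]*\([0-9]*\).*/\1/p' "$CURSOR_FILE" 2>/dev/null)
LAST=${LAST:-0}

# ... process a small batch of issues after $LAST ...
NEW_LAST=$((LAST + 5))

# Persist the updated cursor so the next run resumes where this one stopped.
printf '{"last_issue": %d}\n' "$NEW_LAST" > "$CURSOR_FILE"
echo "advanced cursor from $LAST to $NEW_LAST"
```

Writing the cursor only on successful completion keeps a failed run from silently skipping its batch on the next traversal.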

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • https://api.github.com/repos/github/gh-aw/git/ref/tags/v0.48.4
    • Triggering command: /usr/bin/gh gh api /repos/github/gh-aw/git/ref/tags/v0.48.4 --jq .object.sha (http block)
  • https://api.github.com/repos/githubnext/agentics/actions/workflows
    • Triggering command: /usr/bin/gh gh workflow list --all --json id,name,path,state (http block)

If you need me to access, download, or install something from one of these locations, you can either:

Original prompt

This section details the original issue you should resolve

<issue_title>Use the repo memory tool more</issue_title>
<issue_description>The repo assist workflow uses repo memory well to scale over very large collections of issues and other large-repo workloads by keeping memory between multiple runs. Realistically an agent can only investigate a handful of issues or PRs in a single run, and in a monorepo or multi-library repo it may need to cycle its attention.

Tastefully use this repo memory facility across more workflows to help with long-range scaling and planning over large repos or large collections of issues.

</issue_description>

Comments on the Issue (you are @copilot in this section)



…ly-qa workflows

Co-authored-by: dsyme <7204669+dsyme@users.noreply.github.com>
Copilot AI changed the title [WIP] Enhance the use of repo memory tool across workflows Add repo-memory to daily-backlog-burner, daily-perf-improver, and daily-qa workflows Feb 22, 2026
@dsyme dsyme marked this pull request as ready for review February 22, 2026 06:47


Development

Successfully merging this pull request may close these issues.

Use the repo memory tool more

2 participants