
[Priority 2] Implement incremental context streaming (Context Updater pattern) #62

@marknutter

Problem

Current recall uses static snapshots: all matching entries are loaded at once. This doesn't scale well:

  • Query with 50+ matches → 50+ subagent dispatches, even when the top 5 would answer the question
  • No support for long-running iterative analyses (drill deeper based on findings)
  • No dynamic refresh (swap stale entries for better ones mid-session)

Proposal (from AIGNE paper analysis)

Add ContextStream class that manages dynamic context loading:

```python
from rlm.updater import ContextStream  # proposed module, not yet implemented

stream = ContextStream(session_id='abc123')
stream.load_initial(top_entries=5)  # start with the highest-scoring entries

while not analysis_complete:
    findings = dispatch_subagents(stream.current_context)
    if stream.needs_refresh(findings):
        stream.swap(remove=['stale_entry'], add=['deeper_dive_entry'])
    stream.log_context_state()  # full provenance
```

Three modes:

  • Static snapshot — one-time load (current behavior)
  • Incremental streaming — progressive loading as reasoning unfolds
  • Adaptive refresh — replace stale/irrelevant fragments based on model feedback
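One way the three modes could hang together is a mode enum plus a guard on `swap`. This is a minimal sketch, not the implementation: `ContextMode`, the history tuples, and all method bodies are assumptions made for illustration.

```python
from enum import Enum, auto


class ContextMode(Enum):
    STATIC = auto()       # one-time load (current behavior)
    INCREMENTAL = auto()  # progressive loading as reasoning unfolds
    ADAPTIVE = auto()     # swap stale fragments based on model feedback


class ContextStream:
    """Illustrative skeleton of the proposed ContextStream API."""

    def __init__(self, session_id, mode=ContextMode.STATIC):
        self.session_id = session_id
        self.mode = mode
        self.current_context = []  # loaded entry ids
        self.history = []          # provenance: (action, payload) tuples

    def load_initial(self, top_entries=5):
        # A real implementation would fetch the top-scoring entries here;
        # the sketch only records the requested budget for provenance.
        self.history.append(("load", top_entries))

    def swap(self, remove, add):
        # Static snapshots are immutable; only the streaming modes refresh.
        if self.mode is ContextMode.STATIC:
            raise RuntimeError("static snapshots cannot be refreshed")
        self.current_context = [e for e in self.current_context if e not in remove]
        self.current_context.extend(add)
        self.history.append(("swap", (remove, add)))

    def log_context_state(self):
        # Return the full transition history (issue asks for full provenance).
        return list(self.history)
```

The point of the guard in `swap` is that static mode stays byte-for-byte compatible with current behavior, so existing callers see no change.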

Implementation

  1. Create rlm/updater.py with ContextStream class
  2. Integrate with graduated dispatch (use incremental mode automatically when >10 matches)
  3. Log all context state transitions to provenance log
  4. Add streaming support to recall pipeline
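Step 2's automatic mode selection could be as small as a threshold check at dispatch time. A hedged sketch; the function name and the `str` return values are placeholders, only the >10 threshold comes from the issue:

```python
INCREMENTAL_THRESHOLD = 10  # from the proposal: switch modes when >10 matches


def choose_mode(match_count: int) -> str:
    """Pick a context-loading mode from the number of matched entries."""
    return "incremental" if match_count > INCREMENTAL_THRESHOLD else "static"
```

Keeping the threshold in one named constant makes it easy to tune later without touching the graduated-dispatch call sites.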

Impact

  • Resource efficiency (only load what's needed)
  • Supports long-running iterative analyses
  • Better user experience (faster results for queries with clear top-ranked answers)
  • Full traceability of context evolution

Effort

3-4 days

Related

  • Context Updater from 'Everything is Context' paper
  • Graduated dispatch (already implemented)
  • Provenance logging (dependency)
