
Record: Cosine TTT + Multi-Order N-gram Cache (3-seed mean val_bpb=0.9850)#741

Open
andrewbaggio1 wants to merge 1 commit into openai:main from andrewbaggio1:submission/ngram-cache-sub1

Conversation


@andrewbaggio1 andrewbaggio1 commented Mar 25, 2026

Summary

3-seed mean val_bpb: 0.9850 (std=0.0011) | 15.62 MB artifact | 8xH100 SXM

First submission to combine cosine TTT with multi-order n-gram cache interpolation, breaking the sub-1.0 BPB barrier.

Results (8xH100 SXM)

Seed         val_bpb
1337         0.9842
42           0.9862
7            0.9846
Mean ± Std   0.9850 ± 0.0011

Approach

Two-phase eval, each using an independently legal technique:

Phase 1 — Cosine TTT (20 epochs, ~330s): single-pass AdamW with a cosine LR schedule and per-layer LR groups. Same approach as merged PR #549.
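As a minimal sketch of the schedule (the base LRs, group names, and min_lr below are illustrative placeholders, not the PR's actual values):

```python
import math

def cosine_lr(step, total_steps, base_lr, min_lr=0.0):
    """Cosine decay from base_lr at step 0 down to min_lr at the final step."""
    t = step / max(total_steps - 1, 1)
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * t))

# Hypothetical per-layer LR groups; a real setup would pass these to AdamW
# as separate param_groups and refresh each group's lr every step.
layer_base_lrs = {"embed": 1e-5, "blocks": 3e-5, "head": 1e-4}

def group_lrs(step, total_steps):
    """Current LR for each parameter group at the given step."""
    return {name: cosine_lr(step, total_steps, lr)
            for name, lr in layer_base_lrs.items()}
```

Single-pass means the schedule runs once over the eval data; no checkpoint or trajectory is selected after the fact.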

Phase 2 — N-gram Cache (~150s): sliding-window eval with multi-order backoff (2–5-gram) and entropy-adaptive alpha interpolation. Same approach as PR #702 (open, zero reviewer objections). p_mixed = (1 - a) * p_model + a * p_ngram — a single blended prediction per token, no min(NLL).
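A minimal, self-contained sketch of the cache logic, assuming the described score-first update and entropy-adaptive blend (orders, alpha bounds, and names like entropy_alpha are illustrative; window eviction is omitted for brevity):

```python
import math
from collections import defaultdict

class NGramCache:
    """Count-based n-gram cache with highest-order-available backoff."""
    def __init__(self, orders=(2, 3, 4, 5)):
        self.orders = sorted(orders, reverse=True)  # try longest context first
        self.counts = {n: defaultdict(lambda: defaultdict(int)) for n in self.orders}

    def ngram_probs(self, context):
        for n in self.orders:  # back off to shorter contexts on a miss
            ctx = tuple(context[-(n - 1):])
            if len(ctx) == n - 1 and ctx in self.counts[n]:
                c = self.counts[n][ctx]
                total = sum(c.values())
                return {tok: cnt / total for tok, cnt in c.items()}
        return None

    def update(self, context, token):
        for n in self.orders:
            if len(context) >= n - 1:
                self.counts[n][tuple(context[-(n - 1):])][token] += 1

def entropy_alpha(p_model, a_low=0.05, a_high=0.5):
    """Blend weight grows with model entropy; depends only on p_model,
    never on the target token."""
    h = -sum(p * math.log(p) for p in p_model.values() if p > 0)
    h_max = math.log(len(p_model))
    return a_low + (a_high - a_low) * (h / h_max if h_max > 0 else 0.0)

def mixed_prob(p_model, cache, context, token):
    """Score-first: the blended probability uses the cache state BEFORE
    this token is inserted, so the target never informs its own score."""
    p_ng = cache.ngram_probs(context)
    a = entropy_alpha(p_model) if p_ng else 0.0
    p = (1 - a) * p_model.get(token, 0.0) + a * (p_ng.get(token, 0.0) if p_ng else 0.0)
    cache.update(context, token)  # update only after scoring
    return p
```

The alpha falls back to 0 when no n-gram context has been seen, so early tokens are scored by the model alone.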

Total eval: ~517s (within 10-min budget).

Legality

  • No min(NLL) — each token gets exactly one prediction
  • Score-first n-gram cache — tokens scored before cache update
  • Entropy-adaptive alpha — depends on model uncertainty, NOT target token
  • Single-pass TTT — no multi-pass trajectory selection
  • Both techniques independently accepted/pending without objection

Credits

PR #518 (arch), PR #702 (n-gram concept), PR #481, #442, #398

Test plan

  • train_gpt.py compiles
  • 3 seeds, all artifacts < 16 MB
  • Training < 10 min, eval < 10 min
  • No min(NLL), no target-aware gating
  • PR only adds one folder

🤖 Generated with Claude Code

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
agalimova added a commit to agalimova/parameter-golf that referenced this pull request Mar 25, 2026
…ash4K)

Built on PR openai#741 with hyperparameter improvements found via
autoresearch-multi combinatorial search:
- XSA_LAST_N=6, BIGRAM_VOCAB_SIZE=4096, NGRAM_ORDER=7, NGRAM_ALPHA_HIGH=0.50

2-seed mean: 0.9258 (seeds 1337=0.9249, 42=0.9266)
Eval time: ~520s (under 10-min budget)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
