
Add non-record 10min/16MB submission: Wavelet-Lite PR549 Parallel Muon (1.1483)#680

Open
bro4all wants to merge 1 commit into openai:main from bro4all:submission/2026-03-24-wavelet-lite-pr549-nonrecord

Conversation


@bro4all bro4all commented Mar 25, 2026

Summary

This PR adds a non-record track_10min_16mb submission under:

  • records/track_10min_16mb/2026-03-24_WaveletLite_PR549_ParallelMuon/

The submission is a PR #549-derived Parallel Muon stack with one architectural change: a tiny causal wavelet-lite mixer inside each residual block.
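The PR text does not include the mixer itself, so purely as an illustration of what a "tiny causal wavelet-lite mixer" could look like, here is a minimal Haar-style sketch (function name, weights, and exact form are hypothetical, not taken from the submission):

```python
import numpy as np

def causal_wavelet_lite_mix(x, w_low=0.1, w_high=0.1):
    """Illustrative causal Haar-style mixer (hypothetical; not the PR's code).

    x: (seq_len, dim) residual-stream activations. Position t mixes in a
    low-pass (x[t] + x[t-1]) / 2 and a high-pass (x[t] - x[t-1]) / 2 built
    only from the current and previous positions, so no future information
    leaks into position t.
    """
    # Shift x right by one step; position 0 sees zeros as its "past".
    prev = np.vstack([np.zeros_like(x[:1]), x[:-1]])
    low = (x + prev) / 2.0   # smooth (approximation) component
    high = (x - prev) / 2.0  # detail component
    return x + w_low * low + w_high * high
```

The key property such a mixer must preserve is causality: changing inputs at positions after t must not change the output at t.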

Final result

  • Exact saved-artifact roundtrip: val_bpb=1.14825550
  • Total submission size: 15,859,711 bytes
  • Margin under 16,000,000-byte cap: 140,289 bytes
  • 8xH100 training pace: 90.24 ms/step
  • Post-EMA diagnostic at train-time wallclock cap: val_bpb=1.1400
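For reference, the reported margin is simply the track cap minus the total submission size:

```python
# Byte-budget check for the numbers reported above.
CAP_BYTES = 16_000_000         # track_10min_16mb size cap
SUBMISSION_BYTES = 15_859_711  # reported total submission size
margin = CAP_BYTES - SUBMISSION_BYTES
print(margin)  # → 140289, matching the reported margin
```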

Why submit as non-record

This run does not beat the current SOTA, so it is intentionally submitted as a non-record entry under the standard 10min/16MB track.

Why it is not duplicate work

The closest prior work is PR #549. This submission differs in three ways: it adds a new in-block causal wavelet mixer, removes TTT from the final run, and trims the bigram table to fit the byte budget.
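The PR does not show how the bigram table was trimmed; one plausible approach, sketched here with hypothetical names and a hypothetical per-entry size, is to keep only the highest-count bigrams that fit under the remaining byte budget:

```python
def trim_bigram_table(counts, byte_budget, bytes_per_entry=6):
    """Hypothetical sketch (not the submission's code): keep only the
    highest-count bigrams so the serialized table fits under byte_budget.

    counts: dict mapping (prev_byte, next_byte) -> count.
    Returns a trimmed dict with at most byte_budget // bytes_per_entry entries.
    """
    max_entries = byte_budget // bytes_per_entry
    keep = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:max_entries]
    return dict(keep)
```

The design choice here is greedy by frequency: dropping the rarest bigrams first should cost the least modeling quality per byte saved, though the actual submission may use a different criterion.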

Additional nearby prior work is addressed in the README.

Included files

Per the repo submission rules, this PR adds only a new folder containing:

  • README.md
  • submission.json
  • train_gpt.py
  • final_model.int6.ptz
  • training log
  • exact roundtrip eval log
  • local results.tsv snapshot

Notes

  • The final artifact and exact roundtrip eval were recovered from a persisted full-precision checkpoint after the training pod exited, and both logs are included in the submission folder.
  • The README documents the full Runpod recipe, nearest prior PRs, exact structural differences, and failed intermediate attempts.
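Given the `final_model.int6.ptz` artifact name, the checkpoint appears to be stored in int6. As a rough illustration only (the submission's actual quantization scheme is not shown in this PR), a per-tensor symmetric int6 round trip might look like:

```python
import numpy as np

def quantize_int6(w):
    """Hypothetical per-tensor symmetric int6 quantization sketch.

    int6 covers [-32, 31]; values are stored as small integers plus one
    float scale and dequantized before evaluation. An "exact roundtrip"
    eval then re-runs validation on the dequantized saved artifact.
    """
    max_abs = np.abs(w).max()
    scale = max_abs / 31.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(w / scale), -32, 31).astype(np.int8)
    return q, scale

def dequantize_int6(q, scale):
    # Reconstruct float weights; error is bounded by half a quantization step.
    return q.astype(np.float32) * scale
```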
