
Add support for adamw-type optimizers #99

Merged

RobertTLange merged 4 commits into RobertTLange:main from TheodoreWolf:theo/add-adamw-support on Apr 2, 2026
Conversation

TheodoreWolf (Contributor) commented Nov 10, 2025

Hi there, thanks for the wonderful library!

I noticed that there was no support for adamw-type optimizers, which is a shame, since decoupled weight-decay regularization is known to help find more stable minima.

I wanted to add these, and it turns out to be a simple change: pass state.mean as an extra argument when calling update on the optimizer. Optax's API supports this, and so does your core.optimizer, so this shouldn't break anything (hopefully). All tests pass. I thought about adding a specific test for this, but I couldn't find a clean way to fit one into the current test structure, so I've opted for simply adding adamw to the RL notebook. Let me know if you'd prefer something different.
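To make the mechanism concrete, here is a minimal sketch of why the extra argument matters, using plain optax (the array shapes and hyperparameters are illustrative, not taken from the PR):

```python
import jax.numpy as jnp
import optax

params = jnp.ones(4)         # stand-in for the strategy's state.mean
grads = jnp.full((4,), 0.1)  # stand-in for the estimated gradient

tx = optax.adamw(learning_rate=1e-2, weight_decay=1e-4)
opt_state = tx.init(params)

# adamw applies decoupled weight decay directly to the parameters,
# so update() needs them as a third argument; optimizers like plain
# adam accept the argument too and simply ignore it.
updates, opt_state = tx.update(grads, opt_state, params)
params = optax.apply_updates(params, updates)
```

Because every optax GradientTransformation accepts the optional params argument, always passing state.mean is backward-compatible with the optimizers already supported.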

RobertTLange (Owner) commented

Thanks, Theo. I added a proper regression test covering a full optax.adamw ask/eval/tell loop, and I also tightened the dependency window and CI config so the existing test matrix runs reliably again. Once this rerun finishes green, I will merge it.
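For context, a minimal sketch of what such a regression test could look like, assuming an OpenES-style strategy whose optimizer is selected by name (the import path, the opt_name keyword, and the method names below are assumptions about the evosax API, not confirmed by this thread):

```python
import jax
import jax.numpy as jnp
from evosax import OpenES  # assumed import path

def test_adamw_ask_eval_tell():
    key = jax.random.PRNGKey(0)
    # Assumption: "adamw" becomes a valid optimizer name after this PR.
    strategy = OpenES(popsize=16, num_dims=2, opt_name="adamw")
    es_params = strategy.default_params
    state = strategy.initialize(key, es_params)

    for _ in range(10):
        key, key_ask = jax.random.split(key)
        x, state = strategy.ask(key_ask, state, es_params)
        fitness = jnp.sum(x**2, axis=-1)  # sphere objective, minimized at 0
        state = strategy.tell(x, fitness, state, es_params)

    # With decoupled weight decay applied to state.mean, the search
    # distribution should still be well-behaved after several updates.
    assert jnp.all(jnp.isfinite(state.mean))
```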

@RobertTLange RobertTLange merged commit 1de5b67 into RobertTLange:main on Apr 2, 2026
9 checks passed
