Add DE mutation dithering #104
Open
githubpsyche wants to merge 1 commit into RobertTLange:main from
Differential Evolution works by keeping a population of candidate solutions and repeatedly proposing new trial candidates from that population. A central step is mutation: take the difference between two population members, use that difference as a direction in parameter space, and add the scaled difference to a base candidate. In evosax's implementation, `differential_weight` controls that scale.

SciPy's `differential_evolution` exposes mutation dithering through its `mutation` argument: instead of passing one fixed mutation constant, users can pass a `(min, max)` tuple. SciPy then samples one differential weight from that range per generation, so different generations use different mutation scales while the scale stays shared across the trial population within a generation. In my work, carrying this feature over from SciPy to evosax helped find better fits more consistently; with a fixed `differential_weight`, I had a harder time matching SciPy DE fits under similar iteration budgets. After exploring other angles for achieving parity, dithering was the SciPy-side feature that made the comparison meaningfully closer.

This PR adds `differential_weight_min` and `differential_weight_max` to `DifferentialEvolution`. Defaults match the current fixed setting, so existing `differential_weight` behavior is unchanged. When `differential_weight_min < differential_weight_max`, `_ask` samples one generation-level differential weight from `U[min, max)`. I added population-based tests for unchanged defaults, active dithering, finite candidate generation, and scan-style usage.
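For reference, here is a minimal NumPy sketch of what generation-level dithering means; the function name `dithered_mutation` and the DE/rand/1 mutation variant are illustrative only, not evosax's actual `_ask` implementation:

```python
import numpy as np

def dithered_mutation(population, rng, f_min=0.5, f_max=1.0):
    """One generation of DE/rand/1 mutation with a dithered weight.

    A single differential weight F is drawn from U[f_min, f_max) per
    generation (mirroring SciPy's mutation=(min, max)), then shared
    across every trial vector produced in that generation.
    """
    n, _ = population.shape
    # Generation-level dithering: one F for the whole trial population.
    f = rng.uniform(f_min, f_max)
    trials = np.empty_like(population)
    for i in range(n):
        # Pick three distinct population members, none equal to i.
        r1, r2, r3 = rng.choice(np.delete(np.arange(n), i), size=3, replace=False)
        # Classic DE mutation: base member plus scaled difference vector.
        trials[i] = population[r1] + f * (population[r2] - population[r3])
    return trials, f

rng = np.random.default_rng(0)
pop = rng.normal(size=(8, 3))
trials, f = dithered_mutation(pop, rng)
```

With `f_min == f_max` this degenerates to the existing fixed-weight behavior, which is why defaulting both new parameters to the current `differential_weight` leaves prior results unchanged.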