Add DE mutation dithering #104

Open
githubpsyche wants to merge 1 commit into RobertTLange:main from githubpsyche:add-de-dithering
Conversation

@githubpsyche

Differential Evolution works by keeping a population of candidate solutions and repeatedly proposing new trial candidates from that population. A central step is mutation: take the difference between two population members, use that difference as a direction to move in parameter space, and add the scaled difference to a base candidate. In evosax's implementation, differential_weight controls that scale.
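The mutation step described above can be sketched as follows. This is an illustrative NumPy version of the classic DE/rand/1 rule (the function name and structure are my own, not evosax's internals); `F` plays the role of `differential_weight`:

```python
import numpy as np

def de_mutation(population, base_idx, F, rng):
    """DE/rand/1 mutation: v = x_r0 + F * (x_r1 - x_r2).

    Picks three distinct members (excluding the base candidate),
    uses the difference of two as a search direction, and adds the
    scaled difference to the third.
    """
    n = len(population)
    candidates = [i for i in range(n) if i != base_idx]
    r0, r1, r2 = rng.choice(candidates, size=3, replace=False)
    return population[r0] + F * (population[r1] - population[r2])

rng = np.random.default_rng(0)
pop = rng.normal(size=(10, 3))  # 10 candidates in a 3-D search space
trial = de_mutation(pop, base_idx=0, F=0.8, rng=rng)
print(trial.shape)  # (3,)
```

A larger `F` takes bigger steps along population-difference directions, which is exactly the scale that dithering varies per generation.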

SciPy's differential_evolution exposes mutation dithering through its mutation argument: instead of a single fixed mutation constant, users can pass a (min, max) tuple, and SciPy samples one differential weight from that range per generation. Different generations then use different mutation scales, while the scale stays shared across the trial population within a generation. In my work, carrying this feature over from SciPy to evosax helped find better fits more consistently; with a fixed differential_weight, I had a harder time matching SciPy DE fits under similar iteration budgets. After exploring other routes to parity, dithering was the SciPy-side feature that made the comparison meaningfully closer.
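For reference, here is how the two modes look in SciPy on a toy sphere objective (the objective and bounds are made up for illustration; the `mutation` argument is SciPy's actual API):

```python
import numpy as np
from scipy.optimize import differential_evolution

def sphere(x):
    # Simple convex test objective with minimum 0 at the origin.
    return float(np.sum(x ** 2))

bounds = [(-5.0, 5.0)] * 3

# Fixed mutation constant: the same differential weight every generation.
res_fixed = differential_evolution(sphere, bounds, mutation=0.7, seed=0)

# Dithering: one weight sampled per generation from U[0.5, 1.0).
res_dither = differential_evolution(sphere, bounds, mutation=(0.5, 1.0), seed=0)

print(res_fixed.fun, res_dither.fun)
```

Passing a tuple is the only change needed on the SciPy side, which is what makes the feature easy to lean on there and noticeable when it is missing elsewhere.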

This PR adds differential_weight_min and differential_weight_max to DifferentialEvolution. Defaults match the current fixed setting, so existing differential_weight behavior is unchanged. When differential_weight_min < differential_weight_max, _ask samples one generation-level differential weight from U[min, max).
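A minimal sketch of that sampling rule (hypothetical helper name and NumPy for clarity; the PR itself operates on evosax's JAX state inside _ask):

```python
import numpy as np

def sample_differential_weight(rng, weight_min, weight_max):
    """One generation-level weight: U[min, max) when dithering is
    active, otherwise the fixed differential_weight (min == max)."""
    if weight_min < weight_max:
        return rng.uniform(weight_min, weight_max)
    return weight_min

rng = np.random.default_rng(42)
F_dither = sample_differential_weight(rng, 0.5, 1.0)  # dithering active
F_fixed = sample_differential_weight(rng, 0.8, 0.8)   # defaults: unchanged
```

Because the weight is drawn once per generation rather than per candidate, all trial vectors in a generation share one scale, matching SciPy's semantics.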

I added population-based tests for unchanged defaults, active dithering, finite candidate generation, and scan-style usage.
