First off, thank you for this handy library. It has been a valuable tool in my research and experimentation.
I’d like to propose an enhancement to the library: support for multi-objective optimization (MOO) algorithms, particularly multi-objective evolutionary algorithms (MOEAs).
Currently, evosax is limited to single-objective optimization methods. However, many real-world problems involve multiple conflicting objectives, and solving them requires algorithms that can explore a Pareto-optimal set of solutions.
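For context on what "Pareto-optimal set" means here: a solution dominates another if it is no worse in every objective and strictly better in at least one, and the Pareto set is the set of non-dominated solutions. A minimal sketch of a non-dominated filter in plain JAX (function name and example data are mine, not part of evosax's API):

```python
import jax.numpy as jnp


def non_dominated_mask(F):
    """F: (n, m) array of objective values (minimization).

    Returns a boolean mask marking the Pareto-optimal (non-dominated) rows.
    """
    # no_worse[i, j]: solution j is <= solution i in every objective.
    no_worse = jnp.all(F[None, :, :] <= F[:, None, :], axis=-1)
    # strictly_better[i, j]: solution j is < solution i in at least one objective.
    strictly_better = jnp.any(F[None, :, :] < F[:, None, :], axis=-1)
    # Solution i is dominated if some j is no worse everywhere and strictly better somewhere.
    dominated = jnp.any(no_worse & strictly_better, axis=1)
    return ~dominated


F = jnp.array([[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]])
print(non_dominated_mask(F))  # -> [ True  True False  True]
```

The fully vectorized pairwise comparison is $O(n^2 m)$ but maps cleanly onto JAX's broadcasting, which is the style the rest of the library follows.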
Adding support for MOO would:
- Address a gap in many general-purpose optimization libraries.
- Attract researchers and practitioners from fields such as multi-task learning, engineering design, neural architecture search, and more.
- Extend the utility of the library beyond single-objective problems, making it more versatile and competitive with libraries like pymoo.
Suggested Additions
Popular MOEAs such as:
- NSGA-II (for problems with $M < 3$ objectives)
- NSGA-III (for $M \geq 3$ objectives)
- MO-CMA-ES
- GDE3
- MOPSO
- SPEA2
- MOEA/D
And, alongside the algorithms themselves:
- Pareto-front analysis tools (visualization, hypervolume, IGD+, spread metrics, etc.)
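To illustrate the analysis-tools bullet: for two objectives (minimization), the hypervolume indicator reduces to a sum of rectangular slabs after sorting the front by the first objective. A sketch under that assumption, with the function name and reference point chosen here for illustration:

```python
import jax.numpy as jnp


def hypervolume_2d(front, ref):
    """front: (n, 2) mutually non-dominated points (minimization).

    ref: (2,) reference point dominated by every front member.
    Returns the area dominated by the front and bounded by ref.
    """
    # Sort by objective 0 ascending; on a non-dominated 2-D front,
    # objective 1 is then descending.
    f = front[jnp.argsort(front[:, 0])]
    # Slab i spans objective 0 from f[i, 0] to the next point (or ref).
    widths = jnp.diff(jnp.concatenate([f[:, 0], ref[0:1]]))
    # Each slab's height runs from the point up to the reference.
    heights = ref[1] - f[:, 1]
    return jnp.sum(widths * heights)


front = jnp.array([[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]])
print(hypervolume_2d(front, jnp.array([4.0, 4.0])))  # -> 6.0
```

Exact hypervolume for $M \geq 3$ is substantially harder; in practice libraries fall back to specialized algorithms (e.g. WFG) or Monte Carlo approximation, which would be a design decision for evosax.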
I’d be happy to help with implementation or testing if this feature aligns with your development roadmap.