
Libmoon is a flexible and extensible multi-objective optimization platform. Our goal is to make MOO great again. We released libmoon not because developing MOO is easy, but because it is hard.


Moon: A Standardized/Flexible Framework for MultiObjective OptimizatioN


Introduction

Moon is a multiobjective optimization framework that spans single-objective through multiobjective optimization. It aims to deepen the understanding of optimization problems and to enable fair comparisons between MOO algorithms. It was submitted to the NeurIPS 2024 Datasets and Benchmarks track.

"I raise my cup to invite the moon.
With my shadow we become three from one."
-- Li Bai

Main Contributors

  • Xiaoyuan Zhang (Maintainer of Pareto set learning, gradient-based solver)
  • Ji Cheng
  • Liao Zhao (Maintainer of MOBO)
  • Weiduo Liao
  • Zhe Zhao
  • Xi Lin
  • Cheng Gong
  • Longcan Chen
  • YingYing Yu

Advisory Board

  • Prof. Jingda Deng (Xi'an Jiaotong University) (for advice on high-dimensional hypervolume computation)
  • Prof. Yifan Chen (Hong Kong Baptist University) (for advice on operations research)
  • Prof. Ke Shang (Shenzhen University) (for advice on approximate hypervolume-based methods)
  • Prof. Han Zhao (University of Illinois at Urbana-Champaign) (for advice on fairness classification)

Correspondence

The corresponding author is Chair Prof. Qingfu Zhang (FIEEE, City University of Hong Kong).

Contact

Resources

For more information on methodologies, please visit our GitHub repository. Contributions and stars are welcome!

(1) A standardized gradient-based framework.

Optimization Problem Classes

Problem Class Details

For more information on problem specifics, please refer to the Readme_problem.md file.

Synthetic Problems

Here's a list of synthetic problems along with relevant research papers and project/code links:

| Problem | Paper | Project/Code |
|---------|-------|--------------|
| ZDT     | Paper | Project      |
| DTLZ    | Paper | Project      |
| MAF     | Paper | Project      |
| WFG     | Paper | Code         |
| Fi's    | Paper | Code         |
| RE      | Paper | Code         |
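As a concrete instance, ZDT1 (the first entry above) is a two-objective problem over $x \in [0, 1]^n$ with $f_1 = x_1$ and $f_2 = g \, (1 - \sqrt{f_1 / g})$, where $g = 1 + 9 \cdot \mathrm{mean}(x_2, \dots, x_n)$. A minimal NumPy sketch of the problem definition, independent of libmoon's own implementation:

```python
import numpy as np

def zdt1(x):
    """ZDT1: two objectives over x in [0, 1]^n.
    The Pareto front is f2 = 1 - sqrt(f1), attained when x[1:] == 0."""
    x = np.asarray(x, dtype=float)
    f1 = x[0]
    g = 1.0 + 9.0 * np.mean(x[1:])
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.array([f1, f2])
```

On a Pareto-optimal point such as `[0.25, 0, 0, 0]`, this yields `g = 1` and objectives `[0.25, 0.5]`.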

Multitask Learning Problems

This section details problems related to multitask learning, along with their corresponding papers and project/code references:

| Problem                  | Paper       | Project/Code |
|--------------------------|-------------|--------------|
| MO-MNISTs                | PMTL        | COSMOS       |
| Fairness Classification  | COSMOS      | COSMOS       |
| Federated Learning       | Federal MTL | COSMOS       |
| Synthetic (DST, FTS, ...)| Envelop     | Project      |
| Robotics (MO-MuJoCo, ...)| PGMORL      | Code         |
  • Gradient-based Solvers.

    | Method | Property | #Obj | Support | Published | Complexity |
    |--------|----------|------|---------|-----------|------------|
    | EPO code | Exact solutions. | Any | Y | ICML 2020 | $O(m^2 n K)$ |
    | COSMOS code | Approximate exact solutions. | Any | Y | ICDM 2021 | $O(m n K)$ |
    | MOO-SVGD code | A set of diverse Pareto solutions. | Any | Y | NeurIPS 2021 | $O(m^2 n K^2)$ |
    | MGDA code | Arbitrary Pareto solutions; location highly affected by initialization. | Any | Y | NeurIPS 2018 | $O(m^2 n K)$ |
    | PMTL code | Pareto solutions in sectors. | 2 (3 is difficult) | Y | NeurIPS 2019 | $O(m^2 n K^2)$ |
    | PMGDA | Pareto solutions satisfying any preference. | Any | Y | Under review | $O(m^2 n K)$ |
    | GradientHV (WangHao) code | Gradient-based hypervolume maximization. | 2/3 | Y | CEC 2023 | $O(m^2 n K^2)$ |
    | Aggregation-function based (e.g., Tche, mTche, LS, PBI, ...) | Pareto solutions via aggregation functions. | Any | Y | -- | -- |

    Here, $m$ is the number of objectives, $K$ is the number of samples, and $n$ is the number of decision variables. For neural-network-based methods, $n$ is the number of parameters and is therefore very large (>10,000); $K$ is also large (e.g., 20-50), while $m$ is small (e.g., 2-4). As a result, the $m^2$ factor is harmless, while factors of $n^2$ or $K^2$ dominate the cost.

    Time complexity of the gradient-based methods, from cheapest to most expensive:

    1. Tier 1: GradAggSolver
    2. Tier 2: MGDASolver, EPOSolver, PMTLSolver
    3. Tier 3: GradHVSolver
    4. Tier 4: MOOSVGDSolver
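The aggregation functions named in the table (Tche, LS, ...) are simple to state, which is why GradAggSolver sits in the cheapest tier. A sketch of two standard scalarizations (the function names here are illustrative, not libmoon's API):

```python
import numpy as np

def tchebycheff(f, pref, ideal):
    """Weighted Tchebycheff scalarization: max_i pref_i * |f_i - ideal_i|.
    Minimizing it over varying preference vectors can reach any Pareto-optimal
    point, including those on non-convex parts of the front."""
    return float(np.max(pref * np.abs(np.asarray(f) - np.asarray(ideal))))

def linear_scalarization(f, pref):
    """Weighted-sum (LS) aggregation: cheapest, but it cannot reach
    solutions on non-convex regions of the Pareto front."""
    return float(np.dot(pref, f))
```

The cost per step is a single weighted reduction over the $m$ objectives, so a batch of $K$ solutions costs $O(m n K)$ gradient work with no pairwise $K^2$ term.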

    Current support: GradAggSolver, MGDASolver, EPOSolver, MOOSVGDSolver, GradHVSolver, PMTLSolver.


Supported Solvers

Current Support

Libmoon includes a variety of solvers tailored for different needs:

  • GradAggSolver
  • MGDASolver
  • EPOSolver
  • MOOSVGDSolver (*)
  • GradHVSolver
  • PMTLSolver

(*) The original MOO-SVGD code does not include an implementation for Multitask Learning (MTL). Our release of MOO-SVGD is the first open-source code that supports MTL.
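For intuition about MGDASolver, the two-objective case has a closed form: the common descent direction is the minimum-norm point in the convex hull of the two gradients (Désidéri's MGDA). A self-contained sketch of that closed form, not libmoon's implementation:

```python
import numpy as np

def mgda_direction(g1, g2):
    """Min-norm element of conv{g1, g2}: d = a*g1 + (1-a)*g2 with
    a = clip(<g2 - g1, g2> / ||g1 - g2||^2, 0, 1).
    Stepping along -d decreases both objectives (or d = 0 at a
    Pareto-stationary point)."""
    diff = g1 - g2
    denom = float(diff @ diff)
    if denom == 0.0:          # identical gradients: either one works
        return g1.copy()
    a = float(np.clip((g2 - g1) @ g2 / denom, 0.0, 1.0))
    return a * g1 + (1.0 - a) * g2
```

For orthogonal unit gradients the direction is their average; when one gradient dominates, the clip returns the shorter gradient alone, matching the min-norm geometry.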

PSL (Pareto set learning) Solvers

Libmoon supports various models of PSL solvers, categorized as follows:

  • EPO-based PSL model
  • Agg-based PSL model
  • Hypernetwork-based PSL model
  • ConditionalNet-based PSL model
  • Simple PSL model
  • Generative PSL model
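To illustrate the idea these models share: a PSL model maps a preference vector to an (approximately) Pareto-optimal solution and is trained over sampled preferences, so one model represents the whole Pareto set. A toy sketch with a linear model on a one-dimensional biobjective problem ($f_1 = x^2$, $f_2 = (x-1)^2$, whose Pareto set is $[0, 1]$); everything below is illustrative and not libmoon code:

```python
import numpy as np

rng = np.random.default_rng(0)

def agg_grad(x, lam):
    # d/dx of the linear scalarization lam*f1 + (1-lam)*f2
    # with f1 = x^2 and f2 = (x - 1)^2
    return 2 * lam * x + 2 * (1 - lam) * (x - 1)

# PSL "model": x(lam) = w*lam + b, trained so that for every sampled
# preference lam the aggregated objective decreases (SGD via chain rule).
w, b = 0.0, 0.5
lr = 0.1
for _ in range(5000):
    lam = rng.uniform(0.0, 1.0)   # sample a preference
    x = w * lam + b               # model output for this preference
    g = agg_grad(x, lam)
    w -= lr * g * lam             # dL/dw = g * lam
    b -= lr * g                   # dL/db = g
```

For this problem the scalarized optimum is $x^*(\lambda) = 1 - \lambda$, so after training the model should output roughly `1 - lam` for any preference, i.e., it has learned the entire Pareto set rather than a single solution.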

MOEA/D Framework

Currently Supported

Upcoming Releases

ML Pretrained Methods

  • HV Net, a neural network model for approximating the hypervolume indicator, available here.

Installation

Libmoon is available on PyPI. You can install it using pip:

pip install libmoon==0.1.11


Example: solving a synthetic problem in a few lines of code.

from libmoon.solver.gradient.methods import EPOSolver
from libmoon.util_global.initialization import synthetic_init
from libmoon.util_global.weight_factor import uniform_pref
from libmoon.util_global.problems import get_problem  # import path may differ across libmoon versions

problem = get_problem(problem_name='ZDT1')
prefs = uniform_pref(n_prob=5, n_obj=problem.n_obj, clip_eps=1e-2)  # 5 evenly spread preference vectors
solver = EPOSolver(problem, step_size=1e-2, n_iter=1000, tol=1e-2)
res = solver.solve(x=synthetic_init(problem, prefs), prefs=prefs)

Example of MTL


