Overcome the Difficulties of NSGA-II via Truthful Crowding Distance with Theoretical Guarantees
- URL: http://arxiv.org/abs/2407.17687v1
- Date: Thu, 25 Jul 2024 01:09:58 GMT
- Title: Overcome the Difficulties of NSGA-II via Truthful Crowding Distance with Theoretical Guarantees
- Authors: Weijie Zheng, Benjamin Doerr
- Abstract summary: The NSGA-II is proven to encounter difficulties for more than two objectives.
This paper proposes a simple variant of the crowding distance, called the truthful crowding distance.
It is the first NSGA-II variant with a simple structure that performs well for many objectives with theoretical guarantees.
- Score: 14.309243378538014
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The NSGA-II is proven to encounter difficulties for more than two objectives, and the deduced reason is that its crowding distance treats the different objectives independently. The recent theoretical efficiency results for the NSGA-III and the SMS-EMOA also support this reasoning, as both algorithms account for dependencies between the objectives in the secondary criterion applied after non-dominated sorting, but at the cost of a more complicated structure or a more expensive computation. It remained open, however, whether a simple modification of the original crowding distance can help. This paper proposes such a variant, called the truthful crowding distance. It inherits the simple structure of summing one component per objective. For each objective, the solutions are sorted in descending order of that objective's value, and the component of a solution is the smallest normalized L1 distance between it and the solutions at earlier positions of the sorted list. Summing these components over all objectives gives the truthful crowding distance. We call NSGA-II-T the NSGA-II variant that replaces the original crowding distance with the truthful one and sequentially updates the crowding distance values after each removal. We prove that the NSGA-II-T efficiently covers the full Pareto front of the many-objective mOneMinMax and mOJZJ benchmarks, in contrast to the exponential runtime of the original NSGA-II. We also prove that it achieves a slightly better approximation of the Pareto front of OneMinMax than the original NSGA-II with sequential survival selection. It is thus the first NSGA-II variant with a simple structure that provably performs well for many objectives.
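The abstract's description of the truthful crowding distance translates directly into a short procedure. The sketch below is an illustration based only on the abstract, not the paper's exact pseudocode; in particular, measuring the L1 distance in objective space, normalizing by each objective's range, and assigning an infinite component to the first solution of each sorted list are assumptions made for this sketch.

```python
import math

def truthful_crowding_distance(objs):
    """Truthful crowding distance of each solution, given its objective vector.

    Sketch following the abstract: one component per objective, where the
    component is the smallest normalized L1 distance to any solution placed
    earlier in the descending sorting of that objective.
    """
    n = len(objs)
    if n == 0:
        return []
    m = len(objs[0])

    # Per-objective ranges used to normalize the L1 distance
    # (assumption: max minus min, guarded against a zero range).
    ranges = []
    for i in range(m):
        vals = [v[i] for v in objs]
        r = max(vals) - min(vals)
        ranges.append(r if r > 0 else 1.0)

    def norm_l1(a, b):
        # Normalized L1 distance between two objective vectors.
        return sum(abs(a[i] - b[i]) / ranges[i] for i in range(m))

    cd = [0.0] * n
    for i in range(m):
        # Sort solution indices by descending value of objective i.
        order = sorted(range(n), key=lambda k: objs[k][i], reverse=True)
        for pos, k in enumerate(order):
            if pos == 0:
                cd[k] += math.inf  # no earlier solution in this sorting (assumption)
            else:
                # Component: smallest normalized L1 distance to any solution
                # at an earlier position of the sorted list.
                cd[k] += min(norm_l1(objs[k], objs[j]) for j in order[:pos])
    return cd


# Tiny example with three bi-objective points on a front.
print(truthful_crowding_distance([(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]))
```

Per the abstract, NSGA-II-T would apply this quantity sequentially during survival selection: after each removal of the solution with the smallest value, the crowding distance values of the remaining solutions are updated before the next removal.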
Related papers
- Multiobjective Vehicle Routing Optimization with Time Windows: A Hybrid Approach Using Deep Reinforcement Learning and NSGA-II [52.083337333478674]
This paper proposes a weight-aware deep reinforcement learning (WADRL) approach designed to address the multiobjective vehicle routing problem with time windows (MOVRPTW).
The Non-dominated sorting genetic algorithm-II (NSGA-II) method is then employed to optimize the outcomes produced by the WADRL.
arXiv Detail & Related papers (2024-07-18T02:46:06Z)
- Runtime Analyses of NSGA-III on Many-Objective Problems [10.955844285189372]
We present the first runtime analyses of NSGA-III on the popular many-objective benchmark problems mLOTZ, mOMM, and mCOCZ.
We show how these parameters should be scaled with the problem dimension, the number of objectives and the fitness range.
To our knowledge, these are the first runtime analyses for NSGA-III for more than 3 objectives.
arXiv Detail & Related papers (2024-04-17T14:39:14Z)
- NICE: Improving Panoptic Narrative Detection and Segmentation with Cascading Collaborative Learning [77.95710025273218]
We propose a unified framework called NICE that can jointly learn two panoptic narrative recognition tasks.
By linking PNS and PND in series with the barycenter of segmentation as the anchor, our approach naturally aligns the two tasks.
NICE surpasses all existing methods by a large margin, achieving 4.1% for PND and 2.9% for PNS over the state-of-the-art.
arXiv Detail & Related papers (2023-10-17T03:42:12Z)
- Runtime Analysis for the NSGA-II: Proving, Quantifying, and Explaining the Inefficiency For Many Objectives [16.904475483445452]
The NSGA-II is one of the most prominent algorithms to solve multi-objective optimization problems.
We show that the NSGA-II is less effective for larger numbers of objectives.
arXiv Detail & Related papers (2022-11-23T16:15:26Z)
- A Mathematical Runtime Analysis of the Non-dominated Sorting Genetic Algorithm III (NSGA-III) [9.853329403413701]
The Non-dominated Sorting Genetic Algorithm II (NSGA-II) is the most prominent multi-objective evolutionary algorithm for real-world applications.
We provide the first mathematical runtime analysis of the NSGA-III, a refinement of the NSGA-II aimed at better handling more than two objectives.
arXiv Detail & Related papers (2022-11-15T15:10:36Z)
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We introduce and justify two weaknesses of implicit GNNs: the constrained expressiveness due to their limited effective range for capturing long-range dependencies, and their lack of ability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
- From Understanding the Population Dynamics of the NSGA-II to the First Proven Lower Bounds [10.107150356618588]
We prove that the NSGA-II with suitable population size needs $\Omega(Nn\log n)$ function evaluations.
For the computation of the crowding distance contributions of the two objectives, we even obtain a runtime estimate that is tight including the leading constant.
arXiv Detail & Related papers (2022-09-28T10:11:20Z)
- Lookback for Learning to Branch [77.32867454769936]
Bipartite Graph Neural Networks (GNNs) have been shown to be an important component of deep learning based Mixed-Integer Linear Program (MILP) solvers.
Recent works have demonstrated the effectiveness of such GNNs in replacing the branching (variable selection) in branch-and-bound (B&B) solvers.
arXiv Detail & Related papers (2022-06-30T02:33:32Z)
- Approximation Guarantees for the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) [16.904475483445452]
We study how well the NSGA-II approximates the Pareto front when the population size is smaller than the size of the Pareto front.
This is the first mathematical work on the approximation ability of the NSGA-II and the first runtime analysis for the steady-state NSGA-II.
arXiv Detail & Related papers (2022-03-05T09:53:30Z)
- Efficient Person Search: An Anchor-Free Approach [86.45858994806471]
Person search aims to simultaneously localize and identify a query person from realistic, uncropped images.
To achieve this goal, state-of-the-art models typically add a re-id branch upon two-stage detectors like Faster R-CNN.
In this work, we present an anchor-free approach to efficiently tackling this challenging task by introducing several dedicated designs.
arXiv Detail & Related papers (2021-09-01T07:01:33Z)
- GEO: Enhancing Combinatorial Optimization with Classical and Quantum Generative Models [62.997667081978825]
We introduce a new framework that leverages machine learning models known as generative models to solve optimization problems.
We focus on a quantum-inspired version of GEO relying on tensor-network Born machines.
We show its superior performance when the goal is to find the best minimum given a fixed budget for the number of function calls.
arXiv Detail & Related papers (2021-01-15T18:18:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.