Analysing the Robustness of NSGA-II under Noise
- URL: http://arxiv.org/abs/2306.04525v1
- Date: Wed, 7 Jun 2023 15:33:54 GMT
- Title: Analysing the Robustness of NSGA-II under Noise
- Authors: Duc-Cuong Dang, Andre Opris, Bahare Salehi, Dirk Sudholt
- Abstract summary: The first runtime analyses of the famous and highly cited EMO algorithm NSGA-II have emerged.
It is unclear how and when NSGA-II can outperform GSEMO.
- Score: 1.7205106391379026
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Runtime analysis has produced many results on the efficiency of simple
evolutionary algorithms like the (1+1) EA, and its analogue called GSEMO in
evolutionary multiobjective optimisation (EMO). Recently, the first runtime
analyses of the famous and highly cited EMO algorithm NSGA-II have emerged,
demonstrating that practical algorithms with thousands of applications can be
rigorously analysed. However, these results only show that NSGA-II has the same
performance guarantees as GSEMO and it is unclear how and when NSGA-II can
outperform GSEMO. We study this question in noisy optimisation and consider a
noise model that adds large amounts of posterior noise to all objectives with
some constant probability $p$ per evaluation. We show that GSEMO fails badly on
every noisy fitness function as it tends to remove large parts of the
population indiscriminately. In contrast, NSGA-II is able to handle the noise
efficiently on LeadingOnesTrailingZeroes when $p<1/2$, as the
algorithm is able to preserve useful search points even in the presence of
noise. We identify a phase transition at $p=1/2$ where the expected time to
cover the Pareto front changes from polynomial to exponential. To our
knowledge, this is the first proof that NSGA-II can outperform GSEMO and the
first runtime analysis of NSGA-II in noisy optimisation.
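To make the setting concrete, here is a minimal Python sketch of the bi-objective LeadingOnesTrailingZeroes benchmark together with a posterior noise model in the spirit of the abstract: with probability $p$ per evaluation, a large disturbance is added to all objectives. The uniform noise distribution and its magnitude below are illustrative assumptions, not the paper's precise definitions.

```python
import random

def lotz(x):
    """LeadingOnesTrailingZeroes: maximise both the number of leading
    1-bits and the number of trailing 0-bits of the bit string x."""
    n = len(x)
    leading_ones = next((i for i, bit in enumerate(x) if bit == 0), n)
    trailing_zeroes = next((i for i, bit in enumerate(reversed(x)) if bit == 1), n)
    return leading_ones, trailing_zeroes

def noisy_lotz(x, p, magnitude=None):
    """Posterior noise: with probability p per evaluation, a large
    disturbance is added to ALL objectives. The uniform offset here is
    a hypothetical stand-in for 'large amounts of posterior noise'."""
    if magnitude is None:
        magnitude = 2 * len(x)  # large compared with the noiseless range [0, n]
    f1, f2 = lotz(x)
    if random.random() < p:
        delta = random.uniform(-magnitude, magnitude)
        f1, f2 = f1 + delta, f2 + delta
    return f1, f2

# Example: a single evaluation under noise probability p = 0.3.
print(noisy_lotz([1, 1, 0, 1, 0, 0], p=0.3))
```

The contrast drawn in the abstract can be read off this model: a GSEMO-style archive may discard every point that merely appears dominated after one misleading evaluation, whereas NSGA-II always retains a fixed number of search points per generation, which is what allows useful solutions to survive noise when $p<1/2$.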
Related papers
- Illustrating the Efficiency of Popular Evolutionary Multi-Objective Algorithms Using Runtime Analysis [3.748255320979002]
We show that popular evolutionary multi-objective (EMO) algorithms have the same performance guarantee as the simple (G)SEMO algorithm.
We prove that, while GSEMO requires at least $n^n$ expected fitness evaluations to optimise OneTrapZeroTrap, popular EMO algorithms NSGA-II, NSGA-III and SMS-EMOA, all enhanced with a mild diversity mechanism of avoiding genotype duplication, only require $O(n \log n)$ expected fitness evaluations.
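As a rough illustration of the duplicate-avoidance mechanism mentioned above, here is a sketch under the assumption that an offspring is simply rejected if its genotype already occurs in the population; the helper name is hypothetical, and truncating back to the target size would use the host algorithm's own survival selection.

```python
def merge_avoiding_duplicates(population, offspring):
    """Hypothetical helper: admit an offspring only if its genotype is
    not already present. Truncating the merged population back to size
    mu (rank/crowding in NSGA-II/NSGA-III, hypervolume in SMS-EMOA) is
    omitted here."""
    seen = {tuple(ind) for ind in population}
    merged = list(population)
    for child in offspring:
        genotype = tuple(child)
        if genotype not in seen:
            merged.append(child)
            seen.add(genotype)
    return merged
```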
arXiv Detail & Related papers (2024-05-22T12:09:00Z)
- Runtime Analyses of NSGA-III on Many-Objective Problems [10.955844285189372]
We present the first runtime analyses of NSGA-III on the popular many-objective benchmark problems mLOTZ, mOMM, and mCOCZ.
We show how the algorithm's parameters should be scaled with the problem dimension, the number of objectives and the fitness range.
To our knowledge, these are the first runtime analyses for NSGA-III for more than 3 objectives.
arXiv Detail & Related papers (2024-04-17T14:39:14Z)
- Already Moderate Population Sizes Provably Yield Strong Robustness to Noise [53.27802701790209]
We show that two evolutionary algorithms can tolerate constant noise probabilities without increasing the runtime on the OneMax benchmark.
Our results are based on a novel proof argument that the noiseless offspring can be seen as a biased uniform crossover between the parent and the noisy offspring.
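The crossover interpretation can be made concrete with a small sketch: each bit of the resulting string comes from the parent or from the noisy offspring independently, with some bias towards the parent. The bias value is an illustrative assumption; the proof argument referenced above only uses that the noiseless offspring has this distributional form.

```python
import random

def biased_uniform_crossover(parent, other, bias=0.9):
    """Biased uniform crossover: take each bit from `parent` with
    probability `bias`, otherwise from `other`."""
    return [p if random.random() < bias else o for p, o in zip(parent, other)]
```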
arXiv Detail & Related papers (2024-04-02T16:35:52Z)
- Larger Offspring Populations Help the $(1 + (\lambda, \lambda))$ Genetic Algorithm to Overcome the Noise [76.24156145566425]
Evolutionary algorithms are known to be robust to noise in the evaluation of the fitness.
We analyze to what extent the $(1 + (\lambda, \lambda))$ genetic algorithm is robust to noise.
arXiv Detail & Related papers (2023-05-08T08:49:01Z)
- A Mathematical Runtime Analysis of the Non-dominated Sorting Genetic Algorithm III (NSGA-III) [9.853329403413701]
The Non-dominated Sorting Genetic Algorithm II (NSGA-II) is the most prominent multi-objective evolutionary algorithm for real-world applications.
We provide the first mathematical runtime analysis of the NSGA-III, a refinement of the NSGA-II aimed at better handling more than two objectives.
arXiv Detail & Related papers (2022-11-15T15:10:36Z)
- Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise [64.85879194013407]
We prove the first high-probability results with logarithmic dependence on the confidence level for methods solving monotone and structured non-monotone variational inequality problems (VIPs).
Our results match the best-known ones in the light-tails case and are novel for structured non-monotone problems.
In addition, we numerically validate that the gradient noise of many practical formulations is heavy-tailed and show that clipping improves the performance of SEG/SGDA.
arXiv Detail & Related papers (2022-06-02T15:21:55Z)
- A First Runtime Analysis of the NSGA-II on a Multimodal Problem [17.049516028133958]
This work shows that the NSGA-II copes with the local optima of the OneJumpZeroJump problem at least as well as the global SEMO algorithm.
arXiv Detail & Related papers (2022-04-28T19:44:47Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Mathematical Runtime Analysis for the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) [16.904475483445452]
We show that runtime analyses are feasible also for the NSGA-II.
We prove that with a population size four times larger than the size of the Pareto front, the NSGA-II satisfies the same runtime guarantees as the SEMO and GSEMO algorithms.
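For intuition, this condition is easy to instantiate: on LeadingOnesTrailingZeroes over $n$-bit strings the Pareto front contains $n+1$ objective vectors, so a population size of $N \ge 4(n+1)$ suffices for the stated guarantees. A trivial sketch (the function name is hypothetical):

```python
def sufficient_population_size(front_size, factor=4):
    """Population size at least `factor` times the Pareto front size,
    as required by the analysis. For LOTZ on n bits the front has
    n + 1 points."""
    return factor * front_size

print(sufficient_population_size(100 + 1))  # n = 100 -> N = 404
```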
arXiv Detail & Related papers (2021-12-16T03:00:20Z)
- Misspecified Gaussian Process Bandit Optimization [59.30399661155574]
Kernelized bandit algorithms have shown strong empirical and theoretical performance for this problem.
We introduce a misspecified kernelized bandit setting where the unknown function can be $\epsilon$-uniformly approximated by a function with a bounded norm in some Reproducing Kernel Hilbert Space (RKHS).
We show that our algorithm achieves optimal dependence on $\epsilon$ with no prior knowledge of misspecification.
arXiv Detail & Related papers (2021-11-09T09:00:02Z)
- GEO: Enhancing Combinatorial Optimization with Classical and Quantum Generative Models [62.997667081978825]
We introduce a new framework that leverages machine learning models known as generative models to solve optimization problems.
We focus on a quantum-inspired version of GEO relying on tensor-network Born machines.
We show its superior performance when the goal is to find the best minimum given a fixed budget for the number of function calls.
arXiv Detail & Related papers (2021-01-15T18:18:38Z)