Runtime Analyses of Multi-Objective Evolutionary Algorithms in the
Presence of Noise
- URL: http://arxiv.org/abs/2305.10259v3
- Date: Tue, 30 May 2023 15:45:51 GMT
- Title: Runtime Analyses of Multi-Objective Evolutionary Algorithms in the
Presence of Noise
- Authors: Matthieu Dinot, Benjamin Doerr, Ulysse Hennebelle, Sebastian Will
- Abstract summary: We conduct the first mathematical runtime analysis of a simple MOEA on a classic benchmark in the presence of noise in the objective functions.
We prove that when bit-wise prior noise with rate $p \le \alpha/n$, $\alpha$ a suitable constant, is present, the simple evolutionary multi-objective optimizer (SEMO) finds the Pareto front of the OneMinMax benchmark in time $O(n^2\log n)$.
We prove that when all solutions are reevaluated in each iteration, then any noise rate $p = \omega(\log(n)/n^2)$ leads to a super-polynomial runtime.
- Score: 7.421135890148154
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In single-objective optimization, it is well known that evolutionary
algorithms, even without further adjustments, can tolerate a certain amount of
noise in the evaluation of the objective function. In contrast, this question
is not at all understood for multi-objective optimization.
In this work, we conduct the first mathematical runtime analysis of a simple
multi-objective evolutionary algorithm (MOEA) on a classic benchmark in the
presence of noise in the objective functions. We prove that when bit-wise prior
noise with rate $p \le \alpha/n$, $\alpha$ a suitable constant, is present, the
\emph{simple evolutionary multi-objective optimizer} (SEMO) without any
adjustments to cope with noise finds the Pareto front of the OneMinMax
benchmark in time $O(n^2\log n)$, just as in the case without noise. Given that
the problem here is to arrive at a population consisting of $n+1$ individuals
witnessing the Pareto front, this is a surprisingly strong robustness to noise
(comparably simple evolutionary algorithms cannot optimize the single-objective
OneMax problem in polynomial time when $p = \omega(\log(n)/n)$). Our proofs
suggest that the strong robustness of the MOEA stems from its implicit
diversity mechanism designed to enable it to compute a population covering the
whole Pareto front.
Interestingly, this result only holds when the objective value of a solution
is determined only once and the algorithm from that point on works with this,
possibly noisy, objective value. We prove that when all solutions are
reevaluated in each iteration, then any noise rate $p = \omega(\log(n)/n^2)$
leads to a super-polynomial runtime. This is very different from
single-objective optimization, where it is generally preferred to reevaluate
solutions whenever their fitness is important and where examples are known such
that not reevaluating solutions can lead to catastrophic performance losses.
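As an illustration (not the authors' code), the setting above can be sketched in a few lines of Python: SEMO with one-bit mutation on OneMinMax, where bit-wise prior noise flips each bit independently with probability $p$ before evaluation, and each solution is evaluated exactly once, i.e., the possibly noisy objective value is kept forever. The function names and the iteration budget are illustrative choices, not from the paper.

```python
import random

def one_min_max(x):
    """OneMinMax: simultaneously maximize the number of 0-bits and of 1-bits."""
    ones = sum(x)
    return (len(x) - ones, ones)

def noisy_eval(x, p):
    """Bit-wise prior noise: each bit is flipped independently with probability p
    before evaluation; the stored solution itself is unchanged."""
    return one_min_max([b ^ (random.random() < p) for b in x])

def dominates(u, v):
    """Pareto dominance for maximization: at least as good in both objectives,
    strictly better in at least one."""
    return u[0] >= v[0] and u[1] >= v[1] and u != v

def semo(n, p, max_iters=500_000):
    """SEMO without re-evaluations: every solution is evaluated exactly once and
    that (possibly noisy) objective value is used from then on.
    Returns the iteration at which the population witnesses the full Pareto
    front (all n+1 distinct numbers of 1-bits), or None on timeout."""
    x = [random.randint(0, 1) for _ in range(n)]
    pop = [(x, noisy_eval(x, p))]               # list of (solution, stored value)
    for t in range(1, max_iters + 1):
        parent, _ = random.choice(pop)          # uniform parent selection
        child = parent[:]
        child[random.randrange(n)] ^= 1         # flip one uniformly random bit
        val = noisy_eval(child, p)              # single evaluation, never redone
        if any(dominates(v, val) or v == val for _, v in pop):
            continue                            # offspring rejected
        pop = [(s, v) for s, v in pop if not dominates(val, v)]
        pop.append((child, val))
        if {sum(s) for s, _ in pop} == set(range(n + 1)):
            return t
    return None
```

The reevaluation variant discussed above would instead call `noisy_eval` afresh for every population member in each iteration; the paper shows that this seemingly harmless change destroys polynomial runtime already for $p = \omega(\log(n)/n^2)$.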
Related papers
- Evolutionary Algorithms Are Significantly More Robust to Noise When They Ignore It [10.165640083594573]
We show that the need for re-evaluations could be overestimated, and that re-evaluations can in fact be detrimental.
This first analysis of an evolutionary algorithm solving a single-objective noisy problem without re-evaluations could indicate that such algorithms cope with noise much better than previously thought.
arXiv Detail & Related papers (2024-08-31T00:10:10Z) - Already Moderate Population Sizes Provably Yield Strong Robustness to Noise [53.27802701790209]
We show that two evolutionary algorithms can tolerate constant noise probabilities without increasing the runtime on the OneMax benchmark.
Our results are based on a novel proof argument that the noiseless offspring can be seen as a biased uniform crossover between the parent and the noisy offspring.
arXiv Detail & Related papers (2024-04-02T16:35:52Z) - Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise [54.34628844260993]
Differentially private computation often begins with a bound on some $d$-dimensional statistic's sensitivity.
For pure differential privacy, the $K$-norm mechanism can improve on this approach using a norm tailored to the statistic's sensitivity space.
This paper solves both problems for the simple statistics of sum, count, and vote.
arXiv Detail & Related papers (2023-09-27T17:09:36Z) - The First Proven Performance Guarantees for the Non-Dominated Sorting
Genetic Algorithm II (NSGA-II) on a Combinatorial Optimization Problem [6.793248433673384]
The Non-dominated Sorting Genetic Algorithm-II (NSGA-II) is one of the most prominent algorithms to solve multi-objective optimization problems.
We give the first proven performance guarantees for the NSGA-II on a classic optimization problem, the NP-complete bi-objective minimum spanning tree problem.
arXiv Detail & Related papers (2023-05-22T19:59:19Z) - Larger Offspring Populations Help the $(1 + (\lambda, \lambda))$ Genetic
Algorithm to Overcome the Noise [76.24156145566425]
Evolutionary algorithms are known to be robust to noise in the evaluation of the fitness.
We analyze to what extent the $(1+(\lambda,\lambda))$ genetic algorithm is robust to noise.
arXiv Detail & Related papers (2023-05-08T08:49:01Z) - Stochastic Nonsmooth Convex Optimization with Heavy-Tailed Noises:
High-Probability Bound, In-Expectation Rate and Initial Distance Adaptation [22.758674468435302]
In the heavy-tailed noise regime, the difference between the stochastic gradient and the true gradient is assumed to have a finite $p$-th moment.
This paper provides a comprehensive analysis of nonsmooth convex optimization with heavy-tailed noises.
arXiv Detail & Related papers (2023-03-22T03:05:28Z) - The $(1+(\lambda,\lambda))$ Global SEMO Algorithm [8.34061303235504]
We show that the $(1+(\lambda,\lambda))$ genetic algorithm can be transported to multi-objective evolutionary computation.
We define the $(1+(\lambda,\lambda))$ global SEMO algorithm, a variant of the classic global SEMO algorithm, and prove that it optimizes the OneMinMax benchmark asymptotically faster than the global SEMO.
arXiv Detail & Related papers (2022-10-07T15:18:32Z) - Multi-block-Single-probe Variance Reduced Estimator for Coupled
Compositional Optimization [49.58290066287418]
We propose a novel method named Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of coupled compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on the strong convexity parameter.
arXiv Detail & Related papers (2022-07-18T12:03:26Z) - Minimax Optimization with Smooth Algorithmic Adversaries [59.47122537182611]
We propose a new algorithm for the min-player against smooth algorithms deployed by an adversary.
Our algorithm is guaranteed to make monotonic progress (having no limit cycles) and to find an appropriate stationary point in a polynomial number of iterations.
arXiv Detail & Related papers (2021-06-02T22:03:36Z) - A Two-Timescale Framework for Bilevel Optimization: Complexity Analysis
and Application to Actor-Critic [142.1492359556374]
Bilevel optimization is a class of problems which exhibit a two-level structure.
We propose a two-timescale approximation (TTSA) algorithm for tackling such a bilevel problem.
We show that a two-timescale natural actor-critic policy optimization algorithm can be viewed as a special case of our TTSA framework.
arXiv Detail & Related papers (2020-07-10T05:20:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.