Evolutionary Algorithms Are Significantly More Robust to Noise When They Ignore It
- URL: http://arxiv.org/abs/2409.00306v1
- Date: Sat, 31 Aug 2024 00:10:10 GMT
- Title: Evolutionary Algorithms Are Significantly More Robust to Noise When They Ignore It
- Authors: Denis Antipov, Benjamin Doerr
- Abstract summary: We show that the need for re-evaluations may be overestimated and can, in fact, be detrimental.
This first analysis of an evolutionary algorithm solving a single-objective noisy problem without re-evaluations could indicate that such algorithms cope with noise much better than previously thought.
- Score: 10.165640083594573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Randomized search heuristics (RSHs) are generally believed to be robust to noise. However, almost all mathematical analyses of how RSHs cope with noisy access to the objective function assume that each solution is re-evaluated whenever it is compared to others. This is unfortunate, both because it wastes computational resources and because it requires the user to foresee that noise is present (as in a noise-free setting, one would never re-evaluate solutions). In this work, we show that the need for re-evaluations may be overestimated and can, in fact, be detrimental. For the classic benchmark problem of how the $(1+1)$ evolutionary algorithm optimizes the LeadingOnes benchmark, we show that without re-evaluations, noise rates up to a constant can be tolerated, much more than the $O(n^{-2} \log n)$ noise rates that can be tolerated when re-evaluating solutions. This first runtime analysis of an evolutionary algorithm solving a single-objective noisy problem without re-evaluations could indicate that such algorithms cope with noise much better than previously thought, and without the need to foresee the presence of noise.
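To make the role of re-evaluations concrete, the following is a minimal Python sketch of the $(1+1)$ evolutionary algorithm on LeadingOnes under noise, contrasting the two strategies discussed in the abstract. The one-bit prior noise model and all names here are illustrative assumptions, not necessarily the paper's exact setup.

```python
import random

def leading_ones(x):
    """Length of the prefix of consecutive one-bits."""
    count = 0
    for bit in x:
        if bit == 0:
            break
        count += 1
    return count

def noisy_eval(x, q):
    """One-bit prior noise (an assumed model): with probability q, a
    uniformly random bit is flipped before the fitness is computed."""
    if random.random() < q:
        i = random.randrange(len(x))
        x = x[:i] + [1 - x[i]] + x[i + 1:]
    return leading_ones(x)

def one_plus_one_ea(n, q, reevaluate, max_iters=10**7):
    """(1+1) EA on noisy LeadingOnes.

    reevaluate=False: the parent keeps the (possibly noisy) fitness value
    from the iteration in which it was accepted -- the noise is 'ignored'.
    reevaluate=True: the parent is evaluated anew in every comparison.
    Returns the number of iterations until the true optimum is reached.
    """
    parent = [random.randrange(2) for _ in range(n)]
    parent_fitness = noisy_eval(parent, q)
    for t in range(max_iters):
        if leading_ones(parent) == n:   # true fitness, used only to stop
            return t
        # standard bit-wise mutation: flip each bit independently w.p. 1/n
        child = [b ^ (random.random() < 1 / n) for b in parent]
        child_fitness = noisy_eval(child, q)
        if reevaluate:
            parent_fitness = noisy_eval(parent, q)
        if child_fitness >= parent_fitness:
            parent, parent_fitness = child, child_fitness
    return max_iters
```

Runs such as `one_plus_one_ea(100, 0.3, reevaluate=False)` versus the same call with `reevaluate=True` illustrate the dichotomy qualitatively; the sketch makes no claim about the paper's exact constants or noise model.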
Related papers
- Already Moderate Population Sizes Provably Yield Strong Robustness to Noise [53.27802701790209]
We show that two evolutionary algorithms can tolerate constant noise probabilities without increasing the runtime on the OneMax benchmark.
Our results are based on a novel proof argument that the noiseless offspring can be seen as a biased uniform crossover between the parent and the noisy offspring.
arXiv Detail & Related papers (2024-04-02T16:35:52Z)
- Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise [54.34628844260993]
Differentially private computation often begins with a bound on some $d$-dimensional statistic's sensitivity.
For pure differential privacy, the $K$-norm mechanism can improve on this approach using a norm tailored to the statistic's sensitivity space.
This paper solves both problems for the simple statistics of sum, count, and vote.
arXiv Detail & Related papers (2023-09-27T17:09:36Z)
- Runtime Analyses of Multi-Objective Evolutionary Algorithms in the Presence of Noise [7.421135890148154]
We conduct the first analysis of a classic benchmark in the presence of noise in the objective functions.
We prove that when bit-wise prior noise with rate $p \le \alpha/n$, $\alpha$ a suitable constant, is present, the simple evolutionary multi-objective optimizer (SEMO) without any adjustments to cope with noise finds the Pareto front of the OneMinMax benchmark in the same asymptotic time as in the noiseless case.
We prove that when all solutions are re-evaluated in each iteration, then any noise rate $p = \omega(\log(n)/n^2)$ leads to a super-polynomial runtime.
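To illustrate the setting of this related paper, here is a minimal Python sketch of the SEMO on OneMinMax under bit-wise prior noise, with every solution evaluated only once; the archiving rules follow the textbook SEMO, and all details are assumptions for illustration only.

```python
import random

def one_min_max(x):
    """Bi-objective OneMinMax: maximize (#zeros, #ones) simultaneously."""
    ones = sum(x)
    return (len(x) - ones, ones)

def weakly_dominates(f, g):
    return all(a >= b for a, b in zip(f, g))

def strictly_dominates(f, g):
    return weakly_dominates(f, g) and f != g

def semo(n, p, max_iters=10**7):
    """SEMO on OneMinMax with bit-wise prior noise of rate p; every
    solution is evaluated exactly once (no re-evaluations)."""
    def noisy_eval(x):
        # bit-wise prior noise: each bit flips independently w.p. p
        y = [b ^ (random.random() < p) for b in x]
        return one_min_max(y)

    x = tuple(random.randrange(2) for _ in range(n))
    pop = {x: noisy_eval(x)}           # archive: genotype -> stored fitness
    for t in range(max_iters):
        # stop once the true Pareto front (all ones-counts 0..n) is covered
        if len({sum(s) for s in pop}) == n + 1:
            return t
        parent = random.choice(list(pop))
        i = random.randrange(n)
        child = parent[:i] + (1 - parent[i],) + parent[i + 1:]  # one-bit flip
        cf = noisy_eval(child)
        if any(strictly_dominates(f, cf) for f in pop.values()):
            continue                   # child is dominated: discard it
        # keep only archive members not weakly dominated by the child
        pop = {s: f for s, f in pop.items() if not weakly_dominates(cf, f)}
        pop[child] = cf
    return max_iters
```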
arXiv Detail & Related papers (2023-05-17T14:48:06Z)
- Larger Offspring Populations Help the $(1 + (\lambda, \lambda))$ Genetic Algorithm to Overcome the Noise [76.24156145566425]
Evolutionary algorithms are known to be robust to noise in the evaluation of the fitness.
We analyze to what extent the $(1 + (\lambda,\lambda))$ genetic algorithm is robust to noise.
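For reference, here is a minimal Python sketch of the $(1 + (\lambda,\lambda))$ GA with its standard parameter coupling (mutation rate $p = \lambda/n$, crossover bias $c = 1/\lambda$); the noisy OneMax fitness used here is an illustrative assumption, not necessarily the noise model of the paper.

```python
import random

def one_max(x):
    return sum(x)

def noisy_one_max(x, q):
    """Assumed noise model: with probability q, one uniformly random bit
    is flipped before the fitness is computed."""
    if random.random() < q:
        i = random.randrange(len(x))
        x = x[:i] + [1 - x[i]] + x[i + 1:]
    return one_max(x)

def one_plus_ll_ga(n, lam, q, max_iters=10**6):
    """(1 + (lambda,lambda)) GA on noisy OneMax."""
    p = lam / n                        # mutation rate
    c = 1 / lam                        # crossover bias towards the mutant
    parent = [random.randrange(2) for _ in range(n)]
    # the parent keeps its stored (possibly noisy) fitness; whether to
    # re-evaluate it is exactly the design choice discussed above
    parent_fitness = noisy_one_max(parent, q)
    for t in range(max_iters):
        if one_max(parent) == n:       # true optimum reached
            return t
        # mutation phase: flip the same number ell of bits in each offspring
        ell = sum(random.random() < p for _ in range(n))  # ell ~ Bin(n, p)
        mutants = []
        for _ in range(lam):
            pos = random.sample(range(n), ell)
            m = list(parent)
            for i in pos:
                m[i] = 1 - m[i]
            mutants.append(m)
        best_mutant = max(mutants, key=lambda m: noisy_one_max(m, q))
        # crossover phase: take each bit from the mutant with probability c
        offspring = []
        for _ in range(lam):
            o = [mb if random.random() < c else pb
                 for pb, mb in zip(parent, best_mutant)]
            offspring.append(o)
        best, best_fitness = max(
            ((o, noisy_one_max(o, q)) for o in offspring),
            key=lambda pair: pair[1])
        if best_fitness >= parent_fitness:
            parent, parent_fitness = best, best_fitness
    return max_iters
```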
arXiv Detail & Related papers (2023-05-08T08:49:01Z)
- Higher degree sum-of-squares relaxations robust against oblivious outliers [14.58610686291355]
We consider estimation models of the form $Y = X^* + N$, where $X^*$ is some $m$-dimensional signal we wish to recover.
We introduce a family of algorithms that under mild assumptions recover the signal $X^*$ in all estimation problems for which there exists a sum-of-squares algorithm.
arXiv Detail & Related papers (2022-11-14T13:09:12Z)
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from the common assumption that the noise distribution should match the data distribution can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- ReLU Regression with Massart Noise [52.10842036932169]
We study the fundamental problem of ReLU regression, where the goal is to fit Rectified Linear Units (ReLUs) to data.
We focus on ReLU regression in the Massart noise model, a natural and well-studied semi-random noise model.
We develop an efficient algorithm that achieves exact parameter recovery in this model.
arXiv Detail & Related papers (2021-09-10T02:13:22Z)
- Learning based signal detection for MIMO systems with unknown noise statistics [84.02122699723536]
This paper aims to devise a generalized maximum likelihood (ML) estimator to robustly detect signals with unknown noise statistics.
In practice, there is little or even no statistical knowledge about the system noise, which in many cases is non-Gaussian, impulsive, and not analyzable.
Our framework is driven by an unsupervised learning approach, where only the noise samples are required.
arXiv Detail & Related papers (2021-01-21T04:48:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.