Exponential Upper Bounds for the Runtime of Randomized Search Heuristics
- URL: http://arxiv.org/abs/2004.05733v4
- Date: Sat, 9 Oct 2021 17:04:44 GMT
- Title: Exponential Upper Bounds for the Runtime of Randomized Search Heuristics
- Authors: Benjamin Doerr
- Abstract summary: We show that each of randomized local search, the Metropolis algorithm, simulated annealing, and the (1+1) evolutionary algorithm can optimize any pseudo-Boolean weakly monotonic function.
We also prove an exponential upper bound for the runtime of the mutation-based version of the simple genetic algorithm on the OneMax benchmark.
- Score: 9.853329403413701
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We argue that proven exponential upper bounds on runtimes, an established
area in classic algorithms, are interesting also in heuristic search and we
prove several such results. We show that any of the algorithms randomized local
search, Metropolis algorithm, simulated annealing, and (1+1) evolutionary
algorithm can optimize any pseudo-Boolean weakly monotonic function under a
large set of noise assumptions in a runtime that is at most exponential in the
problem dimension~$n$. This drastically extends a previous such result, limited
to the (1+1) EA, the LeadingOnes function, and one-bit or bit-wise prior noise
with noise probability at most $1/2$, and at the same time simplifies its
proof. With the same general argument, among others, we also derive a
sub-exponential upper bound for the runtime of the $(1,\lambda)$ evolutionary
algorithm on the OneMax problem when the offspring population size $\lambda$ is
logarithmic, but below the efficiency threshold. To show that our approach can
also deal with non-trivial parent population sizes, we prove an exponential
upper bound for the runtime of the mutation-based version of the simple genetic
algorithm on the OneMax benchmark, matching a known exponential lower bound.
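As an illustrative aside (not code from the paper), the (1+1) evolutionary algorithm analyzed in the abstract can be sketched as follows; the OneMax fitness, the standard bit-flip rate $1/n$, and the iteration cap are textbook choices assumed here for illustration.

```python
import random

def one_max(x):
    # OneMax fitness: the number of one-bits in the bit string.
    return sum(x)

def one_plus_one_ea(n, seed=0, max_iters=100_000):
    """(1+1) EA: flip each bit of the parent independently with
    probability 1/n; keep the offspring if it is at least as fit.
    Returns the number of iterations until OneMax was maximized."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    for t in range(1, max_iters + 1):
        offspring = [1 - b if rng.random() < 1.0 / n else b
                     for b in parent]
        if one_max(offspring) >= one_max(parent):
            parent = offspring
        if one_max(parent) == n:
            return t
    return None  # optimum not found within the iteration budget

print(one_plus_one_ea(20))
```

On OneMax this elitist scheme finds the optimum quickly; the exponential upper bounds of the paper concern much harder settings, e.g. noisy weakly monotonic functions.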
Related papers
- Already Moderate Population Sizes Provably Yield Strong Robustness to Noise [53.27802701790209]
We show that two evolutionary algorithms can tolerate constant noise probabilities without increasing the runtime on the OneMax benchmark.
Our results are based on a novel proof argument that the noiseless offspring can be seen as a biased uniform crossover between the parent and the noisy offspring.
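As an illustrative sketch (the operator interface and the bias parameter are assumptions, not taken from the paper), a biased uniform crossover between two bit strings can be written as:

```python
import random

def biased_uniform_crossover(parent, other, p, seed=0):
    """Biased uniform crossover: take each bit from `parent` with
    probability p and from `other` otherwise."""
    rng = random.Random(seed)
    return [a if rng.random() < p else b for a, b in zip(parent, other)]

# With p = 1 the result is the parent; with p = 0 it is the other string.
print(biased_uniform_crossover([1, 1, 1, 1], [0, 0, 0, 0], 0.5))
```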
arXiv Detail & Related papers (2024-04-02T16:35:52Z) - Fast Minimization of Expected Logarithmic Loss via Stochastic Dual Averaging [8.990961435218544]
We propose a first-order algorithm named $B$-sample dual averaging with the logarithmic barrier.
For the Poisson inverse problem, our algorithm attains an $\varepsilon$-optimal solution in $\tilde{O}(d^3/\varepsilon^2)$ time.
When computing the maximum-likelihood estimate for quantum state tomography, our algorithm yields an $\varepsilon$-optimal solution in $\tilde{O}(d^3/\varepsilon^2)$ time.
arXiv Detail & Related papers (2023-11-05T03:33:44Z) - Larger Offspring Populations Help the $(1 + (\lambda, \lambda))$ Genetic Algorithm to Overcome the Noise [76.24156145566425]
Evolutionary algorithms are known to be robust to noise in the evaluation of the fitness.
We analyze to what extent the $(1 + (\lambda, \lambda))$ genetic algorithm is robust to noise.
arXiv Detail & Related papers (2023-05-08T08:49:01Z) - Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee [86.05440220344755]
We propose and analyze inexact regularized Newton-type methods for finding a global saddle point of convex-concave unconstrained min-max optimization problems.
We show that the proposed methods generate iterates that remain within a bounded set and that the iterations converge to an $\epsilon$-saddle point within $O(\epsilon^{-2/3})$ iterations in terms of a restricted gap function.
arXiv Detail & Related papers (2022-10-23T21:24:37Z) - Gaussian Process Bandit Optimization with Few Batches [49.896920704012395]
We introduce a batch algorithm inspired by finite-arm bandit algorithms.
We show that it achieves the cumulative regret upper bound $O^\ast(\sqrt{T\gamma_T})$ using $O(\log\log T)$ batches within time horizon $T$.
In addition, we propose a modified version of our algorithm, and characterize how the regret is impacted by the number of batches.
arXiv Detail & Related papers (2021-10-15T00:54:04Z) - Classical algorithms and quantum limitations for maximum cut on high-girth graphs [6.262125516926276]
We show that every (quantum or classical) one-local algorithm achieves on $D$-regular graphs of girth $> 5$ a maximum cut of at most $1/2 + C/\sqrt{D}$ for $C = 1/\sqrt{2} \approx 0.7071$.
We show that there is a classical $k$-local algorithm that achieves a value of $1/2 + C/\sqrt{D} - O(1/\sqrt{k})$ for $D$-regular graphs of girth $> 2k+1$.
arXiv Detail & Related papers (2021-06-10T16:28:23Z) - ROOT-SGD: Sharp Nonasymptotics and Near-Optimal Asymptotics in a Single Algorithm [71.13558000599839]
We study the problem of solving strongly convex and smooth unconstrained optimization problems using first-order algorithms.
We devise a novel algorithm, referred to as Recursive One-Over-T SGD (ROOT-SGD), based on an easily implementable averaging of past gradients.
We prove that it simultaneously achieves state-of-the-art performance in both a finite-sample, nonasymptotic sense and an asymptotic sense.
arXiv Detail & Related papers (2020-08-28T14:46:56Z) - Exploiting Higher Order Smoothness in Derivative-free Optimization and Continuous Bandits [99.70167985955352]
We study the problem of zero-order optimization of a strongly convex function.
We consider a randomized approximation of the projected gradient descent algorithm.
Our results imply that the zero-order algorithm is nearly optimal in terms of sample complexity and the problem parameters.
arXiv Detail & Related papers (2020-06-14T10:42:23Z) - Lower Bounds for Non-Elitist Evolutionary Algorithms via Negative Multiplicative Drift [9.853329403413701]
We show that a simple negative drift theorem for multiplicative drift scenarios can simplify existing analyses.
We discuss in more detail Lehre's (PPSN 2010) "negative drift in populations" method, one of the most general tools to prove lower bounds on the runtime of non-elitist mutation-based evolutionary algorithms.
arXiv Detail & Related papers (2020-05-02T15:10:09Z)
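As a closing illustrative aside (not from any paper above), the multiplicative drift theorem that underlies several of these runtime analyses can be checked numerically; the process, its step distribution, and the seed are assumptions chosen for this sketch.

```python
import math
import random

def hitting_time(x0, delta, seed=1):
    """Simulate a process whose expected per-step decrease is delta * x
    (multiplicative drift). The multiplicative drift theorem predicts an
    expected hitting time of about (1 + ln(x0)) / delta to drop below 1."""
    rng = random.Random(seed)
    x, t = float(x0), 0
    while x >= 1.0:
        x -= rng.uniform(0.0, 2.0 * delta) * x  # expected decrease: delta * x
        t += 1
    return t

predicted = (1 + math.log(1000)) / 0.1  # roughly 79 steps
print(hitting_time(1000, 0.1), predicted)
```

Negative drift results, such as the one in the last entry, cover the opposite regime: when the process drifts away from the target, the hitting time becomes exponential.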
This list is automatically generated from the titles and abstracts of the papers in this site.