Optimal Static Mutation Strength Distributions for the $(1+\lambda)$
Evolutionary Algorithm on OneMax
- URL: http://arxiv.org/abs/2102.04944v2
- Date: Fri, 4 Jun 2021 01:58:28 GMT
- Title: Optimal Static Mutation Strength Distributions for the $(1+\lambda)$
Evolutionary Algorithm on OneMax
- Authors: Maxim Buzdalov and Carola Doerr
- Abstract summary: We show that, for large enough population sizes, such optimal distributions may be surprisingly complicated and counter-intuitive.
- Score: 1.0965065178451106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most evolutionary algorithms have parameters, which allow great
flexibility in controlling their behavior and adapting them to new problems. To
achieve the best performance, it is often necessary to control some of the
parameters during optimization, which has given rise to various parameter
control methods. In recent
works, however, similar advantages have been shown, and even proven, for
sampling parameter values from certain, often heavy-tailed, fixed
distributions. This produced a family of algorithms currently known as "fast
evolution strategies" and "fast genetic algorithms".
However, little is known so far about the influence of these
distributions on the performance of evolutionary algorithms, and about the
relationships between (dynamic) parameter control and (static) parameter
sampling. We contribute to the body of knowledge by presenting, for the first
time, an algorithm that computes the optimal static distributions, which
describe the mutation operator used in the well-known simple $(1+\lambda)$
evolutionary algorithm on the classic benchmark problem OneMax. We show that, for
large enough population sizes, such optimal distributions may be surprisingly
complicated and counter-intuitive. We investigate certain properties of these
distributions, and also evaluate the performance regrets of the $(1+\lambda)$
evolutionary algorithm using commonly used mutation distributions.
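The setting studied in the abstract can be illustrated with a minimal sketch: a $(1+\lambda)$ EA on OneMax where each offspring flips exactly $k$ bits, with $k$ drawn from a fixed (static) distribution over mutation strengths. The power-law ("fast GA") distribution with exponent `beta=1.5` is used here only as one example of such a static choice; it is not the optimal distribution computed in the paper, and all function names are illustrative.

```python
import random

def onemax(x):
    """OneMax fitness: the number of one-bits in x."""
    return sum(x)

def sample_strength(rng, n, beta=1.5):
    """Sample a mutation strength k from a static power-law
    distribution on {1, ..., n//2} with exponent beta.
    This heavy-tailed 'fast GA' choice is only one example
    of a static distribution, not the paper's optimal one."""
    ks = list(range(1, n // 2 + 1))
    weights = [k ** (-beta) for k in ks]
    return rng.choices(ks, weights=weights)[0]

def one_plus_lambda_ea(n, lam, max_evals=100_000, beta=1.5, seed=0):
    """(1+lambda) EA on OneMax: each of the lam offspring flips
    exactly k distinct bits, with k drawn independently from the
    static distribution; the best offspring replaces the parent
    if it is at least as good (elitist selection)."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    fitness = onemax(parent)
    evals = 0
    while fitness < n and evals < max_evals:
        best, best_fit = None, -1
        for _ in range(lam):
            k = sample_strength(rng, n, beta)
            child = parent[:]
            for i in rng.sample(range(n), k):  # flip exactly k distinct bits
                child[i] ^= 1
            f = onemax(child)
            evals += 1
            if f > best_fit:
                best, best_fit = child, f
        if best_fit >= fitness:
            parent, fitness = best, best_fit
    return fitness, evals
```

Swapping `sample_strength` for a different distribution over $\{1, \dots, n\}$ is all it takes to compare static choices empirically, which is the kind of regret evaluation the abstract refers to.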
Related papers
- Already Moderate Population Sizes Provably Yield Strong Robustness to Noise [53.27802701790209]
We show that two evolutionary algorithms can tolerate constant noise probabilities without increasing the runtime on the OneMax benchmark.
Our results are based on a novel proof argument that the noiseless offspring can be seen as a biased uniform crossover between the parent and the noisy offspring.
arXiv Detail & Related papers (2024-04-02T16:35:52Z) - Gradient-free optimization of highly smooth functions: improved analysis
and a new algorithm [87.22224691317766]
This work studies problems with zero-order noisy oracle information under the assumption that the objective function is highly smooth.
We consider two kinds of zero-order projected gradient descent algorithms.
arXiv Detail & Related papers (2023-06-03T17:05:13Z) - Fast Computation of Optimal Transport via Entropy-Regularized Extragradient Methods [75.34939761152587]
Efficient computation of the optimal transport distance between two distributions is a core subroutine that empowers various applications.
This paper develops a scalable first-order optimization-based method that computes optimal transport to within $\varepsilon$ additive accuracy.
arXiv Detail & Related papers (2023-01-30T15:46:39Z) - Massively Parallel Genetic Optimization through Asynchronous Propagation
of Populations [50.591267188664666]
Propulate is an evolutionary optimization algorithm and software package for global optimization.
We provide an MPI-based implementation of our algorithm, which features variants of selection, mutation, crossover, and migration.
We find that Propulate is up to three orders of magnitude faster without sacrificing solution accuracy.
arXiv Detail & Related papers (2023-01-20T18:17:34Z) - Adaptive Random Quantum Eigensolver [0.0]
We introduce a general method to parametrize and optimize the probability density function of a random number generator.
Our optimization is based on two figures of merit: learning speed and learning accuracy.
arXiv Detail & Related papers (2021-06-28T12:01:05Z) - A Study of the Fundamental Parameters of Particle Swarm Optimizers [0.0]
Population-based algorithms can handle different optimization problems with few or no adaptations.
Their main drawbacks consist of their comparatively higher computational cost and difficulty in handling equality constraints.
This paper deals with the effect of the settings of the parameters of the particles' velocity update equation on the behaviour of the system.
arXiv Detail & Related papers (2021-01-25T01:18:34Z) - Evolutionary Variational Optimization of Generative Models [0.0]
We combine two popular optimization approaches to derive learning algorithms for generative models: variational optimization and evolutionary algorithms.
We show that evolutionary algorithms can effectively and efficiently optimize the variational bound.
In the category of "zero-shot" learning, we observed the evolutionary variational algorithm to significantly improve the state-of-the-art in many benchmark settings.
arXiv Detail & Related papers (2020-12-22T19:06:33Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - Convergence of adaptive algorithms for weakly convex constrained
optimization [59.36386973876765]
We prove the $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with a mini-batch size of $1$, constant first- and second-order moment parameters, and possibly smooth optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z) - Fast Mutation in Crossover-based Algorithms [8.34061303235504]
A heavy-tailed mutation operator was proposed by Doerr, Le, Makhmara and Nguyen (GECCO).
An empirical study shows the effectiveness of the fast mutation also on random satisfiable Max-3SAT instances.
arXiv Detail & Related papers (2020-04-14T14:16:42Z) - Self-Adjusting Evolutionary Algorithms for Multimodal Optimization [0.0]
Self-adjusting and self-adaptive mechanisms can provably outperform static settings in evolutionary algorithms for binary search spaces.
We suggest a mechanism called stagnation detection that can be added as a module to existing evolutionary algorithms.
arXiv Detail & Related papers (2020-04-07T11:02:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.