Self-adaptation in non-Elitist Evolutionary Algorithms on Discrete
Problems with Unknown Structure
- URL: http://arxiv.org/abs/2004.00327v1
- Date: Wed, 1 Apr 2020 10:35:45 GMT
- Title: Self-adaptation in non-Elitist Evolutionary Algorithms on Discrete
Problems with Unknown Structure
- Authors: Brendan Case and Per Kristian Lehre
- Abstract summary: A key challenge to make effective use of evolutionary algorithms is to choose appropriate settings for their parameters.
Non-deterministic parameter control mechanisms adjust parameters using information obtained from the evolutionary process.
Self-adaptation is a popular parameter control mechanism in evolutionary strategies.
- Score: 2.28438857884398
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A key challenge to make effective use of evolutionary algorithms is to choose
appropriate settings for their parameters. However, the appropriate parameter
setting generally depends on the structure of the optimisation problem, which
is often unknown to the user. Non-deterministic parameter control mechanisms
adjust parameters using information obtained from the evolutionary process.
Self-adaptation -- where parameter settings are encoded in the chromosomes of
individuals and evolve through mutation and crossover -- is a popular parameter
control mechanism in evolutionary strategies. However, there is little
theoretical evidence that self-adaptation is effective, and self-adaptation has
largely been ignored by the discrete evolutionary computation community.
Here we show through a theoretical runtime analysis that a non-elitist,
discrete evolutionary algorithm which self-adapts its mutation rate not only
outperforms EAs which use static mutation rates on LeadingOnes, but also
improves asymptotically on an EA using a state-of-the-art control mechanism.
The structure of this problem depends on a parameter $k$, which is \emph{a
priori} unknown to the algorithm, and which is needed to appropriately set a
fixed mutation rate. The self-adaptive EA achieves the same asymptotic runtime
as if this parameter were known to the algorithm beforehand, which is an
asymptotic speedup for this problem compared to all other EAs previously
studied. An experimental study of how the mutation rates evolve shows that they
respond adequately to a diverse range of problem structures.
These results suggest that self-adaptation should be adopted more broadly as
a parameter control mechanism in discrete, non-elitist evolutionary algorithms.
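To make the mechanism concrete, the sketch below shows a minimal non-elitist (mu, lambda)-style EA on LeadingOnes in which each individual carries its own mutation rate, and that rate is perturbed before being applied to the offspring. The multiplicative update factor, population sizes, and clamping bounds are illustrative assumptions; this is a sketch of the general self-adaptation idea, not the exact algorithm analysed in the paper.

```python
import random

N = 100          # bit-string length (illustrative)
LAMBDA = 200     # offspring population size (illustrative)
MU = 50          # number of parents selected each generation (illustrative)
A = 1.2          # rate-adaptation factor (illustrative)

def leading_ones(x):
    """Number of consecutive 1-bits from the left."""
    count = 0
    for bit in x:
        if bit == 1:
            count += 1
        else:
            break
    return count

def mutate(parent_bits, parent_rate):
    # Self-adaptation: perturb the encoded mutation rate first ...
    rate = parent_rate * A if random.random() < 0.5 else parent_rate / A
    rate = min(max(rate, 1.0 / N), 0.5)          # keep the rate in a sane range
    # ... then flip each bit independently with the perturbed rate.
    child_bits = [b ^ (random.random() < rate) for b in parent_bits]
    return child_bits, rate

def self_adaptive_ea(generations=500):
    # Each individual is a pair (bit-string, its own mutation rate).
    population = [([random.randint(0, 1) for _ in range(N)], 1.0 / N)
                  for _ in range(LAMBDA)]
    for _ in range(generations):
        # Non-elitist selection: keep the MU best, then discard everything.
        population.sort(key=lambda ind: leading_ones(ind[0]), reverse=True)
        parents = population[:MU]
        population = [mutate(*random.choice(parents)) for _ in range(LAMBDA)]
    best = max(population, key=lambda ind: leading_ones(ind[0]))
    return leading_ones(best[0]), best[1]

if __name__ == "__main__":
    print(self_adaptive_ea())
```

The point of the sketch is that the rate travels with the individual: selection acts on fitness alone, yet individuals carrying unsuitable rates tend to produce worse offspring and so are indirectly selected against.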
Related papers
- Effective Adaptive Mutation Rates for Program Synthesis [3.2228025627337864]
Problem-solving performance of evolutionary algorithms depends on mutation rates.
We propose an adaptive bandit-based mutation scheme that removes the need to specify a mutation rate.
Results on software synthesis and symbolic regression problems validate the effectiveness of our approach.
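One way to realise a bandit-based mutation scheme, sketched below on a toy OneMax problem, is to treat a fixed set of candidate mutation rates as arms of a UCB1 bandit and reward an arm whenever the step it controlled improved fitness. The candidate rates, the UCB1 rule, and the reward definition are assumptions for illustration and may differ from the scheme proposed in the paper.

```python
import math
import random

N = 50
CANDIDATE_RATES = [1.0 / N, 2.0 / N, 4.0 / N, 8.0 / N]   # bandit arms (illustrative)

def onemax(x):
    return sum(x)

class RateBandit:
    """UCB1 bandit over a fixed set of mutation rates (illustrative sketch)."""
    def __init__(self, rates):
        self.rates = rates
        self.pulls = [0] * len(rates)
        self.mean = [0.0] * len(rates)
        self.total = 0

    def select(self):
        for i, n in enumerate(self.pulls):
            if n == 0:                       # play every arm once first
                return i
        ucb = [self.mean[i] + math.sqrt(2 * math.log(self.total) / self.pulls[i])
               for i in range(len(self.rates))]
        return ucb.index(max(ucb))

    def update(self, arm, reward):
        self.pulls[arm] += 1
        self.total += 1
        self.mean[arm] += (reward - self.mean[arm]) / self.pulls[arm]

# A toy (1+1) EA on OneMax whose mutation rate is chosen by the bandit each step.
bandit = RateBandit(CANDIDATE_RATES)
x = [random.randint(0, 1) for _ in range(N)]
for step in range(2000):
    arm = bandit.select()
    rate = bandit.rates[arm]
    y = [b ^ (random.random() < rate) for b in x]
    improvement = onemax(y) - onemax(x)
    bandit.update(arm, reward=max(0, improvement))
    if improvement >= 0:
        x = y
print(onemax(x), bandit.pulls)
```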
arXiv Detail & Related papers (2024-06-23T00:56:37Z) - Benchmarking Differential Evolution on a Quantum Simulator [0.0]
Differential Evolution (DE) can be used to compute the minima of functions such as the Rastrigin function and the Rosenbrock function.
This work studies the result of applying DE to these functions with candidate individuals generated using classical (Turing-model) computation.
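For reference, a bare-bones classical DE/rand/1/bin loop on the Rastrigin function might look like the following sketch; the population size, F, CR, and bounds are illustrative choices, not the settings used in the benchmark.

```python
import math
import random

DIM, NP, F, CR = 5, 40, 0.8, 0.9   # illustrative parameter choices

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def de_rand_1_bin(generations=300, bounds=(-5.12, 5.12)):
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(DIM)] for _ in range(NP)]
    fit = [rastrigin(ind) for ind in pop]
    for _ in range(generations):
        for i in range(NP):
            # Pick three distinct individuals different from i.
            a, b, c = random.sample([j for j in range(NP) if j != i], 3)
            j_rand = random.randrange(DIM)          # guarantees at least one crossed gene
            trial = []
            for j in range(DIM):
                if random.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])   # differential mutation
                else:
                    v = pop[i][j]
                trial.append(min(max(v, lo), hi))                 # clip to the bounds
            f_trial = rastrigin(trial)
            if f_trial <= fit[i]:                                 # greedy replacement
                pop[i], fit[i] = trial, f_trial
    return min(fit)

print(de_rand_1_bin())
```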
arXiv Detail & Related papers (2023-11-06T14:27:00Z) - Phylogeny-informed fitness estimation [58.720142291102135]
We propose phylogeny-informed fitness estimation, which exploits a population's phylogeny to estimate fitness evaluations.
Our results indicate that phylogeny-informed fitness estimation can mitigate the drawbacks of down-sampled lexicase.
This work serves as an initial step toward improving evolutionary algorithms by exploiting runtime phylogenetic analysis.
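A minimal sketch of the idea, assuming that unevaluated individuals can borrow the fitness of their nearest evaluated ancestor, is shown below: only a random subsample is truly evaluated each generation, and everyone else is scored by walking up the ancestry tree. The data structures, subsampling rule, and toy objective are illustrative assumptions rather than the paper's exact estimator.

```python
import random

class Individual:
    """Node in a simple phylogeny: each individual remembers its parent."""
    def __init__(self, genome, parent=None):
        self.genome = genome
        self.parent = parent
        self.fitness = None       # filled in only if this individual is evaluated

def true_fitness(genome):
    return sum(genome)            # toy objective (OneMax) for illustration

def estimated_fitness(ind):
    # Walk up the phylogeny until an ancestor with a real evaluation is found.
    node = ind
    while node is not None:
        if node.fitness is not None:
            return node.fitness
        node = node.parent
    return 0.0                    # no evaluated ancestor: fall back to a default

def one_generation(population, eval_fraction=0.25):
    # Evaluate only a random subsample; everyone else inherits an estimate.
    sample = random.sample(population, max(1, int(eval_fraction * len(population))))
    for ind in sample:
        ind.fitness = true_fitness(ind.genome)
    scored = sorted(population, key=estimated_fitness, reverse=True)
    parents = scored[: len(population) // 2]
    # Offspring record their parent, extending the phylogeny.
    children = []
    for _ in range(len(population)):
        p = random.choice(parents)
        child_genome = [b ^ (random.random() < 1.0 / len(p.genome)) for b in p.genome]
        children.append(Individual(child_genome, parent=p))
    return children

population = [Individual([random.randint(0, 1) for _ in range(20)]) for _ in range(40)]
for _ in range(50):
    population = one_generation(population)
print(max(estimated_fitness(ind) for ind in population))
```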
arXiv Detail & Related papers (2023-06-06T19:05:01Z) - Score-based Causal Representation Learning with Interventions [54.735484409244386]
This paper studies the causal representation learning problem when latent causal variables are observed indirectly.
The objectives are: (i) recovering the unknown linear transformation (up to scaling) and (ii) determining the directed acyclic graph (DAG) underlying the latent variables.
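As a brief formalisation of that setting (standard notation assumed here, not taken verbatim from the paper):

```latex
% Latent causal variables Z = (Z_1, \dots, Z_d) follow a structural equation model on a DAG G:
Z_i = f_i\bigl(\mathrm{pa}_G(Z_i), \varepsilon_i\bigr), \qquad i = 1, \dots, d,
% and are observed only through an unknown linear transformation H:
X = H\,Z .
% Objective (i): recover H up to scaling; objective (ii): recover the DAG G.
```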
arXiv Detail & Related papers (2023-01-19T18:39:48Z) - Effective Mutation Rate Adaptation through Group Elite Selection [50.88204196504888]
This paper introduces the Group Elite Selection of Mutation Rates (GESMR) algorithm.
GESMR co-evolves a population of solutions and a population of MRs, such that each MR is assigned to a group of solutions.
With the same number of function evaluations and with almost no overhead, GESMR converges faster and to better solutions than previous approaches.
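The grouping idea can be sketched roughly as follows: each mutation rate controls a group of offspring, rates are credited with the best fitness change observed in their group, and the surviving elite rate seeds mutated copies for the next generation. The single-parent simplification, the rate-mutation factors, and the toy objective below are illustrative assumptions in the spirit of GESMR, not a faithful reimplementation.

```python
import random

N_BITS = 50
N_GROUPS = 8
GROUP_SIZE = 10

def fitness(x):
    return sum(x)                                   # toy OneMax objective

def mutate(x, rate):
    return [b ^ (random.random() < rate) for b in x]

def gesmr_like(generations=200):
    rates = [random.uniform(0.001, 0.2) for _ in range(N_GROUPS)]
    best = [random.randint(0, 1) for _ in range(N_BITS)]
    for _ in range(generations):
        group_gain, group_best = [], []
        for r in rates:
            children = [mutate(best, r) for _ in range(GROUP_SIZE)]
            scores = [fitness(c) for c in children]
            i = scores.index(max(scores))
            group_gain.append(scores[i] - fitness(best))   # improvement credited to rate r
            group_best.append(children[i])
        # Solution selection: keep the single best child over all groups if it is no worse.
        j = group_gain.index(max(group_gain))
        if group_gain[j] >= 0:
            best = group_best[j]
        # Rate selection: the elite rate survives; the rest are mutated copies of it.
        elite = rates[j]
        rates = [elite] + [min(max(elite * random.uniform(0.5, 2.0), 1e-4), 0.5)
                           for _ in range(N_GROUPS - 1)]
    return fitness(best)

print(gesmr_like())
```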
arXiv Detail & Related papers (2022-04-11T01:08:26Z) - Fractal Structure and Generalization Properties of Stochastic
Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded in terms of the complexity of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z) - Optimal Static Mutation Strength Distributions for the $(1+\lambda)$
Evolutionary Algorithm on OneMax [1.0965065178451106]
We show that, for large enough population sizes, such optimal distributions may be surprisingly complicated and counter-intuitive.
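To make the object of study concrete, the sketch below runs a (1+λ) EA on OneMax in which every offspring draws its mutation strength (number of flipped bits) from a static distribution. The particular distribution used here is a stand-in for illustration only; it is not one of the optimal distributions computed in the paper.

```python
import random

N = 100          # OneMax length (illustrative)
LAMBDA = 16      # offspring per generation (illustrative)

def onemax(x):
    return sum(x)

def sample_strength():
    # A stand-in static distribution over mutation strengths: mostly 1-bit flips,
    # occasionally larger jumps. NOT the optimal distribution derived in the paper.
    r = random.random()
    if r < 0.8:
        return 1
    elif r < 0.95:
        return 2
    return random.randint(3, 10)

def flip_k_bits(x, k):
    y = list(x)
    for i in random.sample(range(len(x)), k):
        y[i] ^= 1
    return y

def one_plus_lambda(max_generations=5000):
    x = [random.randint(0, 1) for _ in range(N)]
    for gen in range(max_generations):
        offspring = [flip_k_bits(x, sample_strength()) for _ in range(LAMBDA)]
        best = max(offspring, key=onemax)
        if onemax(best) >= onemax(x):        # (1+λ) elitist replacement
            x = best
        if onemax(x) == N:
            return gen
    return max_generations

print(one_plus_lambda())
```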
arXiv Detail & Related papers (2021-02-09T16:56:25Z) - Learning adaptive differential evolution algorithm from optimization
experiences by policy gradient [24.2122434523704]
This paper proposes a novel adaptive parameter control approach based on learning from the optimization experiences over a set of problems.
A reinforcement learning algorithm, policy gradient, is applied to learn an agent that adaptively provides the control parameters of the proposed differential evolution.
The proposed algorithm performs competitively against nine well-known evolutionary algorithms on the CEC'13 and CEC'17 test suites.
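A heavily simplified stand-in for this setup is sketched below: a Gaussian policy over the DE parameters (F, CR) is updated with a bare REINFORCE rule, using the per-generation improvement on a toy sphere function as reward. The policy parameterisation, learning rates, and objective are illustrative assumptions; the paper's agent, training procedure, and CEC benchmarks are not reproduced here.

```python
import random

DIM, NP = 5, 30

def sphere(x):                                  # toy objective (stand-in for CEC problems)
    return sum(xi * xi for xi in x)

def de_generation(pop, fit, F, CR, lo=-5.0, hi=5.0):
    """One DE/rand/1/bin generation with the given control parameters."""
    for i in range(NP):
        a, b, c = random.sample([j for j in range(NP) if j != i], 3)
        j_rand = random.randrange(DIM)
        trial = [min(max(pop[a][j] + F * (pop[b][j] - pop[c][j]), lo), hi)
                 if (random.random() < CR or j == j_rand) else pop[i][j]
                 for j in range(DIM)]
        f_trial = sphere(trial)
        if f_trial <= fit[i]:
            pop[i], fit[i] = trial, f_trial
    return pop, fit

# Gaussian policy over (F, CR) trained with a bare REINFORCE update.
mu = [0.5, 0.5]                # learnable means for F and CR
sigma, alpha, baseline = 0.1, 0.05, 0.0

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(NP)]
fit = [sphere(ind) for ind in pop]
for episode in range(200):
    F = min(max(random.gauss(mu[0], sigma), 0.1), 1.0)     # sample an action
    CR = min(max(random.gauss(mu[1], sigma), 0.0), 1.0)
    before = min(fit)
    pop, fit = de_generation(pop, fit, F, CR)
    reward = before - min(fit)                              # fitness improvement
    baseline += 0.1 * (reward - baseline)                   # running-average baseline
    # REINFORCE: the gradient of log N(a | mu, sigma) w.r.t. mu is (a - mu) / sigma^2.
    mu[0] += alpha * (reward - baseline) * (F - mu[0]) / sigma ** 2
    mu[1] += alpha * (reward - baseline) * (CR - mu[1]) / sigma ** 2
print(min(fit), mu)
```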
arXiv Detail & Related papers (2021-02-06T12:01:20Z) - The Variational Method of Moments [65.91730154730905]
The conditional moment problem is a powerful formulation for describing structural causal parameters in terms of observables.
Motivated by a variational minimax reformulation of optimally weighted GMM (OWGMM), we define a very general class of estimators for the conditional moment problem.
We provide algorithms for valid statistical inference based on the same kind of variational reformulations.
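As a brief formal reminder (standard notation, not quoted from the paper), the conditional moment problem seeks a parameter satisfying a conditional moment restriction, which GMM-style methods reduce to unconditional moments via instrument functions:

```latex
% Conditional moment restriction: theta_0 solves
\mathbb{E}\left[\, \rho(Y, X; \theta_0) \mid Z \,\right] = 0 \quad \text{almost surely,}

% which implies, for any vector of instrument functions f(Z), the unconditional moments
\mathbb{E}\left[\, f(Z)\, \rho(Y, X; \theta_0) \,\right] = 0 ,

% and GMM estimates theta by minimising a weighted quadratic form of the empirical moments:
\hat{\theta} \;=\; \arg\min_{\theta}\;
  \Big( \tfrac{1}{n}\sum_{i=1}^{n} f(Z_i)\,\rho(Y_i, X_i; \theta) \Big)^{\!\top}
  W\,
  \Big( \tfrac{1}{n}\sum_{i=1}^{n} f(Z_i)\,\rho(Y_i, X_i; \theta) \Big).
```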
arXiv Detail & Related papers (2020-12-17T07:21:06Z) - AdaLead: A simple and robust adaptive greedy search algorithm for
sequence design [55.41644538483948]
We develop an easy-to-direct, scalable, and robust evolutionary greedy algorithm (AdaLead).
AdaLead is a remarkably strong benchmark that out-competes more complex state-of-the-art approaches in a variety of biologically motivated sequence design challenges.
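A rough sketch of an adaptive greedy loop in the spirit of AdaLead is given below: keep a batch of sequences, repeatedly mutate the current top performers, and accept children whose predicted score does not drop too far below their parent's. The alphabet, the stand-in surrogate model, and the threshold are illustrative assumptions, not the authors' exact algorithm or benchmarks.

```python
import random

ALPHABET = "ACGT"
SEQ_LEN = 30

def model_score(seq):
    # Stand-in surrogate model: reward G/C content (purely illustrative).
    return sum(1 for ch in seq if ch in "GC") / len(seq)

def mutate(seq, rate=1.0 / SEQ_LEN):
    return "".join(random.choice(ALPHABET) if random.random() < rate else ch
                   for ch in seq)

def adalead_like(batch_size=50, rounds=20, threshold=0.05, top_fraction=0.2):
    batch = ["".join(random.choice(ALPHABET) for _ in range(SEQ_LEN))
             for _ in range(batch_size)]
    for _ in range(rounds):
        scored = sorted(batch, key=model_score, reverse=True)
        parents = scored[: max(1, int(top_fraction * batch_size))]
        children = []
        attempts = 0
        # Greedy rollout: accept children whose predicted score does not fall
        # more than `threshold` below their parent's score.
        while len(children) < batch_size:
            parent = random.choice(parents)
            child = mutate(parent)
            attempts += 1
            if (model_score(child) >= model_score(parent) - threshold
                    or attempts > 20 * batch_size):
                children.append(child)
        batch = children
    best = max(batch, key=model_score)
    return best, model_score(best)

print(adalead_like())
```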
arXiv Detail & Related papers (2020-10-05T16:40:38Z) - Self-Adjusting Evolutionary Algorithms for Multimodal Optimization [0.0]
Self-adjusting and self-adaptive mechanisms can provably outperform static settings in evolutionary algorithms for binary search spaces.
We suggest a mechanism called stagnation detection that can be added as a module to existing evolutionary algorithms.
arXiv Detail & Related papers (2020-04-07T11:02:10Z)
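The stagnation-detection idea can be bolted onto a simple (1+1) EA roughly as follows: count how long the best-so-far solution has failed to improve and, once that counter exceeds a threshold under which an improvement at the current mutation strength would very likely have occurred already, increase the strength. The threshold formula and the OneMax test function below are a plausible instantiation, not necessarily the exact module proposed in the paper.

```python
import math
import random

N = 100

def onemax(x):
    return sum(x)

def sd_one_plus_one(max_steps=200000):
    x = [random.randint(0, 1) for _ in range(N)]
    strength = 1                      # current mutation strength r (rate r / N per bit)
    counter = 0                       # steps without improvement at this strength
    for step in range(max_steps):
        rate = strength / N
        y = [b ^ (random.random() < rate) for b in x]
        if onemax(y) > onemax(x):
            x, strength, counter = y, 1, 0          # improvement: reset to strength 1
        else:
            counter += 1
            # Threshold in the spirit of stagnation detection: once an improvement
            # at strength r would very likely have been found by now, try r + 1.
            threshold = 2 * (math.e * N / strength) ** strength * math.log(N)
            if counter > threshold:
                strength = min(strength + 1, N // 2)
                counter = 0
        if onemax(x) == N:
            return step
    return max_steps

print(sd_one_plus_one())
```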