Evolutionary Algorithms with Self-adjusting Asymmetric Mutation
- URL: http://arxiv.org/abs/2006.09126v1
- Date: Tue, 16 Jun 2020 13:16:50 GMT
- Title: Evolutionary Algorithms with Self-adjusting Asymmetric Mutation
- Authors: Amirhossein Rajabi and Carsten Witt
- Abstract summary: We analyze an asymmetric mutation operator that can treat zero- and one-bits differently.
We show improved runtime results on the class of functions OneMax$_a$ describing the number of matching bits.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Evolutionary Algorithms (EAs) and other randomized search heuristics are
often considered as unbiased algorithms that are invariant with respect to
different transformations of the underlying search space. However, if a certain
amount of domain knowledge is available, the use of biased search operators in
EAs becomes viable. We consider a simple (1+1) EA for binary search spaces and
analyze an asymmetric mutation operator that can treat zero- and one-bits
differently. This operator extends previous work by Jansen and Sudholt (ECJ
18(1), 2010) by allowing the operator asymmetry to vary according to the
success rate of the algorithm. Using a self-adjusting scheme that learns an
appropriate degree of asymmetry, we show improved runtime results on the class
of functions OneMax$_a$ describing the number of matching bits with a fixed
target $a\in\{0,1\}^n$.
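As a concrete illustration of the scheme described in the abstract, the following minimal Python sketch runs a (1+1) EA on OneMax$_a$ with an asymmetric mutation whose zero-bit and one-bit flip rates are governed by a single parameter r. The specific flip probabilities (expected r zero-flips and 1-r one-flips per step) and the success-based update of r are simplifying assumptions for illustration, not the exact operator or self-adjusting rule analyzed in the paper.

```python
import random

def onemax_a(x, a):
    """OneMax_a: number of positions where x matches the fixed target a."""
    return sum(xi == ai for xi, ai in zip(x, a))

def asymmetric_mutate(x, r):
    """Asymmetric mutation: zero-bits flip with probability r / (#zeros),
    one-bits with probability (1 - r) / (#ones).  These rates are an assumed
    simplification kept only for illustration."""
    zeros, ones = x.count(0), x.count(1)
    y = list(x)
    for i, bit in enumerate(x):
        if bit == 0 and zeros > 0 and random.random() < r / zeros:
            y[i] = 1
        elif bit == 1 and ones > 0 and random.random() < (1 - r) / ones:
            y[i] = 0
    return y

def one_plus_one_ea(n, a, budget=200_000, r=0.5, step=0.1):
    """(1+1) EA with a self-adjusting asymmetry parameter r.
    Update rule (assumed): after an improving step, nudge r toward the bit
    value that was flipped more often; after an unsuccessful step, decay r
    back toward the symmetric value 0.5."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = onemax_a(x, a)
    for _ in range(budget):
        if fx == n:                      # optimum reached
            break
        y = asymmetric_mutate(x, r)
        fy = onemax_a(y, a)
        if fy > fx:
            flipped_01 = sum(xi == 0 and yi == 1 for xi, yi in zip(x, y))
            flipped_10 = sum(xi == 1 and yi == 0 for xi, yi in zip(x, y))
            if flipped_01 > flipped_10:
                r = min(0.9, r + step)   # flipping zeros paid off
            elif flipped_10 > flipped_01:
                r = max(0.1, r - step)   # flipping ones paid off
            x, fx = y, fy
        elif fy == fx:
            x = y                        # accept equally good offspring
        else:
            r = 0.5 + 0.9 * (r - 0.5)    # unsuccessful step: drift toward symmetry
    return x, fx, r

if __name__ == "__main__":
    n = 64
    target = [0] * n                     # OneMax_0: the all-zeros string is optimal
    best, fit, r = one_plus_one_ea(n, target)
    print(f"fitness {fit}/{n}, final asymmetry r = {r:.2f}")
```

The decay of r back toward 0.5 after unsuccessful steps keeps the operator close to the symmetric baseline unless recent improvements provide evidence that a bias toward flipping zeros (or ones) helps.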
Related papers
- Accelerated Discovery of Machine-Learned Symmetries: Deriving the Exceptional Lie Groups G2, F4 and E6 [55.41644538483948]
This letter introduces two improved algorithms that significantly speed up the discovery of symmetry transformations.
Given the significant complexity of the exceptional Lie groups, our results demonstrate that this machine-learning method for discovering symmetries is completely general and can be applied to a wide variety of labeled datasets.
arXiv Detail & Related papers (2023-07-10T20:25:44Z)
- Fast Computation of Optimal Transport via Entropy-Regularized Extragradient Methods [75.34939761152587]
Efficient computation of the optimal transport distance between two distributions serves as a subroutine that empowers various applications.
This paper develops a scalable first-order optimization-based method that computes optimal transport to within $\varepsilon$ additive accuracy.
arXiv Detail & Related papers (2023-01-30T15:46:39Z)
- Multi-block-Single-probe Variance Reduced Estimator for Coupled Compositional Optimization [49.58290066287418]
We propose a novel method named Multi-block-Single-probe Variance Reduced (MSVR) estimator to alleviate the complexity of compositional problems.
Our results improve upon prior ones in several aspects, including the order of sample complexities and the dependence on the strong-convexity parameter.
arXiv Detail & Related papers (2022-07-18T12:03:26Z)
- Runtime Analysis for Permutation-based Evolutionary Algorithms [9.044970217182117]
We show that a heavy-tailed version of the scramble operator, as in the bit-string case, leads to a speed-up of order $m^{\Theta(m)}$ on jump functions with jump size $m$.
A short empirical analysis confirms these findings, but also reveals that small implementation details like the rate of void mutations can make an important difference.
arXiv Detail & Related papers (2022-07-05T15:49:29Z)
- Quantum Goemans-Williamson Algorithm with the Hadamard Test and Approximate Amplitude Constraints [62.72309460291971]
We introduce a variational quantum algorithm for the Goemans-Williamson algorithm that uses only $n+1$ qubits.
Efficient optimization is achieved by encoding the objective matrix as a properly parameterized unitary conditioned on an auxiliary qubit.
We demonstrate the effectiveness of our protocol by devising an efficient quantum implementation of the Goemans-Williamson algorithm for various NP-hard problems.
arXiv Detail & Related papers (2022-06-30T03:15:23Z)
- A gradient estimator via L1-randomization for online zero-order optimization with two point feedback [93.57603470949266]
We present a novel gradient estimator based on two function evaluations and randomization.
We consider two types of assumptions on the noise of the zero-order oracle: canceling noise and adversarial noise.
We provide an anytime and completely data-driven algorithm, which is adaptive to all parameters of the problem.
arXiv Detail & Related papers (2022-05-27T11:23:57Z)
- Alternatives to a nonhomogeneous partial differential equation quantum algorithm [52.77024349608834]
We propose a quantum algorithm for solving nonhomogeneous linear partial differential equations of the form $A\psi(\textbf{r})=f(\textbf{r})$.
These achievements enable easier experimental implementation of the quantum algorithm using present-day technology.
arXiv Detail & Related papers (2022-05-11T14:29:39Z)
- Adaptive and Universal Algorithms for Variational Inequalities with Optimal Convergence [29.189409618561964]
We develop new adaptive algorithms for variational inequalities with monotone operators.
Our algorithms automatically adapt to unknown problem parameters.
We show that our algorithms are universal and simultaneously achieve the optimal convergence rates.
arXiv Detail & Related papers (2020-10-15T14:44:26Z)
- The $(1+(\lambda,\lambda))$ Genetic Algorithm for Permutations [0.0]
We show that the $(1+(\lambda,\lambda))$ genetic algorithm finds the optimum in $O(n^2)$ fitness queries.
We also present the first analysis of this algorithm on a permutation-based problem called Ham.
arXiv Detail & Related papers (2020-04-18T17:04:57Z)
- Fast Mutation in Crossover-based Algorithms [8.34061303235504]
A heavy-tailed mutation operator proposed in Doerr, Le, Makhmara and Nguyen (GECCO) is analyzed in crossover-based algorithms.
An empirical study shows the effectiveness of the fast mutation also on random satisfiable Max-3SAT instances.
arXiv Detail & Related papers (2020-04-14T14:16:42Z)
- Self-Adjusting Evolutionary Algorithms for Multimodal Optimization [0.0]
Self-adjusting and self-adaptive mechanisms can provably outperform static settings in evolutionary algorithms for binary search spaces.
We suggest a mechanism called stagnation detection that can be added as a module to existing evolutionary algorithms; see the sketch after this list.
arXiv Detail & Related papers (2020-04-07T11:02:10Z)
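The stagnation detection mechanism from the last related paper above (Self-Adjusting Evolutionary Algorithms for Multimodal Optimization) can be sketched as a small module around a (1+1) EA: count unsuccessful steps at the current mutation strength and raise the strength once the counter exceeds a waiting-time threshold. In the Python sketch below, the threshold $(en/s)^s \ln n$, the cap on the strength, and the reset rule are illustrative assumptions rather than the paper's exact parameters.

```python
import math
import random

def sd_one_plus_one_ea(fitness, n, budget=200_000, opt=None):
    """(1+1) EA with a stagnation-detection module (illustrative sketch).
    A counter tracks unsuccessful steps at the current mutation strength s;
    once it exceeds an assumed threshold of roughly (e*n/s)^s * ln(n),
    the strength is increased by one so that larger jumps become likely."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    s, counter = 1, 0                       # mutation strength, failure counter
    for _ in range(budget):
        if opt is not None and fx >= opt:   # stop once the known optimum is hit
            break
        y = list(x)
        for i in range(n):                  # standard bit mutation with rate s/n
            if random.random() < s / n:
                y[i] = 1 - y[i]
        fy = fitness(y)
        if fy > fx:                         # strict improvement: reset the module
            x, fx, s, counter = y, fy, 1, 0
        else:
            counter += 1
            if counter > (math.e * n / s) ** s * math.log(n) and s < n // 2:
                s, counter = s + 1, 0       # stagnation detected: raise strength
    return x, fx, s

if __name__ == "__main__":
    N, M = 30, 3

    def jump(x):                            # OneMax with a fitness valley of width M
        ones = sum(x)
        return M + ones if ones == N or ones <= N - M else N - ones

    best, fit, s = sd_one_plus_one_ea(jump, N, opt=N + M)
    print(f"fitness {fit} (optimum {N + M}), final strength {s}")
```

On a jump-type function like the toy example above, raising the strength after prolonged stagnation makes the simultaneous flip of several bits far more likely than under the standard rate 1/n.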
This list is automatically generated from the titles and abstracts of the papers on this site.