The Influence of Local Search over Genetic Algorithms with Balanced
Representations
- URL: http://arxiv.org/abs/2206.10974v1
- Date: Wed, 22 Jun 2022 10:59:26 GMT
- Title: The Influence of Local Search over Genetic Algorithms with Balanced
Representations
- Authors: Luca Manzoni, Luca Mariot, Eva Tuba
- Abstract summary: We show that adding local search actually increases the diversity among the individuals in the population.
We link these findings to some recent results on fitness landscape analysis for problems on Boolean functions.
- Score: 2.610470075814367
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We continue the study of Genetic Algorithms (GA) on combinatorial
optimization problems where the candidate solutions need to satisfy a
balancedness constraint. It has been observed that the reduction of the search
space size granted by ad-hoc crossover and mutation operators does not usually
translate to a substantial improvement of the GA performances. There is still
no clear explanation of this phenomenon, although it is suspected that a
balanced representation might yield a more irregular fitness landscape, where
it could be more difficult for GA to converge to a global optimum. In this
paper, we investigate this issue by adding a local search step to a GA with
balanced operators, and use it to evolve highly nonlinear balanced Boolean
functions. In particular, we organize our experiments around two research
questions, namely if local search (1) improves the convergence speed of GA, and
(2) decreases the population diversity. Surprisingly, while our results answer
affirmatively the first question, they also show that adding local search
actually \emph{increases} the diversity among the individuals in the
population. We link these findings to some recent results on fitness landscape
analysis for problems on Boolean functions.
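To make the setup concrete, here is a minimal, hypothetical sketch of the two ingredients the paper combines: a balanced mutation operator that preserves the Hamming weight of a truth table by swapping a one with a zero, and a simple first-improvement local search that maximises nonlinearity (computed via the fast Walsh-Hadamard transform). The search policy and all names are illustrative assumptions, not the authors' implementation:

```python
import random

def walsh_hadamard(tt):
    """Fast Walsh-Hadamard transform of a Boolean truth table (sign form)."""
    w = [1 - 2 * b for b in tt]          # map {0,1} -> {+1,-1}
    h, n = 1, len(w)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = w[j], w[j + h]
                w[j], w[j + h] = x + y, x - y
        h *= 2
    return w

def nonlinearity(tt):
    """Nonlinearity = 2^(n-1) - max |Walsh coefficient| / 2."""
    return len(tt) // 2 - max(abs(c) for c in walsh_hadamard(tt)) // 2

def balanced_mutation(tt, rng):
    """Swap a random 1 with a random 0, preserving the Hamming weight."""
    ones = [i for i, b in enumerate(tt) if b == 1]
    zeros = [i for i, b in enumerate(tt) if b == 0]
    child = list(tt)
    i, j = rng.choice(ones), rng.choice(zeros)
    child[i], child[j] = child[j], child[i]
    return child

def local_search(tt, steps, rng):
    """First-improvement hill climbing over the swap neighbourhood."""
    best, best_fit = list(tt), nonlinearity(tt)
    for _ in range(steps):
        cand = balanced_mutation(best, rng)
        fit = nonlinearity(cand)
        if fit > best_fit:
            best, best_fit = cand, fit
    return best, best_fit

rng = random.Random(0)
tt = [1] * 8 + [0] * 8                   # balanced 4-variable truth table
rng.shuffle(tt)
improved, fit = local_search(tt, 200, rng)
assert sum(improved) == 8                # balancedness is preserved
```

Because mutation only ever exchanges a one with a zero, every candidate the local search visits stays inside the balanced subspace, which is exactly the property the paper's balanced operators are designed to guarantee.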
Related papers
- Error Feedback under $(L_0,L_1)$-Smoothness: Normalization and Momentum [56.37522020675243]
We provide the first proof of convergence for normalized error feedback algorithms across a wide range of machine learning problems.
We show that due to their larger allowable stepsizes, our new normalized error feedback algorithms outperform their non-normalized counterparts on various tasks.
arXiv Detail & Related papers (2024-10-22T10:19:27Z)
- Improving genetic algorithms performance via deterministic population shrinkage [9.334663477968027]
This paper presents an empirical study on the possible benefits of a Simple Variable Population Sizing (SVPS) scheme for the performance of Genetic Algorithms (GAs).
It consists of decreasing the population size over a GA run following a predetermined schedule, configured by a speed and a severity parameter.
Results show several speed-severity combinations where SVPS-GA preserves solution quality while improving performance, reducing the number of evaluations needed for success.
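The summary does not specify the exact schedule; as an illustration only, the following hypothetical scheme interprets `speed` as how often the population shrinks and `severity` as the fraction removed each time:

```python
def shrink_schedule(initial_size, generations, speed, severity):
    """Hypothetical deterministic shrinkage schedule: every `speed`
    generations, remove a `severity` fraction of the population,
    never dropping below a minimum of 2 individuals."""
    sizes = [initial_size]
    size = initial_size
    for gen in range(1, generations):
        if gen % speed == 0:
            size = max(2, int(size * (1 - severity)))
        sizes.append(size)
    return sizes

# Shrink by 20% every 5 generations, starting from 100 individuals.
sizes = shrink_schedule(100, 30, speed=5, severity=0.2)
```

A higher `speed` (less frequent shrinking) or lower `severity` keeps more diversity for longer; the paper's point is that suitable combinations save evaluations without hurting solution quality.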
arXiv Detail & Related papers (2024-01-22T17:05:16Z)
- DynGFN: Towards Bayesian Inference of Gene Regulatory Networks with GFlowNets [81.75973217676986]
Gene regulatory networks (GRNs) describe interactions between genes and their products that control gene expression and cellular function.
Existing methods focus either on challenge (1), identifying cyclic structure from dynamics, or on challenge (2), learning complex Bayesian posteriors over DAGs, but not both.
In this paper we leverage the fact that the "velocity" of gene expression can be estimated with RNA velocity techniques to develop an approach that addresses both challenges.
arXiv Detail & Related papers (2023-02-08T16:36:40Z)
- Maximum Spatial Perturbation Consistency for Unpaired Image-to-Image Translation [56.44946660061753]
This paper proposes a universal regularization technique called maximum spatial perturbation consistency (MSPC).
MSPC enforces a spatial perturbation function (T) and the translation operator (G) to be commutative (i.e., TG = GT).
Our method outperforms the state-of-the-art methods on most I2I benchmarks.
arXiv Detail & Related papers (2022-03-23T19:59:04Z)
- Temporal Heterogeneity Improves Speed and Convergence in Genetic Algorithms [0.0]
Genetic algorithms simulate natural selection to explore a parameter space in search of solutions for a broad variety of problems.
We set the crossover probability to be inversely proportional to the individual's fitness, i.e., better solutions change more slowly than those with lower fitness.
We find that temporal heterogeneity consistently improves search without requiring prior knowledge of the parameter space.
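One possible reading of this rule, as a hedged sketch (the scaling and the `p_max` parameter are assumptions, not the paper's actual formula):

```python
def crossover_probabilities(fitnesses, p_max=0.9):
    """Assign each individual a crossover probability inversely
    proportional to its fitness, scaled so that the worst
    (lowest-fitness) individual gets p_max.
    Assumes strictly positive fitness values."""
    f_min = min(fitnesses)
    return [p_max * f_min / f for f in fitnesses]

# The fittest individual (fitness 4.0) is disrupted least often.
probs = crossover_probabilities([1.0, 2.0, 4.0])
```

The effect is a per-individual, time-varying mutation pressure: good solutions are protected from disruption while poor ones keep exploring.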
arXiv Detail & Related papers (2022-02-02T22:48:56Z)
- Flipping the switch on local exploration: Genetic Algorithms with Reversals [0.0]
The authors show that gradient-free search techniques are suitable for providing an optimal solution in the discrete domain.
They also show that combining multiple local searches can improve performance.
The proposed GA variants achieve the lowest average cost across all benchmarks, including the newly proposed problem, and IC performs better than its constituents.
arXiv Detail & Related papers (2022-02-02T08:27:11Z)
- Result Diversification by Multi-objective Evolutionary Algorithms with Theoretical Guarantees [94.72461292387146]
We propose to reformulate the result diversification problem as a bi-objective search problem and solve it with a multi-objective evolutionary algorithm (EA).
We theoretically prove that GSEMO can achieve the optimal polynomial-time approximation ratio of $1/2$.
When the objective function changes dynamically, GSEMO can maintain this approximation ratio in polynomial running time, addressing the open question posed by Borodin et al.
arXiv Detail & Related papers (2021-10-18T14:00:22Z)
- Learning to Accelerate Heuristic Searching for Large-Scale Maximum Weighted b-Matching Problems in Online Advertising [51.97494906131859]
Bipartite b-matching is fundamental in algorithm design and has been widely applied in economic markets, labor markets, etc.
Existing exact and approximate algorithms usually fail in such settings, either requiring intolerable running time or too many computational resources.
We propose NeuSearcher, which leverages knowledge learned from previously solved instances to solve new problem instances.
arXiv Detail & Related papers (2020-05-09T02:48:23Z)
- Hardness of Random Optimization Problems for Boolean Circuits, Low-Degree Polynomials, and Langevin Dynamics [78.46689176407936]
We show that families of algorithms fail to produce nearly optimal solutions with high probability.
For the case of Boolean circuits, our results improve the state-of-the-art bounds known in circuit complexity theory.
arXiv Detail & Related papers (2020-04-25T05:45:59Z)
- Tip the Balance: Improving Exploration of Balanced Crossover Operators by Adaptive Bias [2.610470075814367]
The use of balanced crossover operators in Genetic Algorithms (GA) ensures that the binary strings generated as offspring have the same Hamming weight as the parents.
Although this method reduces the size of the search space, the resulting fitness landscape often becomes more difficult for the GA to explore.
This paper studies this issue by applying an adaptive bias strategy to a counter-based crossover operator.
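A counter-based balanced crossover can be sketched as follows; the gene-by-gene forcing rule and the fixed `bias` parameter are illustrative assumptions (the paper's adaptive strategy updates the bias during the run):

```python
import random

def counter_based_crossover(p1, p2, bias, rng):
    """Counter-based balanced crossover: copy each gene from either parent
    (parent 1 chosen with probability `bias`), tracking how many ones and
    zeros have been placed; once a counter reaches n/2, force the other
    value so the child stays balanced."""
    n, half = len(p1), len(p1) // 2
    ones = zeros = 0
    child = []
    for i in range(n):
        if ones == half:                       # quota of ones reached
            g = 0
        elif zeros == half:                    # quota of zeros reached
            g = 1
        else:
            g = p1[i] if rng.random() < bias else p2[i]
        child.append(g)
        ones += g
        zeros += 1 - g
    return child

rng = random.Random(1)
p1 = [1, 1, 0, 0, 1, 0, 1, 0]
p2 = [0, 1, 1, 0, 0, 1, 0, 1]
child = counter_based_crossover(p1, p2, bias=0.5, rng=rng)
assert sum(child) == len(child) // 2           # offspring is balanced
```

The bias parameter skews gene selection toward one parent; an adaptive strategy would adjust it over the run to trade off exploration against exploitation, which is the idea the paper investigates.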
arXiv Detail & Related papers (2020-04-23T17:26:43Z)
- From Understanding Genetic Drift to a Smart-Restart Parameter-less Compact Genetic Algorithm [15.56430085052365]
In the regime with no genetic drift, the runtime is roughly proportional to the population size.
We propose a parameter-less version of the compact genetic algorithm that automatically finds a suitable population size.
arXiv Detail & Related papers (2020-04-15T15:12:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.