Discovering Attention-Based Genetic Algorithms via Meta-Black-Box
Optimization
- URL: http://arxiv.org/abs/2304.03995v1
- Date: Sat, 8 Apr 2023 12:14:15 GMT
- Title: Discovering Attention-Based Genetic Algorithms via Meta-Black-Box
Optimization
- Authors: Robert Tjarko Lange, Tom Schaul, Yutian Chen, Chris Lu, Tom Zahavy,
Valentin Dalibard, Sebastian Flennerhag
- Abstract summary: We discover entirely new genetic algorithms in a data-driven fashion.
We parametrize selection and mutation rate adaptation as cross- and self-attention modules.
The learned algorithm can be applied to previously unseen optimization problems, search dimensions & evaluation budgets.
- Score: 13.131971623143622
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Genetic algorithms constitute a family of black-box optimization algorithms,
which take inspiration from the principles of biological evolution. While they
provide a general-purpose tool for optimization, their particular
instantiations can be heuristic and motivated by loose biological intuition. In
this work we explore a fundamentally different approach: Given a sufficiently
flexible parametrization of the genetic operators, we discover entirely new
genetic algorithms in a data-driven fashion. More specifically, we parametrize
selection and mutation rate adaptation as cross- and self-attention modules and
use Meta-Black-Box-Optimization to evolve their parameters on a set of diverse
optimization tasks. The resulting Learned Genetic Algorithm outperforms
state-of-the-art adaptive baseline genetic algorithms and generalizes far
beyond its meta-training settings. The learned algorithm can be applied to
previously unseen optimization problems, search dimensions & evaluation
budgets. We conduct extensive analysis of the discovered operators and provide
ablation experiments, which highlight the benefits of flexible module
parametrization and the ability to transfer ('plug-in') the learned operators
to conventional genetic algorithms.
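The central idea, selection implemented as a cross-attention module whose weights are meta-evolved rather than hand-designed, can be made concrete with a toy sketch. Everything below is an illustrative assumption (the feature choice, dimensions, and weight shapes are hypothetical, and the paper's actual architecture is meta-trained with Meta-Black-Box-Optimization), not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_selection_probs(fitness, W_q, W_k):
    """Toy dot-product cross-attention over fitness features.

    Each offspring slot (query) attends over the population's
    fitness-derived features (keys); the resulting attention row is
    used as that slot's parent-selection distribution.
    """
    n = len(fitness)
    z = (fitness - fitness.mean()) / (fitness.std() + 1e-8)   # z-scored fitness
    ranks = np.argsort(np.argsort(-fitness)) / max(n - 1, 1)  # normalized rank
    feats = np.stack([z, ranks], axis=1)                      # (n, 2)
    q, k = feats @ W_q, feats @ W_k                           # (n, d)
    scores = q @ k.T / np.sqrt(W_q.shape[1])                  # (n, n)
    return softmax(scores, axis=1)

rng = np.random.default_rng(0)
fitness = rng.normal(size=8)
W_q, W_k = rng.normal(size=(2, 4)), rng.normal(size=(2, 4))
probs = attention_selection_probs(fitness, W_q, W_k)
parents = np.array([rng.choice(8, p=row) for row in probs])  # sample one parent per slot
```

In a meta-learning loop, `W_q` and `W_k` would themselves be the meta-parameters evolved across a distribution of optimization tasks; because the attention operates on fitness statistics rather than raw genomes, the same weights can in principle be reused across search dimensions and population sizes.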
Related papers
- Discovering Preference Optimization Algorithms with and for Large Language Models [50.843710797024805]
Offline preference optimization is a key method for enhancing and controlling the quality of Large Language Model (LLM) outputs.
We perform objective discovery to automatically discover new state-of-the-art preference optimization algorithms without (expert) human intervention.
Experiments demonstrate the state-of-the-art performance of DiscoPOP, a novel algorithm that adaptively blends logistic and exponential losses.
arXiv Detail & Related papers (2024-06-12T16:58:41Z) - Genetic Engineering Algorithm (GEA): An Efficient Metaheuristic
Algorithm for Solving Combinatorial Optimization Problems [1.8434042562191815]
Genetic Algorithms (GAs) are known for their efficiency in solving optimization problems.
This paper proposes a new metaheuristic algorithm called the Genetic Engineering Algorithm (GEA) that draws inspiration from genetic engineering concepts.
arXiv Detail & Related papers (2023-09-28T13:05:30Z) - The FAIRy Tale of Genetic Algorithms [1.0957528713294875]
We have extended the Findable, Accessible, Interoperable and Reusable (FAIR) data principles to enable the findability and reusability of Genetic Algorithms.
We have presented an overview of the methodological developments and variants of GA, whose sheer number makes it challenging to reproduce results or even find the right source.
This work can be extended to numerous machine learning algorithms/methods.
arXiv Detail & Related papers (2023-04-29T11:36:09Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - Towards the optimization of ballistics in proton therapy using genetic
algorithms: implementation issues [0.0]
We investigate a new optimization framework based on a genetic algorithm approach.
The proposed optimization routine typically takes into account several thousand spots of fixed size.
The behavior of the proposed genetic algorithm is illustrated in both elementary and clinically-realistic test cases.
arXiv Detail & Related papers (2022-05-17T12:31:14Z) - Applications of Gaussian Mutation for Self Adaptation in Evolutionary
Genetic Algorithms [0.0]
In 1960, the first genetic algorithm was developed by John H. Holland and his student.
We explore the mathematical intuition of the genetic algorithm in developing systems capable of evolving using Gaussian mutation.
arXiv Detail & Related papers (2022-01-02T04:18:25Z) - Improving RNA Secondary Structure Design using Deep Reinforcement
Learning [69.63971634605797]
We propose a new benchmark of applying reinforcement learning to RNA sequence design, in which the objective function is defined to be the free energy in the sequence's secondary structure.
We show results of the ablation analysis that we do for these algorithms, as well as graphs indicating the algorithm's performance across batches.
arXiv Detail & Related papers (2021-11-05T02:54:06Z) - Result Diversification by Multi-objective Evolutionary Algorithms with
Theoretical Guarantees [94.72461292387146]
We propose to reformulate the result diversification problem as a bi-objective search problem, and solve it by a multi-objective evolutionary algorithm (EA)
We theoretically prove that the GSEMO can achieve the optimal polynomial-time approximation ratio, $1/2$.
When the objective function changes dynamically, the GSEMO can maintain this approximation ratio in polynomial running time, addressing the open question proposed by Borodin et al.
arXiv Detail & Related papers (2021-10-18T14:00:22Z) - Fractal Structure and Generalization Properties of Stochastic
Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded by the complexity of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z) - AdaLead: A simple and robust adaptive greedy search algorithm for
sequence design [55.41644538483948]
We develop an easy-to-direct, scalable, and robust evolutionary greedy algorithm (AdaLead).
AdaLead is a remarkably strong benchmark that out-competes more complex state-of-the-art approaches in a variety of biologically motivated sequence design challenges.
arXiv Detail & Related papers (2020-10-05T16:40:38Z) - Genetic optimization algorithms applied toward mission computability
models [0.3655021726150368]
Genetic algorithms are computationally simple and low-cost to run.
We apply our genetic optimization algorithms to a mission-critical and constraints-aware problem.
arXiv Detail & Related papers (2020-05-27T00:45:24Z)
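Several entries above, as well as the main paper's mutation-rate adaptation module, build on self-adaptive Gaussian mutation, where each individual carries its own step size that is itself mutated before the genes are perturbed. A minimal sketch follows; the log-normal update rule is the classic evolution-strategies formulation, stated here as background rather than taken from any single paper listed above:

```python
import numpy as np

def self_adaptive_gaussian_mutation(pop, sigmas, rng):
    """Classic log-normal self-adaptation: mutate each individual's own
    step size sigma first, then perturb its genes with the new sigma."""
    n, d = pop.shape
    tau = 1.0 / np.sqrt(2.0 * d)                          # standard learning rate
    new_sigmas = sigmas * np.exp(tau * rng.normal(size=n))  # log-normal sigma update
    new_pop = pop + new_sigmas[:, None] * rng.normal(size=(n, d))
    return new_pop, new_sigmas

rng = np.random.default_rng(1)
pop = rng.normal(size=(6, 3))    # 6 individuals, 3 genes each
sigmas = np.full(6, 0.5)         # one step size per individual
new_pop, new_sigmas = self_adaptive_gaussian_mutation(pop, sigmas, rng)
```

The learned genetic algorithm of the main paper replaces this fixed update rule with a self-attention module, so the per-individual step sizes can depend on the state of the whole population rather than evolving independently.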
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.