A Joint Python/C++ Library for Efficient yet Accessible Black-Box and
Gray-Box Optimization with GOMEA
- URL: http://arxiv.org/abs/2305.06246v1
- Date: Wed, 10 May 2023 15:28:31 GMT
- Authors: Anton Bouter and Peter A.N. Bosman
- Abstract summary: We introduce the GOMEA library, making existing GOMEA code in C++ accessible through Python.
We show its performance in both Gray-Box Optimization (GBO) and Black-Box Optimization (BBO).
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Exploiting knowledge about the structure of a problem can greatly benefit the
efficiency and scalability of an Evolutionary Algorithm (EA). Model-Based EAs
(MBEAs) are capable of doing this by explicitly modeling the problem structure.
The Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) is among the
state-of-the-art of MBEAs due to its use of a linkage model and the optimal
mixing variation operator. Especially in a Gray-Box Optimization (GBO) setting
that allows for partial evaluations, i.e., the relatively efficient evaluation
of a partial modification of a solution, GOMEA is known to excel. Such GBO
settings are known to exist in various real-world applications to which GOMEA
has successfully been applied. In this work, we introduce the GOMEA library,
making existing GOMEA code in C++ accessible through Python, which serves as a
centralized way of maintaining and distributing code of GOMEA for various
optimization domains. Moreover, it allows for the straightforward definition of
BBO as well as GBO fitness functions within Python, which are called from the
C++ optimization code for each required (partial) evaluation. We describe the
structure of the GOMEA library and how it can be used, and we show its
performance in both GBO and Black-Box Optimization (BBO).
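The key efficiency lever mentioned above, partial evaluations, is easy to illustrate in isolation. The sketch below is a generic, hypothetical illustration (the names are not the GOMEA library's actual API): it assumes an additively decomposable fitness function and re-evaluates only the subfunctions that depend on modified variables.

```python
# Illustrative sketch of the partial-evaluation idea behind Gray-Box
# Optimization; all names are hypothetical, not the GOMEA library API.

def make_block_sums(n, k):
    """Fitness = sum of subfunctions, each summing one block of k variables."""
    blocks = [list(range(j, min(j + k, n))) for j in range(0, n, k)]
    subfns = [lambda s, b=b: sum(s[i] for i in b) for b in blocks]
    depends_on = {i: [j] for j, b in enumerate(blocks) for i in b}
    return subfns, depends_on

def full_evaluate(solution, subfns):
    """Evaluate every subfunction: cost grows with problem size."""
    return sum(f(solution) for f in subfns)

def partial_evaluate(old_fitness, solution, changes, subfns, depends_on):
    """Re-evaluate only the subfunctions that touch modified variables."""
    affected = {j for i, _ in changes for j in depends_on[i]}
    before = sum(subfns[j](solution) for j in affected)
    for i, v in changes:
        solution[i] = v
    after = sum(subfns[j](solution) for j in affected)
    return old_fitness - before + after
```

For a small modification, the cost of `partial_evaluate` depends only on the number of affected subfunctions, not on the problem size; this is exactly why GBO settings allow relatively efficient evaluation of partial modifications.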
Related papers
- Analyzing the Runtime of the Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) on the Concatenated Trap Function
GOMEA is an evolutionary algorithm that leverages linkage learning to efficiently exploit problem structure.
We show that GOMEA can solve the problem in $O(m^3 2^k)$ evaluations with high probability, where $m$ is the number of subfunctions and $k$ is the subfunction length.
This is a significant speedup compared to the (1+1) Evolutionary Algorithm, which requires $O(\ln(m)(mk)^k)$ expected evaluations.
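The concatenated trap benchmark referenced above has a standard formulation: a fully deceptive trap subfunction of the block's unitation (number of ones), summed over non-overlapping blocks of length k.

```python
def trap(u, k):
    """Deceptive trap subfunction of block unitation u: optimal at u == k,
    but the gradient of the remaining landscape points toward u == 0."""
    return k if u == k else k - 1 - u

def concatenated_trap(bits, k):
    """Sum of trap subfunctions over consecutive, non-overlapping blocks."""
    assert len(bits) % k == 0
    return sum(trap(sum(bits[j:j + k]), k) for j in range(0, len(bits), k))
```

The deception is visible in the values: the global optimum (all ones per block) scores k, yet every other unitation level rewards moving toward all zeros, which is why unstructured search struggles and linkage learning helps.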
arXiv Detail & Related papers (2024-07-11T09:37:21Z)
- Reinforced In-Context Black-Box Optimization
RIBBO is a method to reinforce-learn a BBO algorithm from offline data in an end-to-end fashion.
RIBBO employs expressive sequence models to learn the optimization histories produced by multiple behavior algorithms and tasks.
Central to our method is to augment the optimization histories with regret-to-go tokens, which are designed to represent the performance of an algorithm based on cumulative regret over the future part of the histories.
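The regret-to-go idea can be made concrete under a minimization convention. The helper below is an assumption about the token's basic definition (the paper may normalize or discretize it differently): for each step, it accumulates the regret over the remaining part of the history.

```python
def regret_to_go(history, f_star):
    """history: observed objective values (minimization); f_star: optimum.
    Returns, for each step t, the cumulative regret over the remaining
    history: sum over i >= t of (history[i] - f_star)."""
    r2g = []
    acc = 0.0
    for f in reversed(history):  # accumulate from the end backwards
        acc += f - f_star
        r2g.append(acc)
    return list(reversed(r2g))
```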
arXiv Detail & Related papers (2024-02-27T11:32:14Z)
- Poisson Process for Bayesian Optimization
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, as verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- PyBADS: Fast and robust black-box optimization in Python
PyBADS is an implementation of the Bayesian Adaptive Direct Search (BADS) algorithm for fast and robust black-box optimization.
It comes with an easy-to-use Python interface for running the algorithm and inspecting its results.
arXiv Detail & Related papers (2023-06-27T15:54:44Z)
- NUBO: A Transparent Python Package for Bayesian Optimization
NUBO is a framework for optimizing black-box functions, such as physical experiments and computer simulators.
It focuses on transparency and user experience to make Bayesian optimization accessible to researchers from all disciplines.
NUBO is written in Python but does not require expert knowledge of Python to optimize simulators and experiments.
arXiv Detail & Related papers (2023-05-11T10:34:27Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
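A minimal sketch of the ZO-signGD idea in its generic form (not the paper's specific molecule-optimization setup, and with illustrative hyperparameters): estimate the gradient from random finite differences of the black-box function, then update using only the sign of each estimated component.

```python
import random

def zo_sign_gd(f, x0, lr=0.1, mu=1e-3, q=10, steps=100, seed=0):
    """Zeroth-order sign-based gradient descent sketch: build a gradient
    estimate from q random finite differences, then step along its sign."""
    rng = random.Random(seed)
    x = list(x0)
    d = len(x)
    best_x, best_f = list(x), f(x)
    for _ in range(steps):
        fx = f(x)
        g = [0.0] * d
        for _ in range(q):
            # Random direction; delta approximates the directional derivative.
            u = [rng.gauss(0.0, 1.0) for _ in range(d)]
            delta = (f([xi + mu * ui for xi, ui in zip(x, u)]) - fx) / mu
            g = [gi + delta * ui / q for gi, ui in zip(g, u)]
        # Update with the sign of each component only.
        x = [xi - lr * ((gi > 0) - (gi < 0)) for xi, gi in zip(x, g)]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = list(x), fx
    return best_x, best_f
```

Using only the sign discards the gradient magnitude, which makes the update robust to the high variance of zeroth-order gradient estimates; the fixed step size then bounds how far any single noisy estimate can move the iterate.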
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Bayesian Optimization for Macro Placement
We develop a novel approach to macro placement using Bayesian optimization (BO) over sequence pairs.
BO is a machine learning technique that uses a probabilistic surrogate model and an acquisition function.
We demonstrate our algorithm on the fixed-outline macro placement problem with the half-perimeter wire length objective.
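The half-perimeter wirelength objective mentioned above has a standard definition: for each net, the half-perimeter of the bounding box of its pin coordinates. A minimal Python version (the data layout here is illustrative):

```python
def hpwl(nets, pins):
    """Half-perimeter wirelength (HPWL).
    nets: list of nets, each a list of pin names.
    pins: dict mapping pin name -> (x, y) coordinate."""
    total = 0.0
    for net in nets:
        xs = [pins[p][0] for p in net]
        ys = [pins[p][1] for p in net]
        # Half the perimeter of the net's bounding box: width + height.
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total
```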
arXiv Detail & Related papers (2022-07-18T06:17:06Z)
- GPU-Accelerated Parallel Gene-pool Optimal Mixing in a Gray-Box Optimization Setting
We show how graph coloring can be used to group sets of variables that can undergo variation in parallel without violating dependencies.
We find that, for sufficiently large graphs with limited connectivity, finding high-quality solutions can be achieved up to 100 times faster.
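The grouping idea can be sketched with a standard greedy coloring: variables assigned the same color share no dependency edge, so they can safely undergo variation in parallel. This is a generic sketch, not the paper's GPU implementation:

```python
def greedy_coloring(n, edges):
    """Greedy graph coloring over variables 0..n-1.
    edges: pairs of variables that depend on each other and therefore
    must not be varied in parallel. Returns variable -> color."""
    adj = {v: set() for v in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    color = {}
    for v in range(n):
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:  # smallest color unused by any colored neighbor
            c += 1
        color[v] = c
    return color
```

Each color class is then one batch of independent variation steps; for sparse dependency graphs the number of colors stays small, which is what makes the parallel speedups for graphs with limited connectivity plausible.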
arXiv Detail & Related papers (2022-03-16T15:08:48Z)
- Parameterless Gene-pool Optimal Mixing Evolutionary Algorithms
We present the latest version of, and propose substantial enhancements to, the Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA).
We show that GOMEA and CGOMEA significantly outperform the original GOMEA and DSMGA-II on most problems.
arXiv Detail & Related papers (2021-09-11T11:35:14Z)
- Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z)
- TREGO: a Trust-Region Framework for Efficient Global Optimization
We propose and analyze a trust-region-like EGO method (TREGO).
TREGO alternates between regular EGO steps and local steps within a trust region.
Our algorithm enjoys strong global convergence properties, while departing from EGO only for a subset of optimization steps.
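The alternation scheme can be caricatured in a few lines. The toy below replaces the GP-based EGO step with a random global proposal (an assumption made purely to keep the sketch self-contained) but keeps the global/local alternation and the success-based trust-region update:

```python
import random

def trego_like(f, bounds, iters=60, seed=0):
    """Toy one-dimensional TREGO-style loop: a global proposal step
    (random here, standing in for EGO's acquisition maximization)
    alternates with local steps confined to a trust region around the
    incumbent; the region grows on success and shrinks on failure."""
    rng = random.Random(seed)
    lo, hi = bounds
    best_x = rng.uniform(lo, hi)
    best_f = f(best_x)
    radius = (hi - lo) / 4
    for t in range(iters):
        if t % 2 == 0:  # global step over the full domain
            x = rng.uniform(lo, hi)
        else:           # local step inside the trust region, clipped to bounds
            x = min(hi, max(lo, best_x + rng.uniform(-radius, radius)))
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
            radius = min(radius * 2.0, hi - lo)
        else:
            radius = max(radius * 0.5, 1e-6)
    return best_x, best_f
```

The periodic global steps are what preserve global convergence: even if the trust region collapses around a local optimum, the loop keeps sampling the whole domain on alternate iterations.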
arXiv Detail & Related papers (2021-01-18T00:14:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.