MA-BBOB: Many-Affine Combinations of BBOB Functions for Evaluating
AutoML Approaches in Noiseless Numerical Black-Box Optimization Contexts
- URL: http://arxiv.org/abs/2306.10627v1
- Date: Sun, 18 Jun 2023 19:32:12 GMT
- Authors: Diederick Vermetten, Furong Ye, Thomas Bäck, Carola Doerr
- Abstract summary: (MA-)BBOB is built on the publicly available IOHprofiler platform.
It provides access to the interactive IOHanalyzer module for performance analysis and visualization, and enables comparisons with the rich and growing data collection available for the (MA-)BBOB functions.
- Score: 0.8258451067861933
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Extending a recent suggestion to generate new instances for numerical
black-box optimization benchmarking by interpolating pairs of the
well-established BBOB functions from the COmparing COntinuous Optimizers (COCO)
platform, we propose in this work a further generalization that allows multiple
affine combinations of the original instances and arbitrarily chosen locations
of the global optima. We demonstrate that the MA-BBOB generator can help fill
the instance space, while overall patterns in algorithm performance are
preserved. By combining the landscape features of the problems with the
performance data, we pose the question of whether these features are as useful
for algorithm selection as previous studies suggested. MA-BBOB is built on the
publicly available IOHprofiler platform, which facilitates standardized
experimentation routines, provides access to the interactive IOHanalyzer module
for performance analysis and visualization, and enables comparisons with the
rich and growing data collection available for the (MA-)BBOB functions.
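
As a rough illustration of the construction, the sketch below forms an affine combination of two hand-rolled stand-ins for BBOB component functions, with Dirichlet-sampled weights and freely chosen optimum locations. This is a minimal sketch of the idea under simplified scaling assumptions, not the released generator, which applies the full BBOB instance transformations and ships with IOHprofiler.

```python
import numpy as np

def sphere(x):
    # Stand-in for BBOB f1 (Sphere).
    return np.sum(x ** 2)

def rastrigin(x):
    # Stand-in for BBOB f3 (Rastrigin).
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def many_affine(x, components, weights, optima):
    """Affine combination of component functions, each shifted so its
    optimum sits at a chosen location (a simplified sketch, not the
    exact MA-BBOB scaling)."""
    x = np.asarray(x, dtype=float)
    # Log-scale each component so functions of very different magnitude
    # contribute comparably, then mix with the affine weights.
    vals = [np.log1p(f(x - opt)) for f, opt in zip(components, optima)]
    return float(np.dot(weights, vals))

rng = np.random.default_rng(0)
dim = 5
weights = rng.dirichlet(np.ones(2))          # nonnegative weights summing to 1
optima = rng.uniform(-5, 5, size=(2, dim))   # arbitrary optimum locations
f = lambda x: many_affine(x, [sphere, rastrigin], weights, optima)
print(f(optima[0]))  # near-optimal for the first component
```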
Related papers
- Landscape-Aware Automated Algorithm Configuration using Multi-output Mixed Regression and Classification [0.01649298969786889]
We investigate the potential of randomly generated functions (RGF) for model training.
We focus on automated algorithm configuration (AAC).
We analyze the performance of dense neural network (NN) models in handling the mixed regression and classification tasks.
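A generic sketch of such a mixed-output model: a shared dense trunk with a regression head (predicted performance per algorithm) and a classification head (which algorithm to select), trained with a combined loss. The architecture, feature dimension, and targets here are illustrative assumptions, not the paper's exact setup.

```python
import torch
import torch.nn as nn

class MixedOutputNet(nn.Module):
    """Dense NN with a shared trunk and two heads: one regression head
    and one classification head (an illustrative sketch)."""
    def __init__(self, n_features, n_algorithms, hidden=64):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.reg_head = nn.Linear(hidden, n_algorithms)  # performance per algorithm
        self.cls_head = nn.Linear(hidden, n_algorithms)  # logits for the best algorithm

    def forward(self, x):
        h = self.trunk(x)
        return self.reg_head(h), self.cls_head(h)

model = MixedOutputNet(n_features=16, n_algorithms=8)
x = torch.randn(32, 16)              # batch of landscape-feature vectors
y_perf = torch.randn(32, 8)          # dummy performance targets
y_best = torch.randint(0, 8, (32,))  # dummy best-algorithm labels
pred_perf, logits = model(x)
loss = nn.functional.mse_loss(pred_perf, y_perf) + \
       nn.functional.cross_entropy(logits, y_best)
loss.backward()
```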
arXiv Detail & Related papers (2024-09-02T20:04:41Z) - Impact of Training Instance Selection on Automated Algorithm Selection Models for Numerical Black-box Optimization [0.40498500266986387]
We show that MA-BBOB-generated functions can be an ideal testbed for automated machine learning methods.
We analyze the potential gains from AAS by studying performance complementarity within a set of eight algorithms.
We show that simply using the BBOB component functions for training yields poor test performance.
arXiv Detail & Related papers (2024-04-11T08:03:53Z) - Reinforced In-Context Black-Box Optimization [64.25546325063272]
RIBBO is a method that uses reinforcement learning to learn a BBO algorithm from offline data in an end-to-end fashion.
RIBBO employs expressive sequence models to learn the optimization histories produced by multiple behavior algorithms and tasks.
Central to our method is to augment the optimization histories with regret-to-go tokens, which are designed to represent the performance of an algorithm based on cumulative regret over the future part of the histories.
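As a sketch of the token construction, assuming a benchmark setting where the optimal value is known, the regret-to-go at step t can be computed as the cumulative simple regret from step t to the end of the history; the exact definition used by RIBBO may differ.

```python
import numpy as np

def regret_to_go(history_values, f_opt):
    """Augment an optimization history with regret-to-go tokens:
    the cumulative regret accumulated from each step onward.
    A sketch of the idea; RIBBO's exact token definition may differ."""
    regrets = np.asarray(history_values) - f_opt     # per-step simple regret
    # Reverse cumulative sum: token t = sum of regrets from step t to the end.
    return np.cumsum(regrets[::-1])[::-1]

values = [5.0, 2.5, 1.0, 0.2]   # f(x) per step (minimization)
print(regret_to_go(values, f_opt=0.0))  # [8.7, 3.7, 1.2, 0.2]
```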
arXiv Detail & Related papers (2024-02-27T11:32:14Z) - Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z) - Simulation Based Bayesian Optimization [0.6526824510982799]
This paper introduces Simulation Based Bayesian Optimization (SBBO) as a novel approach to optimizing acquisition functions.
SBBO allows the use of surrogate models tailored for spaces with discrete variables.
We demonstrate empirically the effectiveness of the SBBO method using various choices of surrogate models.
arXiv Detail & Related papers (2024-01-19T16:56:11Z) - MA-BBOB: A Problem Generator for Black-Box Optimization Using Affine
Combinations and Shifts [1.2617078020344619]
We present the MA-BBOB function generator, which uses the BBOB suite as component functions in an affine combination.
We show a potential use-case of MA-BBOB in generating a wide set of training and testing data for algorithm selectors.
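A hypothetical sketch of that data-generation step: sample instance parameters (affine weights and optimum locations) with a seeded RNG and hold out a disjoint test pool for the algorithm selector. The parameterization below is illustrative; actual instance construction goes through the released IOHprofiler-based generator.

```python
import numpy as np

rng = np.random.default_rng(42)
n_instances, n_components, dim = 1000, 24, 5  # 24 BBOB component functions

# Each instance is defined by affine weights over the BBOB components
# plus a freely chosen location for the global optimum (illustrative
# parameterization; the released generator handles scaling internally).
weights = rng.dirichlet(np.ones(n_components), size=n_instances)
optima = rng.uniform(-5, 5, size=(n_instances, dim))

# Hold out a test pool so the selector never trains on its test instances.
split = int(0.8 * n_instances)
train = {"weights": weights[:split], "optima": optima[:split]}
test = {"weights": weights[split:], "optima": optima[split:]}
print(train["weights"].shape, test["weights"].shape)  # (800, 24) (200, 24)
```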
arXiv Detail & Related papers (2023-12-18T10:23:09Z) - Tree ensemble kernels for Bayesian optimization with known constraints
over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piece-wise constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
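One common heuristic for challenge (i) is to read uncertainty off the spread of the individual trees' predictions; the sketch below does this with scikit-learn as an illustrative stand-in, not the paper's kernel-based construction, which derives a GP-style posterior from the ensemble.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(200, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
X_new = rng.uniform(-5, 5, size=(10, 3))
# Per-tree predictions: their mean and spread give a cheap
# mean/uncertainty pair for an exploration-aware acquisition.
per_tree = np.stack([t.predict(X_new) for t in forest.estimators_])
mean, std = per_tree.mean(axis=0), per_tree.std(axis=0)
ucb = mean + 2.0 * std  # e.g., an upper-confidence-bound acquisition
```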
arXiv Detail & Related papers (2022-07-02T16:59:37Z) - A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem.
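A minimal sketch of that reduction, under common assumptions: label observed points by whether they beat a quantile threshold of the observed values, fit a weighted classifier, and use its predicted probability as the acquisition. The weighting rule below is illustrative, since LFBO derives its weights from the chosen utility.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.uniform(-5, 5, size=(100, 2))
y = np.sum(X ** 2, axis=1)  # minimization objective

# Label points as "good" if they fall below the tau-quantile of observed
# values; a weighted classifier then plays the role of the acquisition.
tau = np.quantile(y, 0.25)
labels = (y <= tau).astype(int)
# Illustrative weighting: emphasize good points by how far they beat tau.
sample_weight = np.where(labels == 1, 1.0 + (tau - y), 1.0)

clf = LogisticRegression().fit(X, labels, sample_weight=sample_weight)
candidates = rng.uniform(-5, 5, size=(1000, 2))
acq = clf.predict_proba(candidates)[:, 1]  # acquisition values
x_next = candidates[np.argmax(acq)]
```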
arXiv Detail & Related papers (2022-06-27T03:55:27Z) - Surrogate modeling for Bayesian optimization beyond a single Gaussian
process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To make function sampling scalable, a random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
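A sketch of the random feature ingredient: the standard random Fourier feature approximation of an RBF kernel, which reduces posterior function sampling (as in Thompson sampling) to a cheap Bayesian linear model. This shows only the scalability trick, not the full EGP-TS ensemble.

```python
import numpy as np

def rff_features(X, n_features=200, lengthscale=1.0, seed=0):
    """Random Fourier features approximating an RBF kernel:
    k(x, x') ~ phi(x) @ phi(x'). Standard construction (Rahimi & Recht),
    shown here as the scalability ingredient, not the full EGP-TS method."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_features)) / lengthscale
    b = rng.uniform(0, 2 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# With features phi, a GP posterior sample reduces to a Bayesian linear
# model: sample a weight vector once, then evaluate phi(x) @ w anywhere.
X = np.random.default_rng(2).uniform(-1, 1, size=(50, 3))
phi = rff_features(X)
print(phi.shape)  # (50, 200)
```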
arXiv Detail & Related papers (2022-05-27T16:43:10Z) - Preferential Batch Bayesian Optimization [16.141259199997005]
Preferential batch Bayesian optimization (PBBO) is a new framework that allows finding the optimum of a latent function of interest.
We show how the acquisitions developed under this framework generalize and augment previous approaches in Bayesian optimization.
arXiv Detail & Related papers (2020-03-25T14:59:15Z) - Stepwise Model Selection for Sequence Prediction via Deep Kernel
Learning [100.83444258562263]
We propose a novel Bayesian optimization (BO) algorithm to tackle the challenge of model selection in this setting.
In order to solve the resulting multiple black-box function optimization problem jointly and efficiently, we exploit potential correlations among black-box functions.
We are the first to formulate the problem of stepwise model selection (SMS) for sequence prediction, and to design and demonstrate an efficient joint-learning algorithm for this purpose.
arXiv Detail & Related papers (2020-01-12T09:42:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.