Preferential Batch Bayesian Optimization
- URL: http://arxiv.org/abs/2003.11435v3
- Date: Tue, 31 Aug 2021 18:02:18 GMT
- Title: Preferential Batch Bayesian Optimization
- Authors: Eero Siivola, Akash Kumar Dhaka, Michael Riis Andersen, Javier
Gonzalez, Pablo Garcia Moreno, and Aki Vehtari
- Abstract summary: Preferential batch Bayesian optimization (PBBO) is a new framework for finding the optimum of a latent function of interest.
We show how the acquisitions developed under this framework generalize and augment previous approaches in Bayesian optimization.
- Score: 16.141259199997005
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most research in Bayesian optimization (BO) has focused on \emph{direct
feedback} scenarios, where one has access to exact values of some
expensive-to-evaluate objective. This direction has been mainly driven by the
use of BO in machine learning hyper-parameter configuration problems. However,
in domains such as modelling human preferences, A/B tests, or recommender
systems, there is a need for methods that can replace direct feedback with
\emph{preferential feedback}, obtained via rankings or pairwise comparisons. In
this work, we present preferential batch Bayesian optimization (PBBO), a new
framework that allows finding the optimum of a latent function of interest,
given any type of parallel preferential feedback for a group of two or more
points. We do so by using a Gaussian process model with a likelihood specially
designed to enable parallel and efficient data collection mechanisms, which are
key in modern machine learning. We show how the acquisitions developed under
this framework generalize and augment previous approaches in Bayesian
optimization, expanding the use of these techniques to a wider range of
domains. An extensive simulation study shows the benefits of this approach,
both with simulated functions and four real data sets.
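The abstract describes a Gaussian process model whose likelihood is designed for batches of preferential feedback. The paper's exact batch likelihood is not reproduced here; as a hedged illustration only, the sketch below shows the standard pairwise probit preference likelihood that such models commonly build on, factorized over a toy ranking of a batch. All names and values are illustrative assumptions, not the paper's implementation.

```python
import math

def pairwise_probit_likelihood(f_i: float, f_j: float, noise_std: float = 1.0) -> float:
    """P(x_i preferred over x_j | latent values f) under a probit model:
    the comparison outcome is the sign of (f_i - f_j) corrupted by
    Gaussian noise on each latent value (hence the sqrt(2) factor)."""
    z = (f_i - f_j) / (math.sqrt(2.0) * noise_std)
    # Standard normal CDF via the error function (stdlib only).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Toy batch: hypothetical latent function values at three candidate points.
f = [0.3, -0.1, 0.8]

# Likelihood of observing the ranking x2 > x0 > x1, factorized into
# consecutive pairwise comparisons (a simplification of a full batch model).
ranking = [2, 0, 1]
lik = 1.0
for a, b in zip(ranking[:-1], ranking[1:]):
    lik *= pairwise_probit_likelihood(f[a], f[b])
```

Under this model a ranking consistent with the latent values receives higher likelihood than its reversal, which is what lets preferential feedback inform the posterior over the latent function.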
Related papers
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model built on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, as verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Simulation Based Bayesian Optimization [0.6526824510982799]
This paper introduces Simulation Based Bayesian Optimization (SBBO) as a novel approach to optimizing acquisition functions.
SBBO allows the use of surrogate models tailored for spaces with discrete variables.
We demonstrate empirically the effectiveness of the SBBO method using various choices of surrogate models.
arXiv Detail & Related papers (2024-01-19T16:56:11Z)
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
arXiv Detail & Related papers (2023-07-25T09:45:47Z)
- MA-BBOB: Many-Affine Combinations of BBOB Functions for Evaluating AutoML Approaches in Noiseless Numerical Black-Box Optimization Contexts [0.8258451067861933]
(MA-)BBOB is built on the publicly available IOHprofiler platform.
It provides access to the interactive IOHanalyzer module for performance analysis and visualization, and enables comparisons with the rich and growing data collection available for the (MA-)BBOB functions.
arXiv Detail & Related papers (2023-06-18T19:32:12Z)
- Pre-training helps Bayesian optimization too [49.28382118032923]
We seek an alternative practice for setting functional priors.
In particular, we consider the scenario where we have data from similar functions that allow us to pre-train a tighter distribution a priori.
Our results show that our method is able to locate good hyperparameters at least 3 times more efficiently than the best competing methods.
arXiv Detail & Related papers (2022-07-07T04:42:54Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- Towards Learning Universal Hyperparameter Optimizers with Transformers [57.35920571605559]
We introduce the OptFormer, the first text-based Transformer HPO framework that provides a universal end-to-end interface for jointly learning policy and function prediction.
Our experiments demonstrate that the OptFormer can imitate at least 7 different HPO algorithms, which can be further improved via its function uncertainty estimates.
arXiv Detail & Related papers (2022-05-26T12:51:32Z)
- Bayesian Optimization over Permutation Spaces [30.650753803587794]
We propose and evaluate two algorithms for BO over Permutation Spaces (BOPS).
We theoretically analyze the performance of BOPS-T to show that its regret grows sub-linearly.
Our experiments on multiple synthetic and real-world benchmarks show that both BOPS-T and BOPS-H perform better than the state-of-the-art BO algorithm for permutation spaces.
arXiv Detail & Related papers (2021-12-02T08:20:50Z)
- Incorporating Expert Prior in Bayesian Optimisation via Space Warping [54.412024556499254]
In large search spaces, the algorithm passes through several low-function-value regions before reaching the optimum of the function.
One approach to mitigating this cold-start phase is to use prior knowledge that can accelerate the optimisation.
In this paper, we represent the prior knowledge about the function optimum through a prior distribution.
The prior distribution is then used to warp the search space so that it expands around the high-probability region of the function optimum and shrinks around the low-probability regions.
arXiv Detail & Related papers (2020-03-27T06:18:49Z)
- Towards Automatic Bayesian Optimization: A first step involving acquisition functions [0.0]
Bayesian optimization is the state-of-the-art technique for the optimization of black-box functions, i.e., functions whose analytical expression is not accessible.
We propose a first attempt at automatic Bayesian optimization by exploring several techniques that automatically tune the acquisition function.
arXiv Detail & Related papers (2020-03-21T12:22:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.