A General Framework for User-Guided Bayesian Optimization
- URL: http://arxiv.org/abs/2311.14645v2
- Date: Sat, 17 Feb 2024 19:06:40 GMT
- Title: A General Framework for User-Guided Bayesian Optimization
- Authors: Carl Hvarfner and Frank Hutter and Luigi Nardi
- Abstract summary: We propose ColaBO, the first Bayesian-principled framework for prior beliefs beyond the typical kernel structure.
We empirically demonstrate ColaBO's ability to substantially accelerate optimization when the prior information is accurate, and to retain approximately default performance when it is misleading.
- Score: 51.96352579696041
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The optimization of expensive-to-evaluate black-box functions is prevalent in
various scientific disciplines. Bayesian optimization is an automatic, general
and sample-efficient method to solve these problems with minimal knowledge of
the underlying function dynamics. However, the ability of Bayesian optimization
to incorporate prior knowledge or beliefs about the function at hand in order
to accelerate the optimization is limited, which reduces its appeal for
knowledgeable practitioners with tight budgets. To allow domain experts to
customize the optimization routine, we propose ColaBO, the first
Bayesian-principled framework for incorporating prior beliefs beyond the
typical kernel structure, such as the likely location of the optimizer or the
optimal value. The generality of ColaBO makes it applicable across different
Monte Carlo acquisition functions and types of user beliefs. We empirically
demonstrate ColaBO's ability to substantially accelerate optimization when the
prior information is accurate, and to retain approximately default performance
when it is misleading.
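The core ColaBO idea described in the abstract, folding a user prior over the optimizer's location into a Monte Carlo acquisition function, can be pictured with a minimal sketch: draw sample paths from the GP posterior and re-weight each path by the prior probability of its argmax. All names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def prior_weighted_mc_acquisition(grid, mu, cov, user_prior_pdf, best_y,
                                  n_paths=256, seed=0):
    """Monte Carlo expected improvement in which GP sample paths are
    re-weighted by a user prior over the optimizer's location (a sketch of
    the ColaBO idea, not the paper's exact method)."""
    rng = np.random.default_rng(seed)
    # Draw sample paths from the GP posterior on a finite grid.
    paths = rng.multivariate_normal(mu, cov, size=n_paths)  # (n_paths, n_grid)
    # Weight each path by the user prior evaluated at that path's argmax.
    argmax_idx = paths.argmax(axis=1)
    w = user_prior_pdf(grid[argmax_idx])
    w = w / w.sum()
    # Weighted Monte Carlo expected improvement at every grid point.
    improvement = np.maximum(paths - best_y, 0.0)           # (n_paths, n_grid)
    return (w[:, None] * improvement).sum(axis=0)
```

With an accurate prior, paths whose optimum lies where the user expects it dominate the average; with a flat prior the weights are uniform and the quantity reduces to plain Monte Carlo expected improvement.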
Related papers
- Cost-aware Bayesian Optimization via the Pandora's Box Gittins Index [57.045952766988925]
We develop a previously-unexplored connection between cost-aware Bayesian optimization and the Pandora's Box problem, a decision problem from economics.
Our work constitutes a first step towards integrating techniques from Gittins index theory into Bayesian optimization.
arXiv Detail & Related papers (2024-06-28T17:20:13Z) - Enhanced Bayesian Optimization via Preferential Modeling of Abstract Properties [49.351577714596544]
We propose a human-AI collaborative Bayesian framework to incorporate expert preferences about unmeasured abstract properties into surrogate modeling.
We provide an efficient strategy that can also handle any incorrect/misleading expert bias in preferential judgments.
arXiv Detail & Related papers (2024-02-27T09:23:13Z) - Optimistic Optimization of Gaussian Process Samples [30.226274682578172]
A competing, computationally more efficient, global optimization framework is optimistic optimization, which exploits prior knowledge about the geometry of the search space in form of a dissimilarity function.
We argue that there is a new research domain between geometric and probabilistic search, i.e. methods that run drastically faster than traditional Bayesian optimization, while retaining some of the crucial functionality of Bayesian optimization.
arXiv Detail & Related papers (2022-09-02T09:06:24Z) - Pre-training helps Bayesian optimization too [49.28382118032923]
We seek an alternative practice for setting functional priors.
In particular, we consider the scenario where we have data from similar functions that allow us to pre-train a tighter distribution a priori.
Our results show that our method is able to locate good hyperparameters at least 3 times more efficiently than the best competing methods.
arXiv Detail & Related papers (2022-07-07T04:42:54Z) - Are we Forgetting about Compositional Optimisers in Bayesian Optimisation? [66.39551991177542]
This paper studies Bayesian optimisation, a sample-efficient methodology for global optimisation.
Within it, a crucial, performance-determining subroutine is the maximisation of the acquisition function.
We highlight the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments.
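As a baseline for the acquisition-maximisation subroutine discussed above, a generic multi-start random-search maximiser can be sketched as follows. The paper's compositional optimisers exploit the acquisition's nested structure and are not reproduced here; every name and parameter below is illustrative.

```python
import numpy as np

def maximise_acquisition(acq, low, high, dim, n_restarts=8, n_steps=100, seed=0):
    """Multi-start random-search maximisation of an acquisition function:
    from several random starting points, propose Gaussian perturbations and
    keep them when they improve the acquisition value."""
    rng = np.random.default_rng(seed)
    best_x, best_v = None, -np.inf
    for _ in range(n_restarts):
        x = rng.uniform(low, high, size=dim)
        v = acq(x)
        step = (high - low) * 0.1
        for _ in range(n_steps):
            cand = np.clip(x + rng.normal(0.0, step, size=dim), low, high)
            cv = acq(cand)
            if cv > v:
                x, v = cand, cv
            else:
                step *= 0.97  # shrink the perturbation scale after a failure
        if v > best_v:
            best_x, best_v = x, v
    return best_x, best_v
```

Gradient-based multi-start schemes play the same role in practice; this random-search variant only requires acquisition evaluations.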
arXiv Detail & Related papers (2020-12-15T12:18:38Z) - Bayesian Optimization with a Prior for the Optimum [41.41323474440455]
We introduce Bayesian Optimization with a Prior for the Optimum (BOPrO)
BOPrO allows users to inject their knowledge into the optimization process in the form of priors about which parts of the input space will yield the best performance.
We show that BOPrO is around 6.67x faster than state-of-the-art methods on a common suite of benchmarks.
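One way to picture the prior injection described above is a pseudo-posterior in which the user prior multiplies a model-based density ratio, with the prior's influence decaying as iterations accumulate. The decay schedule and all names below are illustrative assumptions, not BOPrO's exact formulation.

```python
import numpy as np

def prior_decayed_pseudo_posterior(prior_pdf, model_good_pdf, model_bad_pdf,
                                   x, t, beta=10.0):
    """Sketch of a BOPrO-style acquisition: the user prior over good regions
    is combined with a model-based good/bad density ratio, and the exponent
    beta / t shrinks the prior's influence as iteration t grows."""
    prior_weight = prior_pdf(x) ** (beta / t)
    return prior_weight * model_good_pdf(x) / np.maximum(model_bad_pdf(x), 1e-12)
```

Early on (small t) the user prior dominates; late in the run the data-driven density ratio takes over, which is one way a method can stay robust to a misleading prior.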
arXiv Detail & Related papers (2020-06-25T17:49:24Z) - Incorporating Expert Prior in Bayesian Optimisation via Space Warping [54.412024556499254]
In large search spaces, the algorithm passes through several low-function-value regions before reaching the optimum.
One approach to shortening this cold-start phase is to use prior knowledge that can accelerate the optimisation.
In this paper, we represent the prior knowledge about the function optimum through a prior distribution.
The prior distribution is then used to warp the search space so that it expands around the high-probability region of the function optimum and shrinks around the low-probability regions.
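The warping described above can be sketched with a closed-form prior: optimise in unit-interval coordinates and map each point through the inverse CDF of a truncated Gaussian prior over the optimum, so that equal volumes of the warped space cover more of the high-probability region. The Gaussian prior and function names here are illustrative assumptions, not the paper's construction.

```python
from statistics import NormalDist

def warp_to_search_space(u, prior_mean, prior_std, low, high):
    """Map u in [0, 1] to [low, high] through the inverse CDF of a Gaussian
    prior over the optimum, truncated to the search bounds. Regions of high
    prior probability receive a larger share of the warped space."""
    d = NormalDist(prior_mean, prior_std)
    a, b = d.cdf(low), d.cdf(high)  # truncation keeps outputs inside bounds
    return d.inv_cdf(a + u * (b - a))
```

An optimiser sampling uniformly in u then implicitly spends more of its budget near prior_mean, which is the intended cold-start acceleration.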
arXiv Detail & Related papers (2020-03-27T06:18:49Z) - Composition of kernel and acquisition functions for High Dimensional Bayesian Optimization [0.1749935196721634]
We use the additivity of the objective function in designing both the kernel and the acquisition function of the Bayesian Optimization.
This approach makes the learning/updating of the probabilistic surrogate model more efficient.
Results are presented for real-life application, that is the control of pumps in urban water distribution systems.
arXiv Detail & Related papers (2020-03-09T15:45:57Z) - Scalable Constrained Bayesian Optimization [10.820024633762596]
The global optimization of a high-dimensional black-box function under black-box constraints is a pervasive task in machine learning, control, and the scientific community.
We propose the scalable constrained Bayesian optimization (SCBO) algorithm that overcomes the above challenges and pushes the state of the art.
arXiv Detail & Related papers (2020-02-20T01:48:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.