FigBO: A Generalized Acquisition Function Framework with Look-Ahead Capability for Bayesian Optimization
- URL: http://arxiv.org/abs/2504.20307v1
- Date: Mon, 28 Apr 2025 23:35:17 GMT
- Title: FigBO: A Generalized Acquisition Function Framework with Look-Ahead Capability for Bayesian Optimization
- Authors: Hui Chen, Xuhui Fan, Zhangkai Wu, Longbing Cao
- Abstract summary: FigBO is a generalized acquisition function that incorporates the future impact of candidate points on global information gain. We show that FigBO achieves state-of-the-art performance and significantly faster convergence compared to existing methods.
- Score: 33.22407166798102
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian optimization is a powerful technique for optimizing expensive-to-evaluate black-box functions, consisting of two main components: a surrogate model and an acquisition function. In recent years, myopic acquisition functions have been widely adopted for their simplicity and effectiveness. However, their lack of look-ahead capability limits their performance. To address this limitation, we propose FigBO, a generalized acquisition function that incorporates the future impact of candidate points on global information gain. FigBO is a plug-and-play method that can integrate seamlessly with most existing myopic acquisition functions. Theoretically, we analyze the regret bound and convergence rate of FigBO when combined with the myopic base acquisition function expected improvement (EI), comparing them to those of standard EI. Empirically, extensive experimental results across diverse tasks demonstrate that FigBO achieves state-of-the-art performance and significantly faster convergence compared to existing methods.
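The abstract describes FigBO as a plug-and-play rescaling of a myopic acquisition function by a candidate's future impact on global information gain. A minimal sketch of that idea follows; the paper's exact weighting and information-gain term are not given in the abstract, so the `eta` coefficient, the reference pool `x_pool`, and the variance-reduction proxy are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(mu, sigma, y_best):
    """Standard myopic EI for minimisation."""
    z = (y_best - mu) / np.maximum(sigma, 1e-12)
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def figbo_style_acquisition(x, gp, y_best, x_pool, eta=1.0):
    """EI rescaled by a look-ahead term: how much observing x would shrink
    the GP's predictive variance over a reference pool. This is a crude
    proxy for global information gain, not the paper's exact formulation."""
    mu, sigma = gp.predict(x.reshape(1, -1), return_std=True)
    ei = expected_improvement(mu[0], sigma[0], y_best)
    # GP predictive variance does not depend on the observed y value, so we
    # can "fantasise" the new observation at the posterior mean.
    var_before = gp.predict(x_pool, return_std=True)[1] ** 2
    gp_next = GaussianProcessRegressor(kernel=gp.kernel_, optimizer=None).fit(
        np.vstack([gp.X_train_, x]), np.append(gp.y_train_, mu[0]))
    var_after = gp_next.predict(x_pool, return_std=True)[1] ** 2
    info_gain = float(np.mean(var_before - var_after))
    return ei * (1.0 + eta * info_gain)
```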
Related papers
- A Unified Framework for Entropy Search and Expected Improvement in Bayesian Optimization [6.745285924097932]
We introduce a unified theoretical framework, Variational Entropy Search, which reveals that EI and information-theoretic acquisition functions are more closely related than previously recognized. We propose VES-Gamma, a novel acquisition function that balances the strengths of EI and MES.
arXiv Detail & Related papers (2025-01-30T21:15:00Z) - Non-Myopic Multi-Objective Bayesian Optimization [64.31753000439514]
We consider the problem of finite-horizon sequential experimental design to solve multi-objective optimization problems. This problem arises in many real-world applications, including materials design. We propose the first set of non-myopic methods for MOO problems.
arXiv Detail & Related papers (2024-12-11T04:05:29Z) - Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce a utility function, predefined by each user, that describes the trade-off between the cost and performance of BO.
We validate our algorithm on various learning-curve (LC) datasets and find that it outperforms all the previous multi-fidelity BO and transfer-BO baselines we consider.
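As a toy illustration of the kind of user-defined utility the summary describes (the paper's actual parameterisation is not given here, so this form and `lam` are purely hypothetical):

```python
def user_utility(predicted_performance, cost, lam=0.1):
    """Hypothetical user-defined utility: reward predicted final performance,
    penalise evaluation cost; lam encodes the user's cost sensitivity."""
    return predicted_performance - lam * cost
```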
arXiv Detail & Related papers (2024-05-28T07:38:39Z) - Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model built on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z) - A General Framework for User-Guided Bayesian Optimization [51.96352579696041]
We propose ColaBO, the first Bayesian-principled framework for prior beliefs beyond the typical kernel structure.
We empirically demonstrate ColaBO's ability to substantially accelerate optimization when the prior information is accurate, and to retain approximately default performance when it is misleading.
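One way to realise "prior beliefs beyond the kernel" is to reweight Monte-Carlo acquisition estimates over posterior sample paths by how plausible each path is under the user's belief. The sketch below follows that assumption and is not ColaBO's exact estimator; `sample_paths` and `belief_weight` are hypothetical callables.

```python
import numpy as np

def belief_weighted_ei(x, sample_paths, belief_weight, y_best):
    """Monte-Carlo EI over GP sample paths, importance-weighted by a user
    belief (illustrative only; ColaBO's estimator differs in detail)."""
    vals = np.array([path(x) for path in sample_paths])        # f_i(x)
    w = np.array([belief_weight(path) for path in sample_paths])
    w = w / w.sum()                                            # normalise
    return float(np.sum(w * np.maximum(y_best - vals, 0.0)))   # weighted EI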
arXiv Detail & Related papers (2023-11-24T18:27:26Z) - BOtied: Multi-objective Bayesian optimization with tied multivariate ranks [33.414682601242006]
In this paper, we show a natural connection between non-dominated solutions and the extreme quantile of the joint cumulative distribution function.
Motivated by this link, we propose the Pareto-compliant CDF indicator and the associated acquisition function, BOtied.
Our experiments on a variety of synthetic and real-world problems demonstrate that BOtied outperforms state-of-the-art MOBO acquisition functions.
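The stated link can be illustrated with an empirical joint CDF: under minimisation, non-dominated points sit at the extreme (lowest) CDF levels. A minimal sketch of that observation follows; the BOtied acquisition itself is built on copulas and is more involved than this.

```python
import numpy as np

def empirical_joint_cdf(Y):
    """Y: (n, m) objective values (minimisation). F(y) is the fraction of
    points that are <= y in every objective; Pareto-optimal points attain
    the extreme (smallest) CDF levels."""
    return np.array([np.mean(np.all(Y <= y, axis=1)) for y in Y])
```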
arXiv Detail & Related papers (2023-06-01T04:50:06Z) - Bayesian Optimization for Function Compositions with Applications to Dynamic Pricing [0.0]
We propose a practical BO method of function compositions where the form of the composition is known but the constituent functions are expensive to evaluate.
We demonstrate a novel application to dynamic pricing in revenue management when the underlying demand function is expensive to evaluate.
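When f(x) = g(h(x)) with g known and h expensive, the acquisition can be estimated by pushing posterior samples of h through g. A minimal Monte-Carlo EI sketch under that setup, with h taken as scalar for simplicity (the paper's method may differ in its estimator):

```python
import numpy as np

def composite_ei(x, gp_h, g, y_best, n_samples=256, seed=0):
    """Monte-Carlo EI for f(x) = g(h(x)): sample h(x) from the GP posterior,
    push the samples through the known outer function g (minimisation)."""
    rng = np.random.default_rng(seed)
    mu, sigma = gp_h.predict(x.reshape(1, -1), return_std=True)
    h_samples = rng.normal(mu[0], sigma[0], size=n_samples)
    f_samples = g(h_samples)                        # known composition
    return float(np.mean(np.maximum(y_best - f_samples, 0.0)))
```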
arXiv Detail & Related papers (2023-03-21T15:45:06Z) - A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem.
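The stated reduction is concrete: for the EI utility, the acquisition can be read off a probabilistic classifier trained with improvement-weighted labels. A hedged sketch of that reduction; the quantile threshold and choice of classifier here are illustrative, not the paper's defaults.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def lfbo_ei_acquisition(X, y, quantile=0.66):
    """Weighted-classification stand-in for EI (maximisation): label points
    beating the tau-quantile as positive, weight positives by their margin."""
    tau = np.quantile(y, quantile)
    labels = (y > tau).astype(int)                 # assumes both classes occur
    weights = np.where(labels == 1, y - tau, 1.0)  # EI-style weights
    clf = GradientBoostingClassifier().fit(X, labels, sample_weight=weights)
    return lambda Xq: clf.predict_proba(Xq)[:, 1]  # probability = acquisition
```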
arXiv Detail & Related papers (2022-06-27T03:55:27Z) - $\pi$BO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization [40.30019289383378]
We propose $\pi$BO, an acquisition function generalization which incorporates prior beliefs about the location of the optimum.
In contrast to previous approaches, $\pi$BO is conceptually simple and can easily be integrated with existing libraries and many acquisition functions.
We also demonstrate that $\pi$BO improves on the state-of-the-art performance for a popular deep learning task, with a 12.5$\times$ time-to-accuracy speedup over prominent BO approaches.
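$\pi$BO's generalization is simple enough to state in code: multiply any base acquisition by the user's prior over the optimum's location, with the prior's influence decaying as data accumulates, following the paper's $\alpha(x)\,\pi(x)^{\beta/n}$ form. The wrapper interface and default $\beta$ below are assumptions.

```python
def pibo_acquisition(alpha, prior_pdf, beta=10.0):
    """Wrap a base acquisition alpha(x) with a user prior pi(x) over the
    optimum; the exponent beta/n shrinks the prior's influence as the
    number of evaluations n grows."""
    def acq(x, n):
        return alpha(x) * prior_pdf(x) ** (beta / max(n, 1))
    return acq
```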
arXiv Detail & Related papers (2022-04-23T11:07:13Z) - What Makes an Effective Scalarising Function for Multi-Objective Bayesian Optimisation? [6.4239565988811425]
Multi-objective Bayesian optimisation by scalarising the objectives avoids the computation of expensive integral-based acquisition functions.
The effectiveness of all the methodological improvements defined here is demonstrated on a real-world problem.
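One scalarising function commonly studied in this setting is the augmented Tchebycheff function, which turns a multi-objective problem into a single-objective one for a standard BO loop; a minimal version for minimisation (whether it is among the paper's candidates is not stated in this summary):

```python
import numpy as np

def augmented_tchebycheff(Y, weights, rho=0.05):
    """Scalarise (n, m) objective values with a weight vector so that a
    single-objective BO loop can be run on the result (minimisation)."""
    W = weights * Y                          # elementwise weighting
    return W.max(axis=1) + rho * W.sum(axis=1)
```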
arXiv Detail & Related papers (2021-04-10T15:18:58Z) - Are we Forgetting about Compositional Optimisers in Bayesian Optimisation? [66.39551991177542]
Bayesian optimisation presents a sample-efficient methodology for global optimisation.
Within this framework, a crucial performance-determining subroutine is the maximisation of the acquisition function.
We highlight the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments.
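The "performance-determining subroutine" in question is the inner maximisation of the acquisition function. Below is the standard multi-start gradient baseline for that step; the paper's contribution is a compositional reformulation of this subroutine, which this sketch does not implement.

```python
import numpy as np
from scipy.optimize import minimize

def maximise_acquisition(acq, bounds, n_restarts=16, seed=0):
    """Multi-start L-BFGS-B maximisation of a (differentiable) acquisition.
    bounds: sequence of (low, high) pairs, one per input dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    best_x, best_val = None, -np.inf
    for _ in range(n_restarts):
        x0 = rng.uniform(lo, hi)
        res = minimize(lambda x: -acq(x), x0, bounds=bounds, method="L-BFGS-B")
        if -res.fun > best_val:
            best_x, best_val = res.x, -res.fun
    return best_x, best_val
```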
arXiv Detail & Related papers (2020-12-15T12:18:38Z)