BOOST: Bayesian Optimization with Optimal Kernel and Acquisition Function Selection Technique
- URL: http://arxiv.org/abs/2508.02332v2
- Date: Tue, 30 Sep 2025 11:33:27 GMT
- Title: BOOST: Bayesian Optimization with Optimal Kernel and Acquisition Function Selection Technique
- Authors: Joon-Hyun Park, Mujin Cheon, Jeongsu Wi, Dong-Yeun Koh
- Abstract summary: We propose BOOST, a novel framework that automates the selection of kernel-acquisition pairs. BOOST predicts the performance of various kernel-acquisition pairs and identifies the most promising pair before committing to expensive evaluations.
- Score: 1.0499611180329804
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The performance of Bayesian optimization (BO), a highly sample-efficient method for expensive black-box problems, is critically governed by the selection of its hyperparameters, including the kernel and acquisition functions. This presents a significant practical challenge: an inappropriate combination can lead to poor performance and wasted evaluations. While individual improvements to kernel functions (e.g., tree-based kernels, deep kernel learning) and acquisition functions (e.g., multi-step lookahead, tree-based planning) have been actively explored, the joint and autonomous selection of the best pair has been largely overlooked, forcing practitioners to rely on heuristics or costly manual tuning. We propose BOOST (Bayesian Optimization with Optimal Kernel and Acquisition Function Selection Technique), a novel framework that automates this selection. BOOST utilizes a lightweight, offline evaluation stage to predict the performance of various kernel-acquisition pairs and identify the most promising pair before committing to expensive evaluations. Using K-means clustering, BOOST first selects initial subsets from previously observed data-in-hand and prepares all possible kernel-acquisition pairs from user-chosen candidates. For each pair, BOOST conducts internal BO runs starting with the initial subset, evaluating how many iterations are required to find the target value within the remaining data, thereby identifying the pair with the best retrospective performance for future optimization. Experiments on synthetic benchmarks and real-world hyperparameter optimization tasks demonstrate that BOOST consistently outperforms standard BO with fixed hyperparameters and state-of-the-art adaptive methods, highlighting its effectiveness and robustness in diverse problem landscapes.
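The selection procedure the abstract describes is concrete enough to sketch. Below is a minimal Python sketch of the offline stage, assuming a pool-based inner loop over already-observed points; the helper names (`select_pair`, the EI/LCB definitions, the quantile target) are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of BOOST's offline kernel-acquisition selection stage:
# K-means picks a diverse initial subset, then each (kernel, acquisition)
# pair runs a retrospective BO loop over the remaining observed points,
# counting iterations needed to reach a target value.
import numpy as np
from scipy.stats import norm
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

def expected_improvement(mu, sigma, best):
    z = (best - mu) / np.maximum(sigma, 1e-9)   # minimization convention
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def lower_confidence_bound(mu, sigma, best, beta=2.0):
    return -(mu - beta * sigma)                 # larger is better

def select_pair(X, y, kernels, acquisitions, n_init=5, target_quantile=0.1):
    """Return the (kernel, acquisition) pair that reaches a target value
    in the fewest retrospective BO iterations on the data in hand."""
    km = KMeans(n_clusters=n_init, n_init=10).fit(X)
    init_idx = list({int(np.argmin(np.linalg.norm(X - c, axis=1)))
                     for c in km.cluster_centers_})
    target = np.quantile(y, target_quantile)    # value the inner run must reach
    best_pair, best_iters = None, np.inf
    for kernel in kernels:
        for acq in acquisitions:
            idx = list(init_idx)
            pool = [i for i in range(len(X)) if i not in idx]
            iters = 0
            while pool and y[idx].min() > target:
                gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
                gp.fit(X[idx], y[idx])
                mu, sigma = gp.predict(X[pool], return_std=True)
                nxt = pool[int(np.argmax(acq(mu, sigma, y[idx].min())))]
                idx.append(nxt); pool.remove(nxt); iters += 1
            if iters < best_iters:
                best_pair, best_iters = (kernel, acq), iters
    return best_pair
```

A call might look like `select_pair(X, y, kernels=[RBF(), Matern(nu=2.5)], acquisitions=[expected_improvement, lower_confidence_bound])`, after which the winning pair is used for the real, expensive optimization.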
Related papers
- None To Optima in Few Shots: Bayesian Optimization with MDP Priors [40.4319486959011]
We introduce the Procedure-inFormed BO (ProfBO) algorithm, which solves black-box optimization with remarkably few function evaluations. ProfBO consistently outperforms state-of-the-art methods by achieving high-quality tuning solutions with significantly fewer evaluations.
arXiv Detail & Related papers (2025-11-02T16:53:17Z)
- Relation-Aware Bayesian Optimization of DBMS Configurations Guided by Affinity Scores [2.474203056060563]
Database Management Systems (DBMSs) are fundamental for managing large-scale and heterogeneous data, and their performance is critically influenced by configuration parameters. Recent research has focused on automated configuration optimization using machine learning; however, existing approaches still exhibit several key limitations. We propose RelTune, a novel framework that represents parameter dependencies as a graph and learns GNN-based latent embeddings that encode performance-relevant semantics.
arXiv Detail & Related papers (2025-10-31T03:46:42Z)
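As a rough illustration of the RelTune summary above (the abstract does not detail the actual architecture or the affinity scoring), one graph-convolution-style layer over a parameter-dependency adjacency matrix could look like the following; every name and shape here is an assumption.

```python
# Generic sketch of one GNN-style message-passing layer over a parameter
# dependency graph; illustrative only, not RelTune's architecture.
import numpy as np

def gnn_layer(H, A, W):
    """Average each node's neighbor features, project, apply ReLU."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    return np.maximum(((A @ H) / deg) @ W, 0.0)

rng = np.random.default_rng(0)
n_params, d_in, d_out = 6, 8, 4
A = (rng.random((n_params, n_params)) < 0.3).astype(float)  # dependency edges
A = np.maximum(A, A.T)                                      # undirected graph
H = rng.normal(size=(n_params, d_in))                       # initial features
W = rng.normal(size=(d_in, d_out))
embeddings = gnn_layer(H, A, W)  # one latent embedding per parameter
```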
- Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce a utility function, predefined by each user, that describes the trade-off between the cost and the performance of BO.
We validate our algorithm on various learning-curve (LC) datasets and find that it outperforms all previous multi-fidelity BO and transfer-BO baselines we consider.
arXiv Detail & Related papers (2024-05-28T07:38:39Z)
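Since the abstract above leaves the utility's form to the user, here is a minimal sketch of one plausible choice; the linear cost penalty and the weight `lam` are assumptions for illustration, not the paper's definition.

```python
# Illustrative user-defined utility trading off performance against cost.
def utility(performance: float, cost: float, lam: float = 0.1) -> float:
    """Higher is better: reward performance, penalize accumulated cost."""
    return performance - lam * cost

# e.g. a cheap low-fidelity probe can beat a costly full run when the
# expected performance gain does not justify the extra cost:
print(utility(performance=0.90, cost=1.0), utility(performance=0.92, cost=5.0))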
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, PoPBO has lower computation costs and better robustness to noise, as verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- A General Framework for User-Guided Bayesian Optimization [51.96352579696041]
We propose ColaBO, the first Bayesian-principled framework for incorporating prior beliefs beyond the typical kernel structure.
We empirically demonstrate ColaBO's ability to substantially accelerate optimization when the prior information is accurate, and to retain approximately default performance when it is misleading.
arXiv Detail & Related papers (2023-11-24T18:27:26Z)
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
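In the usual decision-theoretic formulation that the entry above alludes to (written here in standard notation, which may differ from the paper's exact symbols), the entropy is a Bayes risk over an action set, and the acquisition is its expected one-step reduction:

```latex
% Decision-theoretic entropy (Bayes risk) over an action set A with loss l,
% and the induced expected-entropy-reduction acquisition function.
H_{\ell,\mathcal{A}}\big[p(\theta)\big]
  = \inf_{a \in \mathcal{A}} \mathbb{E}_{p(\theta)}\big[\ell(\theta, a)\big],
\qquad
\alpha(x) = H_{\ell,\mathcal{A}}\big[p(\theta \mid \mathcal{D})\big]
  - \mathbb{E}_{p(y \mid x, \mathcal{D})}
    \Big[ H_{\ell,\mathcal{A}}\big[p(\theta \mid \mathcal{D} \cup \{(x, y)\})\big] \Big].
```

Different choices of the loss ℓ and action set 𝒜 then recover the popular acquisition functions mentioned in the summary as special cases.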
- Pre-training helps Bayesian optimization too [49.28382118032923]
We seek an alternative practice for setting functional priors.
In particular, we consider the scenario where we have data from similar functions that allow us to pre-train a tighter distribution a priori.
Our results show that our method is able to locate good hyperparameters at least 3 times more efficiently than the best competing methods.
arXiv Detail & Related papers (2022-07-07T04:42:54Z)
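A minimal sketch of the pre-training idea from the entry above, assuming we simply pick the kernel length scale that maximizes the summed GP marginal likelihood across related tasks; the paper's actual prior-fitting procedure is not spelled out in the summary, so this is the generic shape of the idea.

```python
# Pre-train a GP prior on data from similar functions by grid-searching
# the length scale with the highest total marginal log-likelihood.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def pretrain_length_scale(tasks, candidates=np.logspace(-2, 1, 20)):
    """tasks: list of (X, y) pairs from similar functions. Returns the
    length scale with the highest total marginal log-likelihood."""
    def total_lml(ls):
        total = 0.0
        for X, y in tasks:
            gp = GaussianProcessRegressor(
                kernel=Matern(length_scale=ls, nu=2.5),
                optimizer=None, normalize_y=True).fit(X, y)
            total += gp.log_marginal_likelihood_value_
        return total
    return max(candidates, key=total_lml)
```

The returned length scale then seeds the GP used on the new target task, giving a tighter prior than the default.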
- Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piecewise-constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
arXiv Detail & Related papers (2022-07-02T16:59:37Z)
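Challenge (i) from the entry above has a common generic workaround worth sketching: use the spread across the ensemble's trees as an uncertainty proxy, SMAC-style. Note this is a baseline for the same problem, not the paper's tree-kernel construction.

```python
# Mean/std across a random forest's trees as a cheap surrogate with
# exploration; NOT the paper's tree-ensemble kernel, just a baseline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def forest_mean_std(forest, X):
    preds = np.stack([tree.predict(X) for tree in forest.estimators_])
    return preds.mean(axis=0), preds.std(axis=0)

rng = np.random.default_rng(0)
X = rng.uniform(size=(40, 3)); y = (X ** 2).sum(axis=1)
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
candidates = rng.uniform(size=(200, 3))
mu, sd = forest_mean_std(forest, candidates)
best_next = candidates[np.argmin(mu - 2.0 * sd)]  # LCB, minimization
```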
- A Comparative study of Hyper-Parameter Optimization Tools [2.6097538974670935]
We compare the performance of four Python libraries, namely Optuna, Hyperopt, Optunity, and sequential model-based algorithm configuration (SMAC).
We found that Optuna performs better on the CASH problem and the NeurIPS black-box optimization challenge.
arXiv Detail & Related papers (2022-01-17T14:49:36Z)
- Bayesian Optimization over Permutation Spaces [30.650753803587794]
We propose and evaluate two algorithms for BO over Permutation Spaces (BOPS).
We theoretically analyze the performance of BOPS-T to show that its regret grows sub-linearly.
Our experiments on multiple synthetic and real-world benchmarks show that both BOPS-T and BOPS-H perform better than the state-of-the-art BO algorithm for combinatorial spaces.
arXiv Detail & Related papers (2021-12-02T08:20:50Z)
- Bayesian Optimization for auto-tuning GPU kernels [0.0]
Finding optimal parameter configurations for GPU kernels is a non-trivial exercise for large search spaces, even when automated.
We introduce a novel contextual exploration factor as well as new acquisition functions with improved scalability, combined with an informed function selection mechanism.
arXiv Detail & Related papers (2021-11-26T11:26:26Z)
- Pre-trained Gaussian Processes for Bayesian Optimization [24.730678780782647]
We propose a new pre-training based BO framework named HyperBO.
We show bounded posterior predictions and near-zero regrets for HyperBO without assuming the "ground truth" GP prior is known.
arXiv Detail & Related papers (2021-09-16T20:46:26Z)
- Revisiting Bayesian Optimization in the light of the COCO benchmark [1.4467794332678539]
This article reports a large investigation of how common and less common design choices affect the performance of (Gaussian process based) BO.
The code developed for this study makes the new version (v2.1.1) of the R package DiceOptim available on CRAN.
arXiv Detail & Related papers (2021-03-30T19:45:18Z)
- BOSH: Bayesian Optimization by Sampling Hierarchically [10.10241176664951]
We propose a novel BO routine pairing a hierarchical Gaussian process with an information-theoretic framework to generate a growing pool of realizations.
We demonstrate that BOSH provides more efficient and higher-precision optimization than standard BO across synthetic benchmarks, simulation optimization, reinforcement learning and hyperparameter tuning tasks.
arXiv Detail & Related papers (2020-07-02T07:35:49Z)
- Incorporating Expert Prior in Bayesian Optimisation via Space Warping [54.412024556499254]
In large search spaces, the algorithm passes through several low-function-value regions before reaching the optimum of the function.
One approach to shortening this cold-start phase is to use prior knowledge that can accelerate the optimisation.
In this paper, we represent the prior knowledge about the function optimum through a prior distribution.
The prior distribution is then used to warp the search space so that it expands around high-probability regions of the function optimum and shrinks around low-probability regions.
arXiv Detail & Related papers (2020-03-27T06:18:49Z)
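The warping in the entry above has a standard concrete instantiation worth sketching: let BO work in a unit interval and map proposals back through the prior's inverse CDF, so resolution concentrates where the prior puts mass. The truncated-Gaussian prior below is an assumption for illustration, not the paper's specific construction.

```python
# Space warping with a prior over the optimum's location: BO proposes u
# in [0, 1]; the prior's inverse CDF maps u back to x, so the grid
# expands near high-prior-probability regions and shrinks elsewhere.
import numpy as np
from scipy.stats import truncnorm

def make_warp(mu, sigma, lo, hi):
    a, b = (lo - mu) / sigma, (hi - mu) / sigma
    prior = truncnorm(a, b, loc=mu, scale=sigma)
    return prior.cdf, prior.ppf   # warp: x -> u, unwarp: u -> x

warp, unwarp = make_warp(mu=0.8, sigma=0.1, lo=0.0, hi=1.0)
u_grid = np.linspace(0.05, 0.95, 7)        # uniform proposals in u-space
print(np.round(unwarp(u_grid), 3))          # x-points cluster around 0.8
```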
- Towards Automatic Bayesian Optimization: A first step involving acquisition functions [0.0]
Bayesian optimization is the state-of-the-art technique for optimizing black boxes, i.e., functions whose analytical expression is not accessible.
We propose a first attempt at automatic Bayesian optimization by exploring several techniques that automatically tune the acquisition function.
arXiv Detail & Related papers (2020-03-21T12:22:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.