CobBO: Coordinate Backoff Bayesian Optimization
- URL: http://arxiv.org/abs/2101.05147v2
- Date: Tue, 16 Feb 2021 10:18:57 GMT
- Title: CobBO: Coordinate Backoff Bayesian Optimization
- Authors: Jian Tan, Niv Nayman, Mengchang Wang, Feifei Li, Rong Jin
- Abstract summary: We introduce Coordinate Backoff Bayesian Optimization (CobBO) to capture a smooth approximation of the global landscape.
CobBO finds solutions comparable to or better than other state-of-the-art methods for dimensions ranging from tens to hundreds.
- Score: 45.53400129323848
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian optimization is a popular method for optimizing expensive black-box
functions. The objective functions of hard real-world problems are often
characterized by a fluctuating landscape with many local optima. Bayesian
optimization risks over-exploiting such traps, leaving an insufficient query
budget for exploring the global landscape. We introduce Coordinate Backoff
Bayesian Optimization (CobBO) to alleviate these challenges. CobBO captures a
smooth approximation of the global landscape by interpolating the values of
queried points projected to randomly selected promising subspaces. A smaller
query budget thus suffices for the Gaussian process regressions applied over
the lower-dimensional subspaces. This approach can be viewed as a variant of
coordinate ascent tailored for Bayesian optimization, using a stopping rule
for backing off from a given subspace and switching to another coordinate
subset. Extensive evaluations show that CobBO finds solutions comparable to or
better than other state-of-the-art methods for dimensions ranging from tens to
hundreds, while reducing the trial complexity.
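
To make the loop in the abstract concrete, below is a minimal Python sketch: a Gaussian process is fit only on the coordinates of a randomly chosen subspace, a simple UCB acquisition is optimized there with the remaining coordinates pinned to the incumbent, and a stopping rule triggers a backoff to a fresh subspace. The subspace size, the patience rule, the UCB form, and the use of scikit-learn are illustrative assumptions, not the paper's implementation; in particular, CobBO's interpolation of projected point values is only crudely approximated here by fitting the GP to projected coordinates.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def cobbo_style_loop(f, lo, hi, n_iters=100, sub_dim=5, patience=5, seed=0):
    """Toy coordinate-backoff loop (maximization): fit a GP on a random
    coordinate subspace, query its UCB maximizer, and switch ("back off")
    to a new subspace when progress stalls. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    X = rng.uniform(lo, hi, size=(5, dim))              # small initial design
    y = np.array([f(x) for x in X])
    best = X[np.argmax(y)].copy()
    coords = rng.choice(dim, size=sub_dim, replace=False)
    stall = 0
    for _ in range(n_iters):
        if stall >= patience:                           # stopping rule: back off
            coords = rng.choice(dim, size=sub_dim, replace=False)
            stall = 0
        # GP over queried points projected onto the chosen subspace
        gp = GaussianProcessRegressor(normalize_y=True).fit(X[:, coords], y)
        cand = np.tile(best, (256, 1))                  # pin other coordinates
        cand[:, coords] = rng.uniform(lo[coords], hi[coords], (256, sub_dim))
        mu, sd = gp.predict(cand[:, coords], return_std=True)
        x_next = cand[np.argmax(mu + 1.96 * sd)]        # simple UCB acquisition
        y_next = f(x_next)
        if y_next > y.max():
            best, stall = x_next.copy(), 0
        else:
            stall += 1
        X, y = np.vstack([X, x_next]), np.append(y, y_next)
    return best, y.max()
```

Because the GP regression only ever sees `sub_dim` coordinates, each fit stays cheap even when `dim` is in the hundreds, which is the query-budget saving the abstract points to.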
Related papers
- High-dimensional Bayesian Optimization via Covariance Matrix Adaptation Strategy [16.521207412129833]
We propose a novel technique for defining the local regions using the Covariance Matrix Adaptation (CMA) strategy.
Based on this search distribution, we then define the local regions consisting of data points with high probabilities of being the global optimum.
Our approach serves as a meta-algorithm as it can incorporate existing black-box BOs, such as BO, TuRBO, and BAxUS, to find the global optimum.
arXiv Detail & Related papers (2024-02-05T15:32:10Z)
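
A minimal sketch of how a CMA-style search distribution can induce such a local region, assuming the distribution is estimated from the best observed points and the region is the distribution's high-probability ellipsoid; the real CMA update rules and the paper's meta-algorithm are more involved.

```python
import numpy as np
from scipy.stats import chi2

def cma_style_region(X, y, top_frac=0.25, level=0.9):
    """Fit a Gaussian search distribution to elite points and return a
    membership test for the implied local region (illustrative sketch)."""
    k = max(2, int(top_frac * len(X)))
    elite = X[np.argsort(y)[-k:]]                  # best-k points (maximization)
    mean = elite.mean(axis=0)
    cov = np.cov(elite, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    prec = np.linalg.inv(cov)
    cutoff = chi2.ppf(level, df=X.shape[1])        # high-probability ellipsoid
    def in_region(x):
        d = x - mean
        return float(d @ prec @ d) <= cutoff       # Mahalanobis distance test
    return mean, cov, in_region
```

Any off-the-shelf BO routine could then be restricted to candidates passing `in_region`, matching the meta-algorithm framing above.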
- LABCAT: Locally adaptive Bayesian optimization using principal-component-aligned trust regions [0.0]
We propose the LABCAT algorithm, which extends trust-region-based BO with principal-component-aligned trust regions.
We show that the algorithm outperforms several state-of-the-art BO and other black-box optimization algorithms.
arXiv Detail & Related papers (2023-11-19T13:56:24Z)
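
One way to picture a principal-component-aligned trust region, assuming PCA over the best observed points and a fixed-radius box in the rotated frame; this is a sketch, not the authors' exact construction.

```python
import numpy as np

def pca_aligned_trust_region(X, y, center, radius=0.2, top_frac=0.5):
    """Build a sampler for a trust region whose axes follow the principal
    components of the elite observed points (illustrative sketch;
    clipping to the global bounds is omitted)."""
    k = max(2, int(top_frac * len(X)))
    elite = X[np.argsort(y)[-k:]]                  # best-k points (maximization)
    _, _, Vt = np.linalg.svd(elite - elite.mean(axis=0), full_matrices=False)
    def sample(rng, n=64):
        # draw a box in the rotated frame, then map back to input space
        u = rng.uniform(-radius, radius, size=(n, Vt.shape[0]))
        return center + u @ Vt
    return sample
```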
- Advancing Bayesian Optimization via Learning Correlated Latent Space [15.783344085533187]
We propose Correlated latent space Bayesian Optimization (CoBO), which focuses on learning correlated latent spaces.
Specifically, our method introduces Lipschitz regularization, loss weighting, and trust region recoordination to minimize the inherent gap around the promising areas.
We demonstrate the effectiveness of our approach on several optimization tasks in discrete data, such as molecule design and arithmetic expression fitting.
arXiv Detail & Related papers (2023-10-31T08:24:41Z)
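
The Lipschitz regularization mentioned above can be pictured as a penalty that discourages large objective gaps between nearby latent codes. A minimal NumPy sketch follows; the pairwise hinge form and the constant c are assumptions, and CoBO applies such a term while training the latent space rather than post hoc.

```python
import numpy as np

def lipschitz_penalty(z, y, c=1.0):
    """Penalize violations of |y_i - y_j| <= c * ||z_i - z_j|| over all
    pairs of latent codes z with objective values y (illustrative)."""
    dz = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)  # (n, n)
    dy = np.abs(y[:, None] - y[None, :])                         # (n, n)
    return np.maximum(dy - c * dz, 0.0).mean()                   # hinge on gaps
```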
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
arXiv Detail & Related papers (2023-07-25T09:45:47Z)
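
The adaptive filtering above admits a compact sketch: keep only the candidates whose GP upper confidence bound still clears the best lower confidence bound, so that only plausible maximizers survive. The beta constant and the finite candidate set are assumptions; BALLET's actual level-set estimator is more refined.

```python
import numpy as np

def region_of_interest(gp, candidates, beta=2.0):
    """Filter candidates to a high-confidence superlevel set (sketch).

    gp: any fitted regressor exposing predict(X, return_std=True),
    e.g. a scikit-learn GaussianProcessRegressor."""
    mu, sd = gp.predict(candidates, return_std=True)
    ucb, lcb = mu + beta * sd, mu - beta * sd
    return candidates[ucb >= lcb.max()]            # plausible-maximizer region
```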
- SOBER: Highly Parallel Bayesian Optimization and Bayesian Quadrature over Discrete and Mixed Spaces [6.573393706476156]
We present a novel diversified global optimisation quadrature with arbitrary kernels over discrete and mixed spaces.
Batch quadrature can efficiently solve both tasks by balancing the merits of exploitative Bayesian optimisation and explorative Bayesian quadrature.
We show that SOBER outperforms competitive baselines on efficient batch and scalable real-world tasks.
arXiv Detail & Related papers (2023-01-27T16:36:33Z)
- Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real-world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z)
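
A Monte Carlo sketch of the mutual-information acquisition described above: draw joint posterior samples, record the algorithm's output on each sample, and score a candidate query by how much its sampled value distribution narrows once the output is known. Taking the "algorithm output" to be the argmax location on a finite grid, and using a Gaussian-entropy approximation, are both assumptions; the paper's estimators are more general.

```python
import numpy as np

def infobax_style_acquisition(gp, X_cand, grid, n_samples=128):
    """Pick the candidate maximizing an estimate of I(y(x); output),
    where output = argmax of f on a grid (illustrative sketch).

    gp: fitted scikit-learn GaussianProcessRegressor."""
    pts = np.vstack([grid, X_cand])
    F = gp.sample_y(pts, n_samples=n_samples, random_state=0)   # joint samples
    Fg, Yc = F[: len(grid)], F[len(grid):]
    outputs = Fg.argmax(axis=0)                    # algorithm output per sample
    jitter = 1e-9                                  # guards degenerate groups
    marg_H = 0.5 * np.log(Yc.var(axis=1) + jitter) # Gaussian entropy, consts dropped
    cond_H = np.zeros(len(X_cand))
    for o in np.unique(outputs):
        m = outputs == o
        cond_H += m.mean() * 0.5 * np.log(Yc[:, m].var(axis=1) + jitter)
    return X_cand[np.argmax(marg_H - cond_H)]      # maximize mutual information
```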
- Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces [63.22864716473051]
We propose a novel BO algorithm which expands (and shifts) the search space over iterations.
We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates.
arXiv Detail & Related papers (2020-09-05T14:24:40Z)
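
The expand-and-shift idea has a very small core, sketched here: grow the search box geometrically each round and re-center it on the incumbent. The growth rate is an assumption; the paper ties the actual schedule to its regret analysis.

```python
def expand_and_shift(lo, hi, incumbent, rate=1.1):
    """Grow the search box geometrically and re-center it on the incumbent
    best point (illustrative sketch; lo, hi, incumbent are NumPy arrays
    of equal shape)."""
    half = 0.5 * (hi - lo) * rate
    return incumbent - half, incumbent + half
```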
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation for Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper subproblems for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
- Incorporating Expert Prior in Bayesian Optimisation via Space Warping [54.412024556499254]
In big search spaces, the algorithm goes through several low-function-value regions before reaching the optimum of the function.
One approach to shortening this cold-start phase is to use prior knowledge that can accelerate the optimisation.
In this paper, we represent the prior knowledge about the function optimum through a prior distribution.
The prior distribution is then used to warp the search space so that it expands around the high-probability region of the function optimum and shrinks around the low-probability region.
arXiv Detail & Related papers (2020-03-27T06:18:49Z)
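
The warping itself can be sketched with an inverse-CDF map: pushing unit-cube samples through the (truncated) inverse CDF of a Gaussian prior over the optimum stretches resolution where the prior is high and compresses it elsewhere, which is the expand/shrink behaviour described above. The Gaussian prior and the independent per-coordinate treatment are assumptions.

```python
import numpy as np
from scipy.stats import norm

def warp(u, mu, sigma, lo, hi):
    """Map unit-cube coordinates u into [lo, hi] via the inverse CDF of a
    Gaussian prior N(mu, sigma^2) truncated to the box (sketch)."""
    a, b = norm.cdf(lo, mu, sigma), norm.cdf(hi, mu, sigma)  # box edge probs
    return norm.ppf(a + u * (b - a), mu, sigma)              # stays in [lo, hi]

# e.g. warp(np.random.default_rng(0).uniform(size=3),
#           mu=np.zeros(3), sigma=np.ones(3), lo=-np.ones(3), hi=np.ones(3))
```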
- Scalable Constrained Bayesian Optimization [10.820024633762596]
The global optimization of a high-dimensional black-box function under black-box constraints is a pervasive task in machine learning, control, and the scientific community.
We propose the scalable constrained Bayesian optimization (SCBO) algorithm that overcomes the above challenges and pushes the state of the art.
arXiv Detail & Related papers (2020-02-20T01:48:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.