Think Global and Act Local: Bayesian Optimisation over High-Dimensional
Categorical and Mixed Search Spaces
- URL: http://arxiv.org/abs/2102.07188v1
- Date: Sun, 14 Feb 2021 16:18:36 GMT
- Title: Think Global and Act Local: Bayesian Optimisation over High-Dimensional
Categorical and Mixed Search Spaces
- Authors: Xingchen Wan, Vu Nguyen, Huong Ha, Binxin Ru, Cong Lu, Michael A.
Osborne
- Abstract summary: High-dimensional black-box optimisation remains an important yet notoriously challenging problem.
We propose a novel solution -- we combine local optimisation with a tailored kernel design, effectively handling high-dimensional categorical and mixed search spaces.
- Score: 26.08218231365666
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: High-dimensional black-box optimisation remains an important yet notoriously
challenging problem. Despite the success of Bayesian optimisation methods on
continuous domains, domains that are categorical, or that mix continuous and
categorical variables, remain challenging. We propose a novel solution -- we
combine local optimisation with a tailored kernel design, effectively handling
high-dimensional categorical and mixed search spaces, whilst retaining sample
efficiency. We further derive a convergence guarantee for the proposed approach.
Finally, we demonstrate empirically that our method outperforms the current
baselines on a variety of synthetic and real-world tasks in terms of
performance, computational costs, or both.
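The key ingredients (a kernel tailored to categorical and continuous inputs, plus a local search step over the categorical sub-space) can be illustrated with a minimal sketch. The combination rule, hyperparameters, and Hamming-ball neighbourhood below are illustrative assumptions for exposition, not the authors' released implementation.

```python
import numpy as np

# Illustrative sketch of a GP kernel over mixed categorical/continuous inputs:
# an overlap (Hamming-style) kernel on the categorical part and an RBF kernel
# on the continuous part, blended additively and multiplicatively. The weight
# `lam` and the lengthscale are assumed placeholder values.

def overlap_kernel(h1, h2):
    """Categorical similarity: fraction of positions with matching categories."""
    h1, h2 = np.asarray(h1), np.asarray(h2)
    return float(np.mean(h1 == h2))

def rbf_kernel(x1, x2, lengthscale=1.0):
    """Standard RBF kernel on the continuous dimensions."""
    x1, x2 = np.asarray(x1, dtype=float), np.asarray(x2, dtype=float)
    return float(np.exp(-0.5 * np.sum(((x1 - x2) / lengthscale) ** 2)))

def mixed_kernel(h1, x1, h2, x2, lam=0.5, lengthscale=1.0):
    """Blend of additive and multiplicative combinations of the two kernels."""
    k_h = overlap_kernel(h1, h2)
    k_x = rbf_kernel(x1, x2, lengthscale)
    return (1.0 - lam) * 0.5 * (k_h + k_x) + lam * k_h * k_x

def hamming_neighbours(h, n_categories):
    """Radius-1 Hamming ball around a categorical configuration: the candidate
    set a local (trust-region-style) search step would score with the surrogate."""
    neighbours = []
    for i, n_cat in enumerate(n_categories):
        for c in range(n_cat):
            if c != h[i]:
                cand = list(h)
                cand[i] = c
                neighbours.append(cand)
    return neighbours

if __name__ == "__main__":
    h_a, x_a = [0, 2, 1], [0.1, 0.4]
    h_b, x_b = [0, 1, 1], [0.2, 0.3]
    print("k_mixed(a, b) =", mixed_kernel(h_a, x_a, h_b, x_b))
    print("local candidates:", hamming_neighbours(h_a, n_categories=[3, 3, 2]))
```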
Related papers
- A survey and benchmark of high-dimensional Bayesian optimization of discrete sequences [12.248793682283964]
Optimizing discrete black-box functions is key in several domains, e.g. protein engineering and drug design.
We develop a unified framework to test a vast array of high-dimensional Bayesian optimization methods and a collection of standardized black-box functions.
These two components of the benchmark are each supported by flexible, scalable, and easily extendable software libraries.
arXiv Detail & Related papers (2024-06-07T08:39:40Z) - Localized Zeroth-Order Prompt Optimization [54.964765668688806]
We propose a novel algorithm, namely localized zeroth-order prompt optimization (ZOPO)
ZOPO incorporates a Neural Tangent Kernel-derived Gaussian process into standard zeroth-order optimization for an efficient search of well-performing local optima in prompt optimization.
Remarkably, ZOPO outperforms existing baselines in terms of both the optimization performance and the query efficiency.
arXiv Detail & Related papers (2024-03-05T14:18:15Z) - Learning Regions of Interest for Bayesian Optimization with Adaptive
Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
arXiv Detail & Related papers (2023-07-25T09:45:47Z) - SOBER: Highly Parallel Bayesian Optimization and Bayesian Quadrature
over Discrete and Mixed Spaces [6.573393706476156]
We present a novel diversified global optimisation quadrature with arbitrary kernels over discrete and mixed spaces.
Batch quadrature can efficiently solve both tasks by balancing the merits of exploitative Bayesian optimisation and explorative Bayesian quadrature.
We show that SOBER outperforms competitive baselines on efficient batch and scalable real-world tasks.
arXiv Detail & Related papers (2023-01-27T16:36:33Z) - Tree ensemble kernels for Bayesian optimization with known constraints
over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piece-wise constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
arXiv Detail & Related papers (2022-07-02T16:59:37Z) - Bayesian Optimisation for Mixed-Variable Inputs using Value Proposals [10.40799693791025]
Many real-world optimisation problems are defined over both categorical and continuous variables.
We adopt a holistic view and aim to consolidate optimisation of the categorical and continuous sub-spaces.
We show that this unified approach significantly outperforms existing mixed-variable optimisation approaches.
arXiv Detail & Related papers (2022-02-10T04:42:48Z) - Optimization on manifolds: A symplectic approach [127.54402681305629]
We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms are not only simple and efficient but also applicable to a broad range of contexts.
arXiv Detail & Related papers (2021-07-23T13:43:34Z) - Are we Forgetting about Compositional Optimisers in Bayesian
Optimisation? [66.39551991177542]
Bayesian optimisation offers a sample-efficient methodology for global optimisation.
Within this framework, a crucial performance-determining subroutine is maximising the acquisition function.
We highlight the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments.
arXiv Detail & Related papers (2020-12-15T12:18:38Z) - Adaptive Local Bayesian Optimization Over Multiple Discrete Variables [9.860437640748113]
This paper describes the approach of team KAIST OSI in a step-wise manner, which outperforms the baseline algorithms by up to +20.39%.
We combine Bayesian optimisation with a multi-armed bandit (MAB) approach to select values for the discrete variables while accounting for the variable types (a minimal sketch of such a hybrid selection rule appears after this list).
Empirical evaluations demonstrate that our method outperforms existing methods across different tasks.
arXiv Detail & Related papers (2020-12-07T07:51:23Z) - Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search
Spaces [63.22864716473051]
We propose a novel BO algorithm which expands (and shifts) the search space over iterations.
We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates.
arXiv Detail & Related papers (2020-09-05T14:24:40Z) - Upper Trust Bound Feasibility Criterion for Mixed Constrained Bayesian
Optimization with Application to Aircraft Design [41.74498230885008]
We adapt the so-called super efficient global optimization algorithm to solve mixed constrained problems more accurately.
We show the good potential of the approach on a set of numerical experiments.
arXiv Detail & Related papers (2020-05-11T12:59:09Z)
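For the Adaptive Local Bayesian Optimization entry above, a minimal sketch of the bandit half of the Bayesian/MAB combination is given below: a UCB selector proposes values for one categorical variable from observed rewards. The class name, exploration constant, and reward convention are illustrative assumptions; the continuous sub-space (handled by Bayesian optimisation in the paper) is omitted.

```python
import numpy as np

# Minimal sketch (assumed names and constants) of a UCB-style multi-armed
# bandit that picks the value of one categorical variable; one selector per
# categorical variable, with rewards taken from the black-box objective.

class UCBCategorySelector:
    def __init__(self, n_values, c=2.0):
        self.counts = np.zeros(n_values)   # number of times each value was tried
        self.means = np.zeros(n_values)    # running mean reward per value
        self.c = c                         # exploration weight (assumed value)
        self.t = 0                         # total number of selections

    def select(self):
        """Return the index of the categorical value to try next."""
        self.t += 1
        untried = np.where(self.counts == 0)[0]
        if untried.size:                   # try every value once first
            return int(untried[0])
        ucb = self.means + self.c * np.sqrt(np.log(self.t) / self.counts)
        return int(np.argmax(ucb))

    def update(self, value, reward):
        """Incorporate the observed reward for the chosen value."""
        self.counts[value] += 1
        self.means[value] += (reward - self.means[value]) / self.counts[value]

# Toy usage: a 4-way categorical variable with a made-up reward.
selector = UCBCategorySelector(n_values=4)
value = selector.select()
selector.update(value, reward=0.7)
```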
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.