High Dimensional Bayesian Optimization with Kernel Principal Component
Analysis
- URL: http://arxiv.org/abs/2204.13753v1
- Date: Thu, 28 Apr 2022 20:09:02 GMT
- Title: High Dimensional Bayesian Optimization with Kernel Principal Component
Analysis
- Authors: Kirill Antonov, Elena Raponi, Hao Wang, Carola Doerr
- Abstract summary: The kernel PCA-assisted BO (KPCA-BO) algorithm embeds a non-linear sub-manifold in the search space and performs BO on this sub-manifold.
We compare the performance of KPCA-BO to vanilla BO and PCA-BO on the multi-modal problems of the COCO/BBOB benchmark suite.
- Score: 4.33419118449588
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian Optimization (BO) is a surrogate-based global optimization strategy
that relies on a Gaussian Process regression (GPR) model to approximate the
objective function and an acquisition function to suggest candidate points. It
is well known that BO does not scale well to high-dimensional problems, because
the GPR model requires substantially more data points to achieve sufficient
accuracy and because optimizing the acquisition function becomes computationally
expensive in high dimensions. Several recent works aim to address these issues,
e.g., methods
that implement online variable selection or conduct the search on a
lower-dimensional sub-manifold of the original search space. Advancing our
previous work of PCA-BO that learns a linear sub-manifold, this paper proposes
a novel kernel PCA-assisted BO (KPCA-BO) algorithm, which embeds a non-linear
sub-manifold in the search space and performs BO on this sub-manifold.
Intuitively, constructing the GPR model on a lower-dimensional sub-manifold
helps improve the modeling accuracy without requiring much more data from the
objective function. Also, our approach defines the acquisition function on the
lower-dimensional sub-manifold, making the acquisition optimization more
manageable.
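To make the workflow concrete, below is a minimal sketch of a KPCA-BO-style loop built from off-the-shelf scikit-learn and SciPy components. It is not the authors' implementation: the paper additionally rescales the evaluated points by their objective values before fitting kernel PCA and handles the pre-image problem more carefully, and every concrete choice here (RBF kernel, two latent dimensions, latent box bounds taken from the data, the toy sphere objective) is an illustrative assumption.

```python
# Minimal sketch of a KPCA-BO-style loop (illustrative assumptions
# throughout; NOT the paper's exact algorithm).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm
from sklearn.decomposition import KernelPCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(x):
    """Stand-in black box (sphere function); an assumption for the demo."""
    return float(np.sum(x ** 2))

dim, k = 20, 2                # ambient and latent dimensionality
n_init, budget = 10, 40       # initial design size, total evaluations
lb, ub = -5.0, 5.0            # box constraints of the search space

X = rng.uniform(lb, ub, size=(n_init, dim))   # initial design
y = np.array([objective(x) for x in X])

for _ in range(budget - n_init):
    # 1) Learn a non-linear sub-manifold of the search space with kernel PCA.
    kpca = KernelPCA(n_components=k, kernel="rbf", fit_inverse_transform=True)
    Z = kpca.fit_transform(X)

    # 2) Fit the GPR surrogate in the k-dimensional latent space,
    #    where far fewer points suffice for a usable model.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(Z, y)

    # 3) Maximize expected improvement (for minimization) over the
    #    latent space, so acquisition optimization is only k-dimensional.
    y_best = y.min()

    def neg_ei(z):
        mu, sd = gp.predict(z.reshape(1, -1), return_std=True)
        s = max(float(sd[0]), 1e-9)
        gamma = (y_best - float(mu[0])) / s
        return -(s * (gamma * norm.cdf(gamma) + norm.pdf(gamma)))

    bounds = [(Z[:, i].min(), Z[:, i].max()) for i in range(k)]
    res = minimize(neg_ei, Z[np.argmin(y)], method="L-BFGS-B", bounds=bounds)

    # 4) Map the latent candidate back to the search space (approximate
    #    pre-image) and evaluate the true objective there.
    x_new = np.clip(kpca.inverse_transform(res.x.reshape(1, -1))[0], lb, ub)
    X = np.vstack([X, x_new])
    y = np.append(y, objective(x_new))

print("best value found:", y.min())
```

Note that `inverse_transform` here is scikit-learn's approximate pre-image map; recovering a search-space point from a latent candidate is the main technical subtlety any KPCA-based method has to resolve.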
We compare the performance of KPCA-BO to vanilla BO and PCA-BO on the
multi-modal problems of the COCO/BBOB benchmark suite. Empirical results show
that KPCA-BO outperforms BO in terms of convergence speed on most test
problems, and this benefit becomes more significant when the dimensionality
increases. For the 60D functions, KPCA-BO surpasses PCA-BO in many test cases.
Moreover, compared to vanilla BO, it reduces the CPU time required to train the
GPR model and to optimize the acquisition function.
Related papers
- Approximation-Aware Bayesian Optimization [34.56666383247348]
High-dimensional Bayesian optimization (BO) tasks often require 10,000 function evaluations before obtaining meaningful results.
We modify sparse variational Gaussian processes (SVGPs) to better align with the goals of BO.
Using the framework of utility-calibrated variational inference, we unify GP approximation and data acquisition into a joint optimization problem.
arXiv Detail & Related papers (2024-06-06T17:55:02Z)
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Predictive Modeling through Hyper-Bayesian Optimization [60.586813904500595]
We propose a novel way of integrating model selection and BO for the single goal of reaching the function optima faster.
The algorithm moves back and forth between BO in the model space and BO in the function space, where the goodness of the recommended model is captured.
In addition to improved sample efficiency, the framework outputs information about the black-box function.
arXiv Detail & Related papers (2023-08-01T04:46:58Z)
- Model-based Causal Bayesian Optimization [78.120734120667]
We propose model-based causal Bayesian optimization (MCBO).
MCBO learns a full system model instead of only modeling intervention-reward pairs.
Unlike in standard Bayesian optimization, our acquisition function cannot be evaluated in closed form.
arXiv Detail & Related papers (2022-11-18T14:28:21Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Only minimal prior assumptions on the parameters are needed, through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem (a sketch of this reduction follows the list below).
arXiv Detail & Related papers (2022-06-27T03:55:27Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- MBORE: Multi-objective Bayesian Optimisation by Density-Ratio Estimation [0.01652719262940403]
Optimisation problems often have multiple conflicting objectives that can be computationally and/or financially expensive to evaluate.
Mono-surrogate Bayesian optimisation (BO) is a popular model-based approach for optimising such black-box functions.
We extend previous work on BO by density-ratio estimation (BORE) to the multi-objective setting.
arXiv Detail & Related papers (2022-03-31T09:27:59Z)
- Bayesian Optimization over Permutation Spaces [30.650753803587794]
We propose and evaluate two algorithms for BO over Permutation Spaces (BOPS).
We theoretically analyze the performance of BOPS-T to show that its regret grows sub-linearly.
Our experiments on multiple synthetic and real-world benchmarks show that both BOPS-T and BOPS-H perform better than the state-of-the-art BO algorithm for such spaces.
arXiv Detail & Related papers (2021-12-02T08:20:50Z)
- High-Dimensional Bayesian Optimization via Tree-Structured Additive Models [40.497123136157946]
We consider generalized additive models in which low-dimensional functions with overlapping subsets of variables are composed to model a high-dimensional target function.
Our goal is to lower the computational resources required and facilitate faster model learning.
We demonstrate and discuss the efficacy of our approach via a range of experiments on synthetic functions and real-world datasets.
arXiv Detail & Related papers (2020-12-24T03:56:44Z)
- High Dimensional Bayesian Optimization Assisted by Principal Component Analysis [4.030481609048958]
We introduce a novel PCA-assisted BO (PCA-BO) algorithm for high-dimensional numerical optimization problems.
We show that PCA-BO can effectively reduce the CPU time incurred on high-dimensional problems, and maintains the convergence rate on problems with an adequate global structure.
arXiv Detail & Related papers (2020-07-02T07:03:27Z)
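For the entry "A General Recipe for Likelihood-free Bayesian Optimization" above, here is the promised sketch of the acquisition-as-weighted-classification reduction. It is a rough illustration of the idea shared by LFBO and the density-ratio methods (BORE/MBORE), under assumed choices (a lower-quartile threshold, a gradient-boosted classifier, EI-style weights); it is not the papers' implementation.

```python
# Rough sketch of the weighted-classification view of the acquisition
# function used in likelihood-free BO (LFBO); illustrative only, with
# the threshold and classifier choices assumed rather than taken from
# the papers.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)

def objective(x):
    return float(np.sum(x ** 2))  # stand-in black box

X = rng.uniform(-5, 5, size=(60, 8))
y = np.array([objective(x) for x in X])

tau = np.quantile(y, 0.25)          # "good" threshold: lower quartile
labels = (y < tau).astype(int)      # 1 = improves on tau
# EI-style weighting: good points are weighted by their improvement
# tau - y, the rest get weight 1. Setting all weights to 1 recovers a
# plain density-ratio (BORE-style) acquisition.
w = np.where(labels == 1, tau - y, 1.0)

clf = GradientBoostingClassifier().fit(X, labels, sample_weight=w)

# The classifier's probability of the "good" class now plays the role
# of the acquisition function: pick the candidate that maximizes it.
cands = rng.uniform(-5, 5, size=(1000, 8))
acq = clf.predict_proba(cands)[:, 1]
x_next = cands[np.argmax(acq)]
print("next query point:", x_next)
```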
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.