Gaussian Process Sampling and Optimization with Approximate Upper and Lower Bounds
- URL: http://arxiv.org/abs/2110.12087v1
- Date: Fri, 22 Oct 2021 22:35:57 GMT
- Title: Gaussian Process Sampling and Optimization with Approximate Upper and Lower Bounds
- Authors: Vu Nguyen, Marc Peter Deisenroth, Michael A. Osborne
- Abstract summary: Many functions have approximately-known upper and/or lower bounds, potentially aiding the modeling of such functions.
We introduce Gaussian process models for functions where such bounds are (approximately) known.
That is, we transform a GP model so that it satisfies the given bounds, then sample and weight functions from its posterior.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many functions have approximately-known upper and/or lower bounds,
potentially aiding the modeling of such functions. In this paper, we introduce
Gaussian process models for functions where such bounds are (approximately)
known. More specifically, we propose the first use of such bounds to improve
Gaussian process (GP) posterior sampling and Bayesian optimization (BO). That
is, we transform a GP model so that it satisfies the given bounds, then sample and
weight functions from its posterior. To further exploit these bounds in BO
settings, we present bounded entropy search (BES) to select the point gaining
the most information about the underlying function, estimated by the GP
samples, while satisfying the output constraints. We characterize the sample
variance bounds and show that the decision made by BES is explainable. Our
proposed approach is conceptually straightforward and can be used as a plug-in
extension to existing methods for GP posterior sampling and Bayesian
optimization.
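The sample-then-weight idea in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it draws functions from an ordinary GP posterior and down-weights samples that violate output bounds. The kernel, penalty strength, and bounds are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, lengthscale=0.5):
    # Squared-exponential kernel between 1-D input arrays A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

# Toy 1-D observations and a grid of test points.
X = np.array([0.1, 0.4, 0.8])
y = np.sin(2 * np.pi * X)
Xs = np.linspace(0.0, 1.0, 50)

# Ordinary GP posterior (noise-free observations, jitter for stability).
K = rbf_kernel(X, X) + 1e-8 * np.eye(len(X))
Ks = rbf_kernel(Xs, X)
mu = Ks @ np.linalg.solve(K, y)
cov = rbf_kernel(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T) + 1e-6 * np.eye(len(Xs))

# Draw posterior samples, then weight each by how well it respects
# (hypothetical) output bounds f(x) in [-1, 1]; the exponential penalty
# is an illustrative choice, not the paper's exact weighting scheme.
samples = rng.multivariate_normal(mu, cov, size=200)
lower, upper = -1.0, 1.0
violation = np.maximum(samples - upper, 0.0) + np.maximum(lower - samples, 0.0)
weights = np.exp(-10.0 * violation.sum(axis=1))
weights /= weights.sum()

# Weighted posterior mean, softly pushed to respect the bounds.
bounded_mean = weights @ samples
```

Because the weighting is applied after sampling, it layers on top of any existing GP posterior-sampling code, which is what makes the approach usable as a plug-in extension.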
Related papers
- Fast post-process Bayesian inference with Variational Sparse Bayesian Quadrature [13.36200518068162]
We propose the framework of post-process Bayesian inference as a means to obtain a quick posterior approximation from existing target density evaluations.
Within this framework, we introduce Variational Sparse Bayesian Quadrature (VSBQ), a method for post-process approximate inference for models with black-box and potentially noisy likelihoods.
We validate our method on challenging synthetic scenarios and real-world applications from computational neuroscience.
arXiv Detail & Related papers (2023-03-09T13:58:35Z)
- Relaxed Gaussian process interpolation: a goal-oriented approach to Bayesian optimization [0.0]
This work presents a new procedure for obtaining predictive distributions in the context of Gaussian process (GP) modeling.
The method called relaxed Gaussian process (reGP) provides better predictive distributions in ranges of interest.
It can be viewed as a goal-oriented method and becomes particularly interesting in Bayesian optimization.
arXiv Detail & Related papers (2022-06-07T06:26:46Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
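The random feature-based kernel approximation mentioned above can be illustrated for a single RBF kernel via random Fourier features. A minimal sketch; the lengthscale, feature count, and data are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_fourier_features(X, n_features=2000, lengthscale=1.0):
    """Map inputs so that phi(x) @ phi(x') approximates an RBF kernel."""
    d = X.shape[1]
    # Spectral sampling: frequencies from N(0, 1/lengthscale^2), random phases.
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(20, 2))
Phi = random_fourier_features(X)
K_approx = Phi @ Phi.T

# Exact RBF kernel, for comparison with the feature-based approximation.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * d2)

max_err = float(np.abs(K_approx - K_exact).max())
```

The payoff for function sampling is that a posterior draw becomes a finite linear combination of the features, which scales linearly in the number of candidate points.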
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- Scalable Bayesian Optimization Using Vecchia Approximations of Gaussian Processes [0.0]
We adapt the Vecchia approximation, a popular GP approximation from spatial statistics, to enable scalable high-dimensional Bayesian optimization.
We focus on the use of our warped Vecchia GP in trust-region Bayesian optimization via Thompson sampling.
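Thompson sampling inside a trust region amounts to drawing one posterior function over candidate points in the region and proposing its minimizer. A minimal sketch with an assumed RBF kernel and an illustrative trust region (the Vecchia approximation itself is omitted):

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(A, B, ls=0.2):
    # Squared-exponential kernel between 1-D input arrays A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / ls**2)

# Observed points, and a candidate grid inside the current trust region.
X = np.array([0.2, 0.5, 0.9])
y = (X - 0.6) ** 2
cand = np.linspace(0.3, 0.8, 100)  # trust region [0.3, 0.8], illustrative

# GP posterior over the candidates (jitter added for numerical stability).
K = rbf(X, X) + 1e-8 * np.eye(len(X))
Ks = rbf(cand, X)
mu = Ks @ np.linalg.solve(K, y)
cov = rbf(cand, cand) - Ks @ np.linalg.solve(K, Ks.T) + 1e-6 * np.eye(len(cand))

# Thompson sampling: draw one posterior realization, propose its minimizer.
f_sample = rng.multivariate_normal(mu, cov)
x_next = float(cand[np.argmin(f_sample)])
```

Scalable approximations such as Vecchia replace the dense posterior computation above; the Thompson-sampling proposal step itself is unchanged.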
arXiv Detail & Related papers (2022-03-02T23:55:14Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Preferential Bayesian optimisation with Skew Gaussian Processes [0.225596179391365]
We show that the true posterior distribution of the preference function is a Skew Gaussian Process (SkewGP)
We derive an efficient method to compute the exact SkewGP posterior and use it as surrogate model for PBO employing standard acquisition functions.
We also show that our framework can be extended to deal with mixed preferential-categorical BO.
arXiv Detail & Related papers (2020-08-15T08:23:17Z)
- Randomised Gaussian Process Upper Confidence Bound for Bayesian Optimisation [60.93091603232817]
We develop a modified Gaussian process upper confidence bound (GP-UCB) acquisition function.
This is done by sampling the exploration-exploitation trade-off parameter from a distribution.
We prove that this allows the expected trade-off parameter to be altered to better suit the problem without compromising a bound on the function's Bayesian regret.
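The randomized trade-off can be sketched as follows. The exponential sampling distribution here is an illustrative choice, not necessarily the one analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def randomised_ucb(mu, sigma):
    """GP-UCB acquisition with a sampled trade-off parameter.

    Plain GP-UCB uses a fixed beta in mu + sqrt(beta) * sigma; here beta
    is drawn from a distribution (exponential, an illustrative choice).
    """
    beta = rng.exponential(scale=1.0)
    return mu + np.sqrt(beta) * sigma

# Posterior mean/std at a few candidate points (toy values).
mu = np.array([0.1, 0.3, 0.2])
sigma = np.array([0.5, 0.1, 0.4])
acq = randomised_ucb(mu, sigma)
x_next = int(np.argmax(acq))
```

Since only the constant beta is randomized, this drops into any existing GP-UCB loop without changing the rest of the acquisition machinery.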
arXiv Detail & Related papers (2020-06-08T00:28:41Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.