Transforming Gaussian Processes With Normalizing Flows
- URL: http://arxiv.org/abs/2011.01596v2
- Date: Thu, 25 Feb 2021 17:19:34 GMT
- Title: Transforming Gaussian Processes With Normalizing Flows
- Authors: Juan Maroñas, Oliver Hamelijnck, Jeremias Knoblauch, Theodoros Damoulas
- Abstract summary: We show that a parametric invertible transformation can be made input-dependent and encode interpretable prior knowledge.
We derive a variational approximation to the resulting inference problem, which is as fast as variational GP regression.
The resulting algorithm's computational and inferential performance is excellent, and we demonstrate this on a range of data sets.
- Score: 15.886048234706633
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian Processes (GPs) can be used as flexible, non-parametric function
priors. Inspired by the growing body of work on Normalizing Flows, we enlarge
this class of priors through a parametric invertible transformation that can be
made input-dependent. Doing so also allows us to encode interpretable prior
knowledge (e.g., boundedness constraints). We derive a variational
approximation to the resulting Bayesian inference problem, which is as fast as
stochastic variational GP regression (Hensman et al., 2013; Dezfouli and
Bonilla, 2015). This makes the model a computationally efficient alternative to
other hierarchical extensions of GP priors (Lazaro-Gredilla, 2012; Damianou and
Lawrence, 2013). The resulting algorithm's computational and inferential
performance is excellent, and we demonstrate this on a range of data sets. For
example, even with only 5 inducing points and an input-dependent flow, our
method is consistently competitive with a standard sparse GP fitted using 100
inducing points.
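A minimal sketch of the core idea, not the authors' implementation: draw samples from a GP prior and warp them with an input-dependent invertible map. Here a softplus with a hypothetical input-dependent sharpness a(x) encodes a positivity constraint; the kernel, grid, and parameter choices are all illustrative.

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel matrix."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)

# Draw samples from a zero-mean GP prior.
K = rbf_kernel(x, x) + 1e-8 * np.eye(x.size)
f = rng.multivariate_normal(np.zeros(x.size), K, size=5)  # (5, 200)

# Input-dependent invertible (marginal) transformation: a softplus whose
# sharpness a(x) varies over the input space. Softplus maps R -> (0, inf),
# so every transformed prior sample is positive by construction.
a = 1.0 + np.exp(-x**2)          # hypothetical input-dependent flow parameter
g = np.log1p(np.exp(a * f)) / a  # elementwise, monotone (hence invertible) in f

print(g.min())  # > 0: the boundedness constraint is encoded in the prior
```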
Related papers
- Shallow and Deep Nonparametric Convolutions for Gaussian Processes [0.0]
We introduce a nonparametric process convolution formulation for GPs that alleviates the weaknesses of existing approaches by using a functional sampling approach.
We propose a composition of these nonparametric convolutions that serves as an alternative to classic deep GP models.
arXiv Detail & Related papers (2022-06-17T19:03:04Z)
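For the process-convolution entry above, a rough sketch of a discretized process convolution under stated assumptions: f(x) = ∫ G(x − z) w(z) dz with white noise w, approximated on a grid. The paper's formulation is nonparametric and uses functional sampling, which this toy version with a fixed Gaussian smoothing kernel does not capture.

```python
import numpy as np

rng = np.random.default_rng(1)

# White-noise base process w on a grid of latent locations z; the sum
# below approximates the process convolution f(x) = \int G(x - z) w(z) dz.
z = np.linspace(-4, 4, 400)
dz = z[1] - z[0]
w = rng.normal(0.0, 1.0, size=z.size) / np.sqrt(dz)  # discretized white noise

def G(r, ell=0.3):
    """Assumed Gaussian smoothing kernel (illustrative choice)."""
    return np.exp(-0.5 * (r / ell) ** 2)

x = np.linspace(-3, 3, 150)
f = (G(x[:, None] - z[None, :]) * w[None, :]).sum(axis=1) * dz

# Feeding f through a second smoothing stage would give a crude analogue
# of the composed (deep) construction mentioned in the summary.
```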
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS (ensemble GP Thompson sampling) to the global optimum, analysis is conducted based on the notion of Bayesian regret.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
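A hedged sketch of two ingredients named in the surrogate-modeling summary above: a random-feature kernel approximation plus a Thompson-sampling draw, for a single GP rather than the paper's ensemble. The objective, lengthscale, and feature count are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def rff(x, omega, b):
    """Random Fourier features approximating an RBF kernel."""
    return np.sqrt(2.0 / omega.size) * np.cos(x[:, None] * omega[None, :] + b[None, :])

# Hypothetical 1-D objective observed so far.
X = rng.uniform(-2, 2, size=8)
y = np.sin(3 * X) + 0.1 * rng.normal(size=8)

# Random features for one GP in the ensemble (RBF, lengthscale ell).
ell, n_feat, noise = 0.5, 100, 0.1
omega = rng.normal(0.0, 1.0 / ell, size=n_feat)
b = rng.uniform(0.0, 2 * np.pi, size=n_feat)
Phi = rff(X, omega, b)

# Bayesian linear regression posterior over the feature weights.
A = Phi.T @ Phi / noise**2 + np.eye(n_feat)
mean = np.linalg.solve(A, Phi.T @ y) / noise**2
cov = np.linalg.inv(A)

# Thompson sampling: draw one weight vector, maximize the sampled function.
theta = rng.multivariate_normal(mean, cov)
grid = np.linspace(-2, 2, 500)
x_next = grid[np.argmax(rff(grid, omega, b) @ theta)]
```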
- Scaling Gaussian Process Optimization by Evaluating a Few Unique Candidates Multiple Times [119.41129787351092]
We show that sequential black-box optimization based on GPs can be made efficient by sticking to a candidate solution for multiple evaluation steps.
We modify two well-established GP-Opt algorithms, GP-UCB and GP-EI, to adapt rules from batched GP-Opt.
arXiv Detail & Related papers (2022-01-30T20:42:14Z)
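A toy sketch of the "few unique candidates, multiple evaluations" idea above: plain GP-UCB on a grid, except each selected candidate is re-evaluated several times before the posterior is updated. This is not the paper's modified algorithm; the objective and all constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def k(a, b, ell=0.4):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def objective(x):  # hypothetical black-box function
    return -np.sin(3 * x) - x**2 + 0.7 * x

grid = np.linspace(-1, 2, 300)
X, y = [], []
noise, beta, repeats = 0.3, 2.0, 5   # evaluate each candidate `repeats` times

for step in range(4):
    if X:
        Xa, ya = np.array(X), np.array(y)
        Kn = k(Xa, Xa) + noise**2 * np.eye(Xa.size)
        Ks = k(grid, Xa)
        mu = Ks @ np.linalg.solve(Kn, ya)
        var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(Kn, Ks.T))
    else:
        mu, var = np.zeros(grid.size), np.ones(grid.size)
    x_cand = grid[np.argmax(mu + beta * np.sqrt(np.maximum(var, 0.0)))]
    for _ in range(repeats):         # stick to the candidate for several steps
        X.append(x_cand)
        y.append(objective(x_cand) + noise * rng.normal())
```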
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
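A minimal stand-in for the component-wise ODE mapping described in the NGGP entry above: a scalar ODE integrated with fixed Euler steps and applied elementwise, with the same parameters shared across all components. The real method trains a neural ODE; the dynamics here are a hand-picked assumption.

```python
import numpy as np

def ode_flow(f, a=0.8, b=0.1, steps=20, t1=1.0):
    """Elementwise flow from the ODE dz/dt = tanh(a*z + b).

    The parameters (a, b) are shared across every component, and the
    smooth dynamics make the resulting map invertible.
    """
    z, dt = f.copy(), t1 / steps
    for _ in range(steps):
        z = z + dt * np.tanh(a * z + b)
    return z

rng = np.random.default_rng(4)
gp_sample = rng.normal(size=(3, 50))   # stand-in for GP posterior samples
warped = ode_flow(gp_sample)           # non-Gaussian marginals, shared params
```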
- MuyGPs: Scalable Gaussian Process Hyperparameter Estimation Using Local Cross-Validation [1.2233362977312945]
We present MuyGPs, a novel, efficient GP hyperparameter estimation method.
MuyGPs builds upon prior methods that take advantage of the nearest neighbors structure of the data.
We show that our method outperforms all known competitors both in terms of time-to-solution and the root mean squared error of the predictions.
arXiv Detail & Related papers (2021-04-29T18:10:21Z)
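A sketch of the nearest-neighbors, leave-one-out cross-validation flavor of the MuyGPs entry above, assuming an RBF kernel and a simple grid search over the lengthscale; the actual MuyGPs estimator differs in its details.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.uniform(-2, 2, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=200)

def local_loo_loss(ell, k_nn=15, noise=0.01):
    """Mean squared leave-one-out error of k-nearest-neighbor GP predictions."""
    D = np.abs(X[:, 0][:, None] - X[:, 0][None, :])
    K_all = np.exp(-0.5 * (D / ell) ** 2)
    loss = 0.0
    for i in range(X.shape[0]):
        nn = np.argsort(D[i])[1:k_nn + 1]   # nearest neighbors, excluding self
        K = K_all[np.ix_(nn, nn)] + noise * np.eye(k_nn)
        pred = K_all[i, nn] @ np.linalg.solve(K, y[nn])
        loss += (pred - y[i]) ** 2
    return loss / X.shape[0]

# Hyperparameter estimation via local cross-validation only.
lengthscales = [0.1, 0.3, 0.5, 1.0]
best = min(lengthscales, key=local_loo_loss)
```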
- Scalable Gaussian Process Variational Autoencoders [17.345687261000045]
We propose a new scalable GP-VAE model that outperforms existing approaches in terms of runtime and memory footprint, is easy to implement, and allows for joint end-to-end optimization of all components.
arXiv Detail & Related papers (2020-10-26T10:26:02Z)
- Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
arXiv Detail & Related papers (2020-06-18T14:24:05Z)
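For intuition about what a deep GP is in the entry above, a two-layer composition of GP samples in which the output of one GP becomes the input of the next; the paper's surrogate, inference scheme, and likelihood-free setting are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(6)

def rbf(a, b, ell):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

x = np.linspace(-2, 2, 200)
jitter = 1e-8 * np.eye(x.size)

# Layer 1: sample h ~ GP(0, k1) evaluated at the inputs.
h = rng.multivariate_normal(np.zeros(x.size), rbf(x, x, 1.0) + jitter)

# Layer 2: a second GP takes the *outputs* of layer 1 as its inputs,
# f = f2(h(x)); the composition has non-Gaussian marginals even though
# each layer on its own is a GP.
f = rng.multivariate_normal(np.zeros(x.size), rbf(h, h, 0.5) + jitter)
```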
- Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations [27.43948386608]
Variational inference techniques based on inducing variables provide an elegant framework for scalable estimation in Gaussian process (GP) models.
In this work, we challenge the common wisdom that optimizing the inducing inputs in the variational framework yields optimal performance.
arXiv Detail & Related papers (2020-03-06T08:53:18Z)
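A compact sketch of one standard inducing-variable construction that the entry above builds on: the closed-form optimal variational posterior over M inducing variables (Titsias-style) with fixed inducing inputs Z. The paper's contribution, a Bayesian treatment of the inducing inputs themselves, is not shown.

```python
import numpy as np

rng = np.random.default_rng(7)

def rbf(a, b, ell=0.5):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

# Training data and M << N fixed inducing inputs Z.
X = rng.uniform(-2, 2, size=500)
y = np.sin(3 * X) + 0.1 * rng.normal(size=500)
Z = np.linspace(-2, 2, 10)
noise = 0.1

# Closed-form optimal variational posterior q(u) = N(m, S) over the
# M inducing variables (only the mean m is needed for prediction here).
Kuu = rbf(Z, Z) + 1e-8 * np.eye(Z.size)
Kuf = rbf(Z, X)
Sigma = np.linalg.inv(Kuu + Kuf @ Kuf.T / noise**2)
m = Kuu @ Sigma @ Kuf @ y / noise**2

# Predictive mean at test points, routed through the inducing variables.
xs = np.linspace(-2, 2, 100)
mu = rbf(xs, Z) @ np.linalg.solve(Kuu, m)
```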
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
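A small sketch of quadrature Fourier features for the RBF kernel, whose Gaussian spectral density makes Gauss-Hermite nodes a natural deterministic feature map; the derivative features and error bounds of SLEIPNIR itself are omitted.

```python
import numpy as np

# The RBF kernel's spectral density is Gaussian, so Gauss-Hermite
# nodes/weights give a deterministic feature map phi with
# phi(x) @ phi(x') ~= k(x, x') (no random sampling of frequencies).
ell, m = 0.5, 32
t, w = np.polynomial.hermite.hermgauss(m)
omega = np.sqrt(2.0) * t / ell
scale = np.sqrt(w / np.sqrt(np.pi))

def phi(x):
    ang = x[:, None] * omega[None, :]
    return np.concatenate([scale * np.cos(ang), scale * np.sin(ang)], axis=1)

# GP regression with derivatives then reduces to Bayesian linear
# regression on [phi(x); phi'(x)]; only the value features are shown.
x = np.array([0.0, 0.3])
approx = phi(x) @ phi(x).T
exact = np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2)
print(np.max(np.abs(approx - exact)))   # small, deterministic error
```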
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because their expressivity is guaranteed, Gaussianization flows can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
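A toy Gaussianization step, marginal Gaussianization followed by a rotation, just to show the mechanism: iterating such steps pushes data toward a standard normal, and each step is invertible. The trainable flow in the paper uses parametric marginal transformations rather than the empirical CDF assumed here.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
# Skewed, correlated 2-D data (illustrative).
x = rng.gamma(2.0, 1.0, size=(5000, 2)) @ np.array([[1.0, 0.6], [0.0, 1.0]])

def marginal_gaussianize(x):
    """Probit of each dimension's empirical CDF: marginals become ~N(0, 1)."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1.0
    u = ranks / (x.shape[0] + 1.0)        # empirical CDF values in (0, 1)
    return norm.ppf(u)

theta = np.pi / 4                          # hypothetical fixed rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
z = marginal_gaussianize(x) @ R.T          # one Gaussianization step
```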