Wasserstein-Splitting Gaussian Process Regression for Heterogeneous
Online Bayesian Inference
- URL: http://arxiv.org/abs/2107.12797v1
- Date: Mon, 26 Jul 2021 17:52:46 GMT
- Title: Wasserstein-Splitting Gaussian Process Regression for Heterogeneous
Online Bayesian Inference
- Authors: Michael E. Kepler, Alec Koppel, Amrit Singh Bedi, and Daniel J.
Stilwell
- Abstract summary: We employ variational free energy approximations of GPs operating in tandem with online expectation propagation steps.
We introduce a local splitting step which instantiates a new GP whenever the posterior distribution changes significantly.
Over time, this yields an ensemble of sparse GPs which may be updated incrementally.
- Score: 9.7471390457395
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian processes (GPs) are a well-known nonparametric Bayesian inference
technique, but they suffer from scalability problems for large sample sizes,
and their performance can degrade for non-stationary or spatially heterogeneous
data. In this work, we seek to overcome these issues through (i) employing
variational free energy approximations of GPs operating in tandem with online
expectation propagation steps; and (ii) introducing a local splitting step
which instantiates a new GP whenever the posterior distribution changes
significantly as quantified by the Wasserstein metric over posterior
distributions. Over time, then, this yields an ensemble of sparse GPs which may
be updated incrementally, and adapts to locality, heterogeneity, and
non-stationarity in training data.
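The splitting rule described above reduces to a closed-form quantity: for two Gaussian posteriors, the 2-Wasserstein distance depends only on their means and covariances. Below is a minimal sketch of such a splitting test, not the authors' implementation; the threshold `epsilon`, the helper names, and the use of NumPy/SciPy are illustrative assumptions.

```python
# Sketch of a Wasserstein-based splitting test (illustrative, not the paper's code).
# A new GP would be instantiated when the posterior shifts by more than `epsilon`
# in 2-Wasserstein distance; `epsilon` is an assumed tuning parameter.
import numpy as np
from scipy.linalg import sqrtm


def wasserstein2_gaussian(mu1, cov1, mu2, cov2):
    """Closed-form 2-Wasserstein distance between N(mu1, cov1) and N(mu2, cov2)."""
    mean_term = np.sum((mu1 - mu2) ** 2)
    cov_root = sqrtm(cov2)                       # cov2^{1/2}
    cross = sqrtm(cov_root @ cov1 @ cov_root)    # (cov2^{1/2} cov1 cov2^{1/2})^{1/2}
    cov_term = np.trace(cov1 + cov2 - 2.0 * np.real(cross))
    return float(np.sqrt(max(mean_term + cov_term, 0.0)))


def should_split(prev_posterior, new_posterior, epsilon=1.0):
    """Return True when the Gaussian posterior has moved more than `epsilon`."""
    mu1, cov1 = prev_posterior
    mu2, cov2 = new_posterior
    return wasserstein2_gaussian(mu1, cov1, mu2, cov2) > epsilon


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mu_a, mu_b = rng.normal(size=3), rng.normal(size=3) + 2.0
    A = rng.normal(size=(3, 3))
    cov_a = A @ A.T + np.eye(3)                  # positive definite covariance
    cov_b = cov_a + 0.5 * np.eye(3)
    print(should_split((mu_a, cov_a), (mu_b, cov_b), epsilon=1.0))
```

In an online setting, a test of this kind would be evaluated against the posterior of the most recent GP in the ensemble after each incremental update; how the sparse posteriors themselves are maintained (variational free energy plus expectation propagation) is beyond this sketch.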
Related papers
- Theoretical Analysis of Heteroscedastic Gaussian Processes with Posterior Distributions [0.4895118383237099]
This study introduces a novel theoretical framework for analyzing heteroscedastic Gaussian processes (HGPs).
It derives the exact means, variances, and cumulative distributions of the posterior distributions.
The derived theoretical findings are applied to a chance-constrained tracking controller.
arXiv Detail & Related papers (2024-09-19T09:51:46Z)
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over the methods based on meta-analysis when heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z)
- Deep Horseshoe Gaussian Processes [1.0742675209112622]
We introduce the deep Horseshoe Gaussian process Deep-HGP, a new simple prior based on deep Gaussian processes with a squared-exponential kernel.
We show that the associated tempered posterior distribution recovers the unknown true regression curve optimally in terms of quadratic loss, up to a logarithmic factor.
arXiv Detail & Related papers (2024-03-04T05:30:43Z)
- Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios when the response variable, denoted by $Y$, resides in a manifold, and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
arXiv Detail & Related papers (2023-10-12T10:56:25Z)
- Heterogeneous Multi-Task Gaussian Cox Processes [61.67344039414193]
We present a novel extension of multi-task Gaussian Cox processes for modeling heterogeneous correlated tasks jointly.
A multi-output Gaussian process (MOGP) prior over the parameters of the dedicated likelihoods for classification, regression, and point process tasks can facilitate sharing of information between heterogeneous tasks.
We derive a mean-field approximation to realize closed-form iterative updates for estimating model parameters.
arXiv Detail & Related papers (2023-08-29T15:01:01Z)
- Non-Gaussian Process Regression [0.0]
We extend the GP framework into a new class of time-changed GPs that allow for straightforward modelling of heavy-tailed non-Gaussian behaviours.
We present Markov chain Monte Carlo inference procedures for this model and demonstrate the potential benefits.
arXiv Detail & Related papers (2022-09-07T13:08:22Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
Non-Gaussian Gaussian Processes (NGGPs) outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Minibatch vs Local SGD with Shuffling: Tight Convergence Bounds and Beyond [63.59034509960994]
We study shuffling-based variants: minibatch and local Random Reshuffling, which draw gradients without replacement.
For smooth functions satisfying the Polyak-Lojasiewicz condition, we obtain convergence bounds which show that these shuffling-based variants converge faster than their with-replacement counterparts.
We propose an algorithmic modification called synchronized shuffling that leads to convergence rates faster than our lower bounds in near-homogeneous settings.
arXiv Detail & Related papers (2021-10-20T02:25:25Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Deep Sigma Point Processes [22.5396672566053]
We introduce a class of parametric models inspired by the compositional structure of Deep Gaussian Processes (DGPs).
Deep Sigma Point Processes (DSPPs) retain many of the attractive features of (variational) DGPs, including mini-batch training and predictive uncertainty that is controlled by kernel basis functions.
arXiv Detail & Related papers (2020-02-21T03:40:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.