Mixtures of Gaussian Process Experts with SMC$^2$
- URL: http://arxiv.org/abs/2208.12830v1
- Date: Fri, 26 Aug 2022 18:20:14 GMT
- Title: Mixtures of Gaussian Process Experts with SMC$^2$
- Authors: Teemu Härkönen, Sara Wade, Kody Law, Lassi Roininen
- Abstract summary: Mixtures of Gaussian process experts have been considered, where data points are assigned to independent experts.
We construct a novel inference approach based on nested sequential Monte Carlo samplers to infer both the gating network and Gaussian process expert parameters.
- Score: 0.4588028371034407
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian processes are a key component of many flexible statistical and
machine learning models. However, they exhibit cubic computational complexity
and high memory requirements due to the need to invert and store a full
covariance matrix. To circumvent this, mixtures of Gaussian process experts
have been considered where data points are assigned to independent experts,
reducing the complexity by allowing inference based on smaller, local
covariance matrices. Moreover, mixtures of Gaussian process experts
substantially enrich the model's flexibility, allowing for behaviors such as
non-stationarity, heteroscedasticity, and discontinuities. In this work, we
construct a novel inference approach based on nested sequential Monte Carlo
samplers to simultaneously infer both the gating network and Gaussian process
expert parameters. This greatly improves inference compared to importance
sampling, particularly in settings where a stationary Gaussian process is
inappropriate, while still being thoroughly parallelizable.
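To make the cost argument concrete, below is a minimal NumPy sketch of the mixture-of-GP-experts likelihood that such an inference scheme targets. It is an illustration only, not the authors' SMC$^2$ implementation: the hard gating assignments, the function names (`rbf_kernel`, `gp_log_marginal`, `mixture_log_likelihood`), and all hyperparameter values are hypothetical, and the nested sequential Monte Carlo machinery that would actually infer the gating network and expert parameters is omitted.

```python
# Minimal sketch: mixture-of-GP-experts log-likelihood under hard gating.
# Illustrative only; not the paper's SMC^2 sampler. All names hypothetical.
import numpy as np

def rbf_kernel(x, y, lengthscale, variance):
    # Squared-exponential covariance between 1-D input arrays x and y.
    d2 = (x[:, None] - y[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_log_marginal(x, y, lengthscale, variance, noise):
    # Exact GP log marginal likelihood: O(n^3) in the subset size n.
    n = len(x)
    K = rbf_kernel(x, x, lengthscale, variance) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * n * np.log(2 * np.pi)

def mixture_log_likelihood(x, y, assignments, expert_params):
    # Each expert inverts only the covariance of its own points, so K experts
    # with roughly n/K points each cost O(K * (n/K)^3) instead of O(n^3).
    total = 0.0
    for k, (ls, var, noise) in enumerate(expert_params):
        idx = assignments == k
        if idx.any():
            total += gp_log_marginal(x[idx], y[idx], ls, var, noise)
    return total

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 300))
y = np.where(x < 5.0, np.sin(x), 3.0 * np.sin(4.0 * x))  # non-stationary toy data
assignments = (x >= 5.0).astype(int)                     # toy hard gating at x = 5
expert_params = [(1.0, 1.0, 0.1), (0.25, 9.0, 0.1)]      # hypothetical hyperparameters
print(mixture_log_likelihood(x, y, assignments, expert_params))
```

Splitting the data at the changepoint lets each expert use its own lengthscale and variance, which is exactly the kind of non-stationary behavior the abstract says a single stationary GP handles poorly.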
Related papers
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Deep Gaussian Mixture Ensembles [9.673093148930874]
This work introduces a novel probabilistic deep learning technique called deep Gaussian mixture ensembles (DGMEs).
DGMEs are capable of approximating complex probability distributions, such as heavy-tailed or multimodal distributions.
Our experimental results demonstrate that DGMEs outperform state-of-the-art uncertainty quantifying deep learning models in handling complex predictive densities.
arXiv Detail & Related papers (2023-06-12T16:53:38Z)
- Mixtures of Gaussian process experts based on kernel stick-breaking processes [0.6396288020763143]
We propose a new mixture model of Gaussian process experts based on kernel stick-breaking processes.
Our model maintains the intuitive appeal yet improves the performance of existing models.
The model behaviour and improved predictive performance are demonstrated in experiments using six datasets.
arXiv Detail & Related papers (2023-04-26T21:23:01Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes [60.479499225746295]
We introduce a new scalable approximation for Gaussian processes with provable guarantees which hold simultaneously over its entire parameter space.
Our approximation is obtained from an improved sample complexity analysis for sparse spectrum Gaussian processes (SSGPs).
arXiv Detail & Related papers (2020-11-17T05:41:50Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors (see the sketch after this list).
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Sparse Gaussian Processes with Spherical Harmonic Features [14.72311048788194]
We introduce a new class of inter-domain variational Gaussian processes (GPs).
Our inference scheme is comparable to variational Fourier features, but it does not suffer from the curse of dimensionality.
Our experiments show that our model is able to fit a regression model for a dataset with 6 million entries two orders of magnitude faster.
arXiv Detail & Related papers (2020-06-30T10:19:32Z)
- Beyond the Mean-Field: Structured Deep Gaussian Processes Improve the Predictive Uncertainties [12.068153197381575]
We propose a novel variational family that allows for retaining covariances between latent processes while achieving fast convergence.
We provide an efficient implementation of our new approach and apply it to several benchmark datasets.
It yields excellent results and strikes a better balance between accuracy and calibrated uncertainty estimates than its state-of-the-art alternatives.
arXiv Detail & Related papers (2020-05-22T11:10:59Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
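The pathwise view referenced in the sampling entries above can be made concrete with Matheron's rule: a posterior sample equals a prior sample plus a data-driven update. Below is a minimal NumPy sketch under that identity, not the cited papers' implementations: it draws the joint prior exactly (which is still cubic), whereas those works substitute a cheap approximate prior such as random Fourier features to escape the cubic scaling; all function names are illustrative.

```python
# Minimal sketch of pathwise conditioning (Matheron's rule) for GP posterior
# sampling. Illustrative only; the cited papers replace the exact joint prior
# draw below with cheap approximations (e.g. random Fourier features).
import numpy as np

def rbf_kernel(x, y, lengthscale=1.0, variance=1.0):
    d2 = (x[:, None] - y[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def pathwise_posterior_sample(x_train, y_train, x_test, noise, rng):
    # Matheron's rule: (f* | y) = f* + K*n (Knn + noise I)^{-1} (y - f_n - eps),
    # where (f_n, f*) is a joint prior draw and eps ~ N(0, noise I).
    n, m = len(x_train), len(x_test)
    x_all = np.concatenate([x_train, x_test])
    K_all = rbf_kernel(x_all, x_all) + 1e-10 * np.eye(n + m)  # jitter for stability
    f_prior = np.linalg.cholesky(K_all) @ rng.standard_normal(n + m)
    f_n, f_star = f_prior[:n], f_prior[n:]
    eps = np.sqrt(noise) * rng.standard_normal(n)
    K_nn = rbf_kernel(x_train, x_train) + noise * np.eye(n)
    K_sn = rbf_kernel(x_test, x_train)
    return f_star + K_sn @ np.linalg.solve(K_nn, y_train - f_n - eps)

rng = np.random.default_rng(1)
x_train = np.sort(rng.uniform(0.0, 5.0, 20))
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(20)
x_test = np.linspace(0.0, 5.0, 50)
print(pathwise_posterior_sample(x_train, y_train, x_test, 0.01, rng).shape)  # (50,)
```

Because the update term is deterministic given the prior draw, all randomness lives in the prior; this is what lets the cited works trade the cubic distribution-centric sampler for a cheaper approximate prior without touching the conditioning step.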