Locally Smoothed Gaussian Process Regression
- URL: http://arxiv.org/abs/2210.09998v1
- Date: Tue, 18 Oct 2022 17:04:35 GMT
- Title: Locally Smoothed Gaussian Process Regression
- Authors: Davit Gogolashvili, Bogdan Kozyrskiy, Maurizio Filippone
- Abstract summary: We develop a novel framework to accelerate Gaussian process regression (GPR).
In particular, we consider localization kernels at each data point to down-weight the contributions from other data points that are far away.
We demonstrate the competitive performance of the proposed approach compared to full GPR, other localized models, and deep Gaussian processes.
- Score: 11.45660271015251
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a novel framework to accelerate Gaussian process regression (GPR).
In particular, we consider localization kernels at each data point to
down-weight the contributions from other data points that are far away, and we
derive the GPR model stemming from the application of such a localization
operation. Through a set of experiments, we demonstrate the competitive
performance of the proposed approach compared to full GPR, other localized
models, and deep Gaussian processes. Crucially, this performance is obtained
with considerable speedups compared to standard global GPR, owing to the
sparsification of the Gram matrix induced by the localization operation.
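The abstract does not spell out the localization kernel, so here is a minimal sketch of the idea as we read it: multiply a base RBF kernel elementwise by a compactly supported (Wendland-type) taper, so that entries for far-apart pairs are exactly zero and the Gram matrix becomes sparse. The kernel choice, taper radius, and all names below are assumptions in the spirit of covariance tapering, not the paper's exact construction.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Standard squared-exponential base kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def localization_kernel(X1, X2, radius=1.0):
    # ASSUMED localization: a Wendland C^2 taper, compactly supported on
    # [0, radius], which down-weights and eventually zeroes out the
    # contribution of far-away data points.
    d = np.sqrt(((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1))
    r = np.clip(d / radius, 0.0, 1.0)
    return (1.0 - r) ** 4 * (4.0 * r + 1.0)

def localized_gpr_predict(X, y, X_star, noise=1e-2):
    # The elementwise product of two PSD kernels stays PSD (Schur product
    # theorem), and zero entries beyond the taper radius sparsify the
    # Gram matrix; a real implementation would assemble it sparsely
    # instead of densifying first as this toy does.
    K = rbf_kernel(X, X) * localization_kernel(X, X)
    K_sparse = csr_matrix(K + noise * np.eye(len(X)))
    alpha = spsolve(K_sparse, y)
    K_star = rbf_kernel(X_star, X) * localization_kernel(X_star, X)
    return K_star @ alpha

# Hypothetical toy data for illustration only.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
X_star = np.linspace(-3, 3, 50)[:, None]
mu = localized_gpr_predict(X, y, X_star)
```

Because pairs farther apart than the taper radius contribute exactly zero, the linear solve only touches the local neighborhood structure, which is consistent with the speedup over global GPR that the abstract attributes to Gram-matrix sparsification.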
Related papers
- Gaussian Primitives for Deformable Image Registration [9.184092856125067]
Experimental results on brain MRI, lung CT, and cardiac MRI datasets demonstrate that GaussianDIR outperforms existing DIR methods in both accuracy and efficiency.
As a training-free approach, it challenges the stereotype that iterative methods are inherently slow and transcends the limitation of poor generalization.
arXiv Detail & Related papers (2024-06-05T15:44:54Z) - Sparse Variational Contaminated Noise Gaussian Process Regression with Applications in Geomagnetic Perturbations Forecasting [4.675221539472143]
We propose a scalable inference algorithm for fitting sparse Gaussian process regression models with contaminated normal noise on large datasets.
We show that our approach yields shorter prediction intervals for similar coverage and accuracy when compared to an artificial dense neural network baseline.
arXiv Detail & Related papers (2024-02-27T15:08:57Z) - Self-Distillation for Gaussian Process Regression and Classification [0.0]
We propose two approaches to extend the notion of knowledge distillation to Gaussian Process Regression (GPR) and Gaussian Process Classification (GPC).
The data-centric approach resembles most current distillation techniques for machine learning and refits a model on deterministic predictions from the teacher.
The distribution-centric approach re-uses the full probabilistic posterior for the next iteration.
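The data-centric variant is concrete enough to sketch. The toy code below (hypothetical data, with scikit-learn's GaussianProcessRegressor standing in for whichever GP implementation the paper uses) illustrates one distillation round: refit a student on the teacher's posterior-mean predictions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical toy data for illustration only.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)

# Data-centric self-distillation, as described in the abstract: fit a
# teacher GP, then refit a student GP of the same form on the teacher's
# deterministic (posterior mean) predictions instead of the noisy targets.
kernel = RBF(length_scale=1.0)
teacher = GaussianProcessRegressor(kernel=kernel, alpha=1e-2).fit(X, y)
y_teacher = teacher.predict(X)  # deterministic teacher predictions
student = GaussianProcessRegressor(kernel=kernel, alpha=1e-2).fit(X, y_teacher)
```

Iterating this loop (the student becomes the next teacher) gives the repeated distillation the abstract alludes to; the distribution-centric variant would instead propagate the full posterior, which this sketch does not attempt.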
arXiv Detail & Related papers (2023-04-05T17:59:20Z) - Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform competing state-of-the-art approaches on a diverse set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z) - Scalable Variational Gaussian Processes via Harmonic Kernel
Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - Local optimization on pure Gaussian state manifolds [63.76263875368856]
We exploit insights into the geometry of bosonic and fermionic Gaussian states to develop an efficient local optimization algorithm.
The method is based on notions of gradient descent attuned to the local geometry.
We use the presented methods to collect numerical and analytical evidence for the conjecture that Gaussian purifications are sufficient to compute the entanglement of purification of arbitrary mixed Gaussian states.
arXiv Detail & Related papers (2020-09-24T18:00:36Z) - Locally induced Gaussian processes for large-scale simulation
experiments [0.0]
We show how the placement of inducing points, and their number, can be thwarted by pathologies.
Our proposed methodology hybridizes global inducing point and data subset-based local GP approximation.
We show that local inducing points improve upon both their global and data-subset component parts along the accuracy versus computational-efficiency frontier.
arXiv Detail & Related papers (2020-08-28T21:37:46Z) - Gaussian Process Regression with Local Explanation [28.90948136731314]
We propose GPR with local explanation, which reveals the feature contributions to the prediction of each sample.
In the proposed model, both the prediction and explanation for each sample are performed using an easy-to-interpret locally linear model.
For a new test sample, the proposed model can predict the values of its target variable and weight vector, as well as their uncertainties.
arXiv Detail & Related papers (2020-07-03T13:22:24Z) - Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
arXiv Detail & Related papers (2020-06-18T14:24:05Z) - Real-Time Regression with Dividing Local Gaussian Processes [62.01822866877782]
Local Gaussian processes are a novel, computationally efficient modeling approach based on Gaussian process regression.
Due to an iterative, data-driven division of the input space, they achieve a sublinear computational complexity in the total number of training points in practice.
A numerical evaluation on real-world data sets shows their advantages over other state-of-the-art methods in terms of accuracy as well as prediction and update speed.
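The division-then-local-fit recipe can be sketched generically; in the code below, k-means is a stand-in for the paper's iterative, data-driven division rule, and all names and parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical toy data for illustration only.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(600, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(600)

# Divide the input space into cells and fit one exact GP per cell;
# each solve then involves only ~N/k points, which is where the
# sublinear practical complexity comes from.
k = 6
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
local_gps = [
    GaussianProcessRegressor(kernel=RBF(), alpha=1e-2).fit(
        X[km.labels_ == j], y[km.labels_ == j]
    )
    for j in range(k)
]

# Route each test point to the GP responsible for its cell.
X_star = np.linspace(-3, 3, 50)[:, None]
cells = km.predict(X_star)
mu = np.array(
    [local_gps[c].predict(x[None, :])[0] for x, c in zip(X_star, cells)]
)
```

A real system would also update the division and the local models incrementally as data arrive, which is the part of the paper this static sketch omits.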
arXiv Detail & Related papers (2020-06-16T18:43:31Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Gaussianization flows are universal approximators, and because of this guaranteed expressivity they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)