Gaussian Process regression over discrete probability measures: on the
non-stationarity relation between Euclidean and Wasserstein Squared
Exponential Kernels
- URL: http://arxiv.org/abs/2212.01310v1
- Date: Fri, 2 Dec 2022 17:09:52 GMT
- Title: Gaussian Process regression over discrete probability measures: on the
non-stationarity relation between Euclidean and Wasserstein Squared
Exponential Kernels
- Authors: Antonio Candelieri, Andrea Ponti, Francesco Archetti
- Abstract summary: A non-stationarity relationship between the Wasserstein-based squared exponential kernel and its Euclidean-based counterpart is studied.
An algebraic transformation is used to turn a Gaussian Process model learned over a Euclidean input space into a non-stationary, Wasserstein-based Gaussian Process model over probability measures.
- Score: 0.19116784879310028
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian Process regression is a kernel method successfully adopted in many
real-life applications. Recently, there has been growing interest in extending this
method to non-Euclidean input spaces, such as the one considered in this paper,
consisting of probability measures. Although a Positive Definite kernel can be
defined using a suitable distance -- the Wasserstein distance -- the common
procedure for learning the Gaussian Process model can fail due to numerical
issues, which arise earlier and more frequently than in the case of a Euclidean
input space and which, as demonstrated in this paper, cannot be avoided by
adding artificial noise (the nugget effect) as is usually done. This paper uncovers
the main reason for these issues: a non-stationarity relationship
between the Wasserstein-based squared exponential kernel and its
Euclidean-based counterpart. As a relevant result, the Gaussian Process model
is learned by assuming the input space is Euclidean, and then an algebraic
transformation, based on the uncovered relation, is used to turn it into a
non-stationary, Wasserstein-based Gaussian Process model over probability
measures. This algebraic transformation is simpler than the log-exp maps used in
the case of data belonging to Riemannian manifolds, recently extended to
consider the pseudo-Riemannian structure of an input space equipped with the
Wasserstein distance.
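
To make the kernel construction concrete, below is a minimal sketch in Python of a Wasserstein-based squared exponential kernel over discrete probability measures, plugged into standard GP regression with a nugget term. This is an illustration under simplifying assumptions, not the authors' implementation: the measures are taken to be empirical distributions on the real line with the same number of equally weighted atoms, so the 2-Wasserstein distance reduces to matching sorted atoms, and all names (`wasserstein2_1d`, `wasserstein_se_kernel`) are chosen here for illustration.

```python
import numpy as np

def wasserstein2_1d(a, b):
    # For two empirical measures on the real line with the same number of
    # equally weighted atoms, W2 is computed by matching sorted atoms.
    return np.sqrt(np.mean((np.sort(a) - np.sort(b)) ** 2))

def wasserstein_se_kernel(measures, lengthscale=1.0, variance=1.0):
    # Squared exponential kernel on the Wasserstein distance:
    # K[i, j] = variance * exp(-W2(mu_i, mu_j)^2 / (2 * lengthscale^2)),
    # the Wasserstein analogue of the Euclidean SE kernel
    # k(x, x') = variance * exp(-||x - x'||^2 / (2 * lengthscale^2)).
    n = len(measures)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(i, n):
            d2 = wasserstein2_1d(measures[i], measures[j]) ** 2
            K[i, j] = K[j, i] = variance * np.exp(-d2 / (2.0 * lengthscale ** 2))
    return K

# Toy data: each "input" is an empirical measure given by 50 atoms.
rng = np.random.default_rng(0)
measures = [rng.normal(loc=m, scale=1.0, size=50) for m in np.linspace(-2, 2, 20)]
y = np.array([np.mean(m) for m in measures])  # a simple functional of each measure

# Standard GP regression with a nugget; as the abstract notes, the nugget
# alone may not cure the ill-conditioning of Wasserstein-based kernels.
K = wasserstein_se_kernel(measures, lengthscale=1.0, variance=1.0)
L = np.linalg.cholesky(K + 1e-6 * np.eye(len(measures)))
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
posterior_mean_at_train = K @ alpha
```

For measures over higher-dimensional spaces, the distance would instead require an optimal transport solver (e.g., from the POT library). Note also that this sketch fits the GP directly in the Wasserstein geometry, which is exactly the procedure the paper identifies as numerically fragile; the proposed remedy is to fit the model in the Euclidean geometry and apply the uncovered algebraic transformation afterwards.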
Related papers
- Intrinsic Gaussian Process Regression Modeling for Manifold-valued Response Variable [6.137918306133745]
We propose a novel intrinsic Gaussian process regression model for manifold-valued data.
We establish the properties of the proposed models, including information consistency and posterior consistency.
Numerical studies, including simulation and real examples, indicate that the proposed methods work well.
arXiv Detail & Related papers (2024-11-28T08:27:59Z)
- Score matching for sub-Riemannian bridge sampling [2.048226951354646]
Recent progress in machine learning can be modified to allow training of score approximators on sub-Riemannian gradients.
We perform numerical experiments exemplifying samples from the bridge process on the Heisenberg group and the concentration of this process for small time.
arXiv Detail & Related papers (2024-04-23T17:45:53Z)
- Deep Horseshoe Gaussian Processes [0.0]
We introduce the deep Horseshoe Gaussian process (Deep-HGP), a simple new prior based on deep Gaussian processes with a squared exponential kernel.
For nonparametric regression with random design, we show that the associated posterior distribution recovers the unknown true regression curve in terms of quadratic loss.
The convergence rates are simultaneously adaptive to both the smoothness of the regression function and to its structure in terms of compositions.
arXiv Detail & Related papers (2024-03-04T05:30:43Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- PDE-constrained Gaussian process surrogate modeling with uncertain data locations [1.943678022072958]
We propose a Bayesian approach that integrates the variability of input data into the Gaussian process regression for function and partial differential equation approximation.
A consistently good performance of generalization is observed, and a substantial reduction in the predictive uncertainties is achieved.
arXiv Detail & Related papers (2023-05-19T10:53:08Z)
- Posterior and Computational Uncertainty in Gaussian Processes [52.26904059556759]
Gaussian processes scale prohibitively with the size of the dataset.
Many approximation methods have been developed, which inevitably introduce approximation error.
This additional source of uncertainty, due to limited computation, is entirely ignored when using the approximate posterior.
We develop a new class of methods that provides consistent estimation of the combined uncertainty arising from both the finite number of data observed and the finite amount of computation expended.
arXiv Detail & Related papers (2022-05-30T22:16:25Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- A Kernel-Based Approach for Modelling Gaussian Processes with Functional Information [0.0]
We use a Gaussian process model to unify the typical finite case with the case of uncountable information.
We discuss this construction in statistical models, including numerical considerations and a proof of concept.
arXiv Detail & Related papers (2022-01-26T15:58:08Z)
- An application of the splitting-up method for the computation of a neural network representation for the solution for the filtering equations [68.8204255655161]
Filtering equations play a central role in many real-life applications, including numerical weather prediction, finance and engineering.
One of the classical approaches to approximate the solution of the filtering equations is to use a PDE inspired method, called the splitting-up method.
We combine this method with a neural network representation to produce an approximation of the unnormalised conditional distribution of the signal process.
arXiv Detail & Related papers (2022-01-10T11:01:36Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided (including all summaries) and is not responsible for any consequences.