Gaussian Process regression over discrete probability measures: on the
non-stationarity relation between Euclidean and Wasserstein Squared
Exponential Kernels
- URL: http://arxiv.org/abs/2212.01310v1
- Date: Fri, 2 Dec 2022 17:09:52 GMT
- Title: Gaussian Process regression over discrete probability measures: on the
non-stationarity relation between Euclidean and Wasserstein Squared
Exponential Kernels
- Authors: Antonio Candelieri, Andrea Ponti, Francesco Archetti
- Abstract summary: A non-stationarity relationship between the Wasserstein-based squared exponential kernel and its Euclidean-based counterpart is studied.
Based on this relationship, an algebraic transformation is used to turn a Gaussian Process model learned on the input space treated as Euclidean into a non-stationary, Wasserstein-based Gaussian Process model.
- Score: 0.19116784879310028
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian Process regression is a kernel method successfully adopted in many
real-life applications. Recently, there has been growing interest in extending this
method to non-Euclidean input spaces, like the one considered in this paper,
consisting of probability measures. Although a Positive Definite kernel can be
defined by using a suitable distance -- the Wasserstein distance -- the common
procedure for learning the Gaussian Process model can fail due to numerical
issues, arising earlier and more frequently than in the case of a Euclidean
input space; as demonstrated in this paper, these issues cannot be avoided by
adding artificial noise (nugget effect) as is usually done. This paper uncovers
the main reason for these issues: a non-stationarity relationship
between the Wasserstein-based squared exponential kernel and its
Euclidean-based counterpart. As a relevant result, the Gaussian Process model
is learned by treating the input space as Euclidean, and then an algebraic
transformation, based on the uncovered relation, is used to turn it into a
non-stationary, Wasserstein-based Gaussian Process model over probability
measures. This algebraic transformation is simpler than the log-exp maps used in
the case of data belonging to Riemannian manifolds, and recently extended to
consider the pseudo-Riemannian structure of an input space equipped with the
Wasserstein distance.
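To make the kernel under discussion concrete, the following is a minimal sketch (not the authors' implementation) of a squared exponential kernel over discrete one-dimensional probability measures, with the Euclidean distance replaced by the 1-Wasserstein distance; the closed-form CDF-based distance, the kernel form, and the length-scale parameter are illustrative assumptions.

```python
import math

def wasserstein_1d(p, q, support):
    """1-Wasserstein distance between two discrete measures p and q
    given as probability weights on a shared, sorted support.
    Uses the closed form W1 = integral of |CDF_p - CDF_q|."""
    cp, cq, total = 0.0, 0.0, 0.0
    for i in range(len(support) - 1):
        cp += p[i]
        cq += q[i]
        # accumulated CDF gap times the spacing to the next support point
        total += abs(cp - cq) * (support[i + 1] - support[i])
    return total

def wasserstein_se_kernel(p, q, support, lengthscale=1.0):
    """Squared exponential kernel with the Wasserstein distance
    substituted for the Euclidean one (illustrative length-scale)."""
    d = wasserstein_1d(p, q, support)
    return math.exp(-(d ** 2) / (2.0 * lengthscale ** 2))

# Two point masses at opposite ends of the support {0, 1, 2}:
# their W1 distance is 2, so the kernel value is exp(-2).
k = wasserstein_se_kernel([1.0, 0.0, 0.0], [0.0, 0.0, 1.0], [0.0, 1.0, 2.0])
```

As the paper notes, Gram matrices built from such a kernel can run into numerical issues during standard GP learning; this sketch only shows the kernel evaluation itself, not the algebraic transformation the authors propose.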
Related papers
- Score matching for sub-Riemannian bridge sampling [2.048226951354646]
Recent machine learning techniques can be adapted to allow training of score approximators on sub-Riemannian gradients.
We perform numerical experiments exemplifying samples from the bridge process on the Heisenberg group and the concentration of this process for small time.
arXiv Detail & Related papers (2024-04-23T17:45:53Z) - Deep Horseshoe Gaussian Processes [1.0742675209112622]
We introduce the deep Horseshoe Gaussian process (Deep-HGP), a new simple prior based on deep Gaussian processes with a squared-exponential kernel.
We show that the associated tempered posterior distribution recovers the unknown true regression curve optimally in terms of quadratic loss, up to a logarithmic factor.
arXiv Detail & Related papers (2024-03-04T05:30:43Z) - Gaussian Process Regression under Computational and Epistemic Misspecification [4.5656369638728656]
In large data applications, computational costs can be reduced using low-rank or sparse approximations of the kernel.
This paper investigates the effect of such kernel approximations on the element error.
arXiv Detail & Related papers (2023-12-14T18:53:32Z) - Posterior Contraction Rates for Matérn Gaussian Processes on
Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z) - Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Posterior and Computational Uncertainty in Gaussian Processes [52.26904059556759]
Gaussian processes scale prohibitively with the size of the dataset.
Many approximation methods have been developed, which inevitably introduce approximation error.
This additional source of uncertainty, due to limited computation, is entirely ignored when using the approximate posterior.
We develop a new class of methods that provides consistent estimation of the combined uncertainty arising from both the finite number of data observed and the finite amount of computation expended.
arXiv Detail & Related papers (2022-05-30T22:16:25Z) - Gaussian Processes and Statistical Decision-making in Non-Euclidean
Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z) - A Kernel-Based Approach for Modelling Gaussian Processes with Functional
Information [0.0]
We use a Gaussian process model to unify the typical finite case with the case of uncountable information.
We discuss this construction in statistical models, including numerical considerations and a proof of concept.
arXiv Detail & Related papers (2022-01-26T15:58:08Z) - An application of the splitting-up method for the computation of a
neural network representation for the solution for the filtering equations [68.8204255655161]
Filtering equations play a central role in many real-life applications, including numerical weather prediction, finance and engineering.
One of the classical approaches to approximate the solution of the filtering equations is to use a PDE inspired method, called the splitting-up method.
We combine this method with a neural network representation to produce an approximation of the unnormalised conditional distribution of the signal process.
arXiv Detail & Related papers (2022-01-10T11:01:36Z) - SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for
Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.