A Kernel-Based Approach for Modelling Gaussian Processes with Functional Information
- URL: http://arxiv.org/abs/2201.11023v1
- Date: Wed, 26 Jan 2022 15:58:08 GMT
- Title: A Kernel-Based Approach for Modelling Gaussian Processes with Functional Information
- Authors: John Nicholson, Peter Kiessler, and D. Andrew Brown
- Abstract summary: We use a Gaussian process model to unify the typical finite-observation case with the case of uncountable (functional) information.
We discuss this construction in statistical models, including numerical considerations and a proof of concept.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian processes are among the most useful tools in modeling continuous
processes in machine learning and statistics. If the value of a process is
known at a finite collection of points, one may use Gaussian processes to
construct a surface which interpolates these values to be used for prediction
and uncertainty quantification in other locations. However, it is not always
the case that the available information is in the form of a finite collection
of points. For example, boundary value problems contain information on the
boundary of a domain, which is an uncountable collection of points that cannot
be incorporated into typical Gaussian process techniques. In this paper we
construct a Gaussian process model which utilizes reproducing kernel Hilbert
spaces to unify the typical finite case with the case of having uncountable
information by exploiting the equivalence of conditional expectation and
orthogonal projections. We discuss this construction in statistical models,
including numerical considerations and a proof of concept.
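To make the setup in the abstract concrete, the sketch below implements the "typical finite case": conditioning a zero-mean Gaussian process prior on point observations to obtain a posterior mean surface and covariance for prediction and uncertainty quantification. The squared-exponential kernel and the function names are illustrative assumptions, not taken from the paper; the paper's contribution is to replace this finite conditioning with an orthogonal projection in the reproducing kernel Hilbert space so that uncountable (e.g. boundary) information can be handled in the same framework.

```python
import numpy as np

def sq_exp_kernel(x, y, lengthscale=0.2, variance=1.0):
    # Squared-exponential kernel k(x, y) on 1-D inputs; an illustrative choice.
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_condition(x_obs, y_obs, x_new, kernel, jitter=1e-8):
    # Posterior mean and covariance of a zero-mean GP given finitely many
    # noiseless observations -- the "typical finite case" from the abstract.
    K_oo = kernel(x_obs, x_obs) + jitter * np.eye(len(x_obs))
    K_no = kernel(x_new, x_obs)
    K_nn = kernel(x_new, x_new)
    alpha = np.linalg.solve(K_oo, y_obs)
    mean = K_no @ alpha
    cov = K_nn - K_no @ np.linalg.solve(K_oo, K_no.T)
    return mean, cov

# Interpolate noiseless observations of sin(2*pi*x) on [0, 1] and predict elsewhere.
x_obs = np.linspace(0.0, 1.0, 8)
y_obs = np.sin(2 * np.pi * x_obs)
x_new = np.linspace(0.0, 1.0, 101)
mean, cov = gp_condition(x_obs, y_obs, x_new, sq_exp_kernel)
```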
Related papers
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Sequential Estimation of Gaussian Process-based Deep State-Space Models [1.760402297380953]
We consider the problem of sequential estimation of the unknowns of state-space and deep state-space models.
We present a method based on particle filtering where the parameters of the random feature-based Gaussian processes are integrated out.
We show that the method can track the latent processes up to a scale and rotation.
arXiv Detail & Related papers (2023-01-29T20:01:09Z)
- Isotropic Gaussian Processes on Finite Spaces of Graphs [71.26737403006778]
We propose a principled way to define Gaussian process priors on various sets of unweighted graphs.
We go further to consider sets of equivalence classes of unweighted graphs and define the appropriate versions of priors thereon.
Inspired by applications in chemistry, we illustrate the proposed techniques on a real molecular property prediction task in the small data regime.
arXiv Detail & Related papers (2022-11-03T10:18:17Z)
- Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces I: the compact case [43.877478563933316]
Invariance to symmetries is one of the most fundamental forms of prior information one can consider.
In this work, we develop constructive and practical techniques for building stationary Gaussian processes on a very large class of non-Euclidean spaces.
arXiv Detail & Related papers (2022-08-31T16:40:40Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
- Advanced Stationary and Non-Stationary Kernel Designs for Domain-Aware Gaussian Processes [0.0]
We propose advanced kernel designs that only allow for functions with certain desirable characteristics to be elements of the reproducing kernel Hilbert space (RKHS).
We will show the impact of advanced kernel designs on Gaussian processes using several synthetic and two scientific data sets.
arXiv Detail & Related papers (2021-02-05T22:07:56Z)
- Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings; a sketch of the graph construction is given after this list.
arXiv Detail & Related papers (2020-10-29T13:08:07Z)
- Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z)
- Convergence Guarantees for Gaussian Process Means With Misspecified Likelihoods and Smoothness [0.7734726150561089]
We study the properties of Gaussian process means when the smoothness of the model and the likelihood function are misspecified.
The answer to this question is particularly useful, since it can inform our choice of model and experimental design.
arXiv Detail & Related papers (2020-01-29T13:28:27Z)
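As a companion to the Matérn-on-graphs entry above, here is a minimal sketch of the spectral construction commonly used for such kernels: apply a Matérn-type filter to the eigenvalues of the graph Laplacian. The parameter names, normalization, and exact scaling are illustrative assumptions rather than the cited paper's precise conventions.

```python
import numpy as np

def graph_matern_kernel(L, nu=1.5, kappa=1.0, sigma2=1.0):
    # Matérn-type covariance on a graph, built by filtering the spectrum of the
    # graph Laplacian L. The filter (2*nu/kappa**2 + lambda)**(-nu) is the usual
    # spectral analogue of the Euclidean Matérn kernel; scaling conventions vary.
    evals, evecs = np.linalg.eigh(L)
    phi = (2.0 * nu / kappa**2 + evals) ** (-nu)
    K = (evecs * phi) @ evecs.T
    return sigma2 * K / np.mean(np.diag(K))  # normalize the average variance to sigma2

# Example: combinatorial Laplacian of a path graph on 5 nodes.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)   # adjacency matrix
L = np.diag(A.sum(axis=1)) - A                          # degree matrix minus adjacency
K = graph_matern_kernel(L)
```

A Gaussian process with this covariance can then be conditioned on observed node values exactly as in the finite-case sketch after the abstract above.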