Advanced Stationary and Non-Stationary Kernel Designs for Domain-Aware
Gaussian Processes
- URL: http://arxiv.org/abs/2102.03432v1
- Date: Fri, 5 Feb 2021 22:07:56 GMT
- Title: Advanced Stationary and Non-Stationary Kernel Designs for Domain-Aware
Gaussian Processes
- Authors: Marcus M. Noack and James A. Sethian
- Abstract summary: We propose advanced kernel designs that only allow for functions with certain desirable characteristics to be elements of the reproducing kernel Hilbert space (RKHS).
We will show the impact of advanced kernel designs on Gaussian processes using several synthetic and two scientific data sets.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian process regression is a widely-applied method for function
approximation and uncertainty quantification. The technique has gained
popularity recently in the machine learning community due to its robustness and
interpretability. The mathematical methods we discuss in this paper are an
extension of the Gaussian-process framework. We are proposing advanced kernel
designs that only allow for functions with certain desirable characteristics to
be elements of the reproducing kernel Hilbert space (RKHS) that underlies all
kernel methods and serves as the sample space for Gaussian process regression.
These desirable characteristics reflect the underlying physics; two obvious
examples are symmetry and periodicity constraints. In addition, non-stationary
kernel designs can be defined in the same framework to yield flexible
multi-task Gaussian processes. We will show the impact of advanced kernel
designs on Gaussian processes using several synthetic and two scientific data
sets. The results show that including domain knowledge, communicated through
advanced kernel designs, has a significant impact on the accuracy and relevance
of the function approximation.
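To make the constraint idea concrete: one standard way to restrict the RKHS is to build the constraint into the kernel itself, e.g. symmetrizing a base kernel so that every admissible function satisfies f(x) = f(-x), or using a periodic kernel so that sample paths repeat with a fixed period. The sketch below is a minimal toy illustration of that construction, not the authors' implementation; the squared-exponential base kernel, the hyperparameter values, and the toy data are assumptions made for this example.

```python
import numpy as np

def rbf(x1, x2, length=1.0):
    """Squared-exponential base kernel on 1-D inputs (illustrative choice)."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / length ** 2)

def symmetric_kernel(x1, x2, length=1.0):
    """Symmetrized kernel: every function in its RKHS satisfies f(x) = f(-x)."""
    return (rbf(x1, x2, length) + rbf(-x1, x2, length)
            + rbf(x1, -x2, length) + rbf(-x1, -x2, length))

def periodic_kernel(x1, x2, period=2.0, length=1.0):
    """Standard periodic kernel; sample paths repeat with the given period."""
    d = np.abs(x1[:, None] - x2[None, :])
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / length ** 2)

def gp_posterior_mean(kernel, x_train, y_train, x_test, noise=1e-3):
    """Exact GP regression posterior mean under the chosen kernel."""
    K = kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

# Toy data: a symmetric target f(x) = cos(x), observed only at positive x.
x_train = np.linspace(0.5, 3.0, 8)
y_train = np.cos(x_train)
x_test = np.linspace(-3.0, 3.0, 7)

# The symmetric kernel extrapolates to negative x without any data there.
print(gp_posterior_mean(symmetric_kernel, x_train, y_train, x_test))
```

Because the symmetrized kernel assigns zero prior variance to f(x) - f(-x), the posterior at negative inputs mirrors the fit obtained on positive inputs even though no observations were made there.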
Related papers
- Posterior Contraction Rates for Matérn Gaussian Processes on
Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z) - Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces II: non-compact symmetric spaces [43.877478563933316]
Invariance to symmetries is one of the most fundamental forms of prior information one can consider.
In this work, we develop constructive and practical techniques for building stationary Gaussian processes on a very large class of non-Euclidean spaces.
arXiv Detail & Related papers (2023-01-30T17:27:12Z) - Stationary Kernels and Gaussian Processes on Lie Groups and their Homogeneous Spaces I: the compact case [43.877478563933316]
Invariance to symmetries is one of the most fundamental forms of prior information one can consider.
In this work, we develop constructive and practical techniques for building stationary Gaussian processes on a very large class of non-Euclidean spaces.
arXiv Detail & Related papers (2022-08-31T16:40:40Z) - Learning "best" kernels from data in Gaussian process regression. With
application to aerodynamics [0.4588028371034406]
We introduce algorithms to select/design kernels in Gaussian process regression/kriging surrogate modeling techniques.
A first class of algorithms is kernel flow, which was introduced in the context of classification in machine learning.
A second class of algorithms is called spectral kernel ridge regression, and aims at selecting a "best" kernel such that the norm of the function to be approximated is minimal.
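As a concrete reading of the kernel-flow idea summarized above: the criterion compares the RKHS norm of the optimal interpolant built from a random half of the data to that of the interpolant built from all of the data, rho = 1 - ||u_half||^2 / ||u_full||^2, and prefers kernels for which rho is small. The numpy sketch below is a toy illustration under an assumed RBF kernel with a single lengthscale and a crude grid search; it does not reproduce the paper's algorithms (kernel flow with stochastic gradient updates, spectral kernel ridge regression).

```python
import numpy as np

def rbf(x1, x2, length):
    """Squared-exponential kernel matrix for 1-D inputs (illustrative choice)."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / length ** 2)

def kernel_flow_rho(x, y, length, rng, jitter=1e-6):
    """Kernel-flow criterion rho = 1 - ||u_half||^2 / ||u_full||^2, where both
    interpolant norms are taken in the RKHS of the candidate kernel.  A small
    rho means the kernel generalizes well from half of the data to all of it."""
    n = len(x)
    idx = rng.choice(n, size=n // 2, replace=False)            # random half
    K_full = rbf(x, x, length) + jitter * np.eye(n)
    K_half = rbf(x[idx], x[idx], length) + jitter * np.eye(len(idx))
    norm_full = y @ np.linalg.solve(K_full, y)
    norm_half = y[idx] @ np.linalg.solve(K_half, y[idx])
    return 1.0 - norm_half / norm_full

# Toy data and a crude grid search over the lengthscale (kernel flow proper
# uses stochastic gradient steps instead; this is only a sketch).
rng = np.random.default_rng(0)
x = np.linspace(0, 6, 40)
y = np.sin(2 * x) + 0.05 * rng.standard_normal(40)
lengths = [0.05, 0.2, 0.5, 1.0, 3.0]
rhos = [np.mean([kernel_flow_rho(x, y, l, rng) for _ in range(20)]) for l in lengths]
print(dict(zip(lengths, np.round(rhos, 3))))
```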
arXiv Detail & Related papers (2022-06-03T07:50:54Z) - Gaussian Processes and Statistical Decision-making in Non-Euclidean
Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z) - Improved Random Features for Dot Product Kernels [12.321353062415701]
We make several novel contributions for improving the efficiency of random feature approximations for dot product kernels.
We show empirically that the use of complex features can significantly reduce the variances of these approximations.
We develop a data-driven optimization approach to improve random feature approximations for general dot product kernels.
arXiv Detail & Related papers (2022-01-21T14:16:56Z) - Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge
Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z) - Scalable Variational Gaussian Processes via Harmonic Kernel
Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings.
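For context, the differential-equation characterization mentioned above yields a graph Matérn covariance of the form (2ν/κ² I + L)^(−ν), with L the graph Laplacian, which can be evaluated through the Laplacian's eigendecomposition. The sketch below does this for a small path graph; the graph, the hyperparameter values, and the variance normalization are illustrative assumptions rather than the paper's code.

```python
import numpy as np

def graph_laplacian(adjacency):
    """Unnormalized graph Laplacian L = D - A."""
    return np.diag(adjacency.sum(axis=1)) - adjacency

def graph_matern_covariance(adjacency, nu=1.5, kappa=1.0, sigma2=1.0):
    """Graph Matérn covariance (2*nu/kappa^2 * I + L)^(-nu), built from the
    eigendecomposition of the Laplacian and rescaled to unit average variance."""
    L = graph_laplacian(adjacency)
    eigvals, eigvecs = np.linalg.eigh(L)
    spectrum = (2.0 * nu / kappa ** 2 + eigvals) ** (-nu)
    K = eigvecs @ np.diag(spectrum) @ eigvecs.T
    return sigma2 * K / np.mean(np.diag(K))   # illustrative normalization

# Small path graph with 5 nodes: 0 - 1 - 2 - 3 - 4.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
K = graph_matern_covariance(A)
print(np.round(K, 3))   # nearby nodes are more strongly correlated
```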
arXiv Detail & Related papers (2020-10-29T13:08:07Z) - Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z)