Hida-Matérn Kernel
- URL: http://arxiv.org/abs/2107.07098v1
- Date: Thu, 15 Jul 2021 03:25:10 GMT
- Title: Hida-Matérn Kernel
- Authors: Matthew Dowling, Piotr Sokół, Il Memming Park
- Abstract summary: We present the class of Hida-Matérn kernels, which is the canonical family of covariance functions over the entire space of stationary Gauss-Markov processes.
We show how to represent such processes as state space models using only the kernel and its derivatives.
We also show how exploiting special properties of the state space representation enables improved numerical stability in addition to further reductions of computational complexity.
- Score: 8.594140167290098
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present the class of Hida-Matérn kernels, the canonical family
of covariance functions over the entire space of stationary Gauss-Markov
processes. It extends the Matérn kernels by allowing for flexible construction
of priors over processes with oscillatory components. Every stationary kernel,
including the widely used squared-exponential and spectral mixture kernels, is
either directly a member of this class or an appropriate asymptotic limit,
demonstrating the generality of this class. Taking advantage of their Markovian
nature, we show how to represent such processes as state space models using
only the kernel and its derivatives. In turn, this allows us to perform
Gaussian Process inference more efficiently and sidestep the usual
computational burdens.
computational burdens. We also show how exploiting special properties of the
state space representation enables improved numerical stability in addition to
further reductions of computational complexity.
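
The state-space construction in the abstract can be made concrete with the simplest Markovian member of the family. Below is a minimal sketch (not the paper's implementation) of the standard state-space form of the Matérn-3/2 kernel and O(n) GP regression via a Kalman filter; function names and parameter choices are illustrative assumptions.

```python
# Sketch: Matern-3/2 GP as a 2-dimensional linear SDE, inference in O(n)
# via Kalman filtering. Illustrative, not the paper's Hida-Matern code.
import numpy as np
from scipy.linalg import expm

def matern32_state_space(lengthscale, variance):
    """Continuous-time state-space form of the Matern-3/2 kernel."""
    lam = np.sqrt(3.0) / lengthscale
    F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])  # drift matrix
    H = np.array([[1.0, 0.0]])                         # observe f(t) only
    Pinf = np.diag([variance, lam**2 * variance])      # stationary covariance
    return F, H, Pinf

def kalman_loglik(t, y, lengthscale, variance, noise_var):
    """Log marginal likelihood of GP regression in linear time."""
    F, H, Pinf = matern32_state_space(lengthscale, variance)
    m, P = np.zeros((2, 1)), Pinf.copy()
    ll, t_prev = 0.0, t[0]
    for tk, yk in zip(t, y):
        A = expm(F * (tk - t_prev))         # discrete transition
        Q = Pinf - A @ Pinf @ A.T           # exactly discretized process noise
        m, P = A @ m, A @ P @ A.T + Q
        v = yk - (H @ m).item()             # innovation
        s = (H @ P @ H.T).item() + noise_var
        k = P @ H.T / s                     # Kalman gain
        m, P = m + k * v, P - k @ (H @ P)
        ll += -0.5 * (np.log(2 * np.pi * s) + v**2 / s)
        t_prev = tk
    return ll

t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.1 * np.random.randn(200)
print(kalman_loglik(t, y, lengthscale=1.0, variance=1.0, noise_var=0.01))
```

The same recipe generalizes: any kernel in the Hida-Matérn class admits such a finite-dimensional state, which is what makes the linear-time inference in the abstract possible.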
Related papers
- Matérn Kernels for Tunable Implicit Surface Reconstruction [5.8691349601057325]
Matérn kernels have some appealing properties which make them particularly well suited for surface reconstruction.
Because the kernels are stationary, we demonstrate that their spectrum can be tuned in the same fashion as feature mappings.
We analyze the Matérn kernel's connection to SIREN networks and its relation to previously employed arc-cosine kernels.
arXiv Detail & Related papers (2024-09-23T18:45:42Z)
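
For reference, the "tuning knob" of the Matérn family is its smoothness parameter. A minimal sketch of the common half-integer closed forms follows; the surface-reconstruction pipeline itself is not reproduced, and the function name is an assumption.

```python
# Sketch: the stationary Matern family at half-integer smoothness values.
import numpy as np

def matern(r, lengthscale=1.0, nu=1.5):
    """Matern covariance k(r) for nu in {0.5, 1.5, 2.5}."""
    s = np.sqrt(2 * nu) * np.abs(r) / lengthscale
    if nu == 0.5:
        poly = 1.0                  # exponential kernel, rough samples
    elif nu == 1.5:
        poly = 1.0 + s              # once-differentiable samples
    elif nu == 2.5:
        poly = 1.0 + s + s**2 / 3.0
    else:
        raise ValueError("sketch covers half-integer nu only")
    return poly * np.exp(-s)

r = np.linspace(0, 3, 5)
for nu in (0.5, 1.5, 2.5):
    print(nu, matern(r, nu=nu))
```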
- Reconstructing Kernel-based Machine Learning Force Fields with Super-linear Convergence [0.18416014644193063]
We consider the broad class of Nyström-type methods to construct preconditioners.
All considered methods aim to identify a representative subset of inducing (kernel) columns to approximate the dominant kernel spectrum.
arXiv Detail & Related papers (2022-12-24T13:45:50Z)
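
As a rough illustration of the idea (a simplifying sketch, not the paper's exact method), the following builds a Nyström approximation from a random subset of kernel columns and applies the inverse of the resulting preconditioner via the Woodbury identity; the RBF kernel and uniform column sampling are assumptions.

```python
# Sketch: Nystrom-type preconditioner K ~ C W^{-1} C^T from m sampled columns.
import numpy as np

def rbf(X, Z, ell=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def nystrom_preconditioner(X, m, sigma2, ell=1.0):
    """Return a function applying (C W^{-1} C^T + sigma2*I)^{-1} via Woodbury."""
    n = len(X)
    idx = np.random.choice(n, size=m, replace=False)  # inducing columns
    C = rbf(X, X[idx], ell)                           # n x m
    W = C[idx]                                        # m x m (add jitter if needed)
    # Woodbury: (C W^{-1} C^T + s*I)^{-1} = (I - C (s*W + C^T C)^{-1} C^T) / s
    M = sigma2 * W + C.T @ C
    return lambda v: (v - C @ np.linalg.solve(M, C.T @ v)) / sigma2

X = np.random.randn(300, 2)
apply_Pinv = nystrom_preconditioner(X, m=30, sigma2=0.1)
z = apply_Pinv(np.random.randn(300))  # e.g. inside conjugate gradients
```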
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Random Gegenbauer Features for Scalable Kernel Methods [11.370390549286757]
We propose efficient random features for approximating a new and rich class of kernel functions that we refer to as Generalized Zonal Kernels (GZK).
Our proposed GZK family generalizes the zonal kernels by introducing factors in their Gegenbauer series expansion.
We show that our proposed features outperform recent kernel approximation methods.
arXiv Detail & Related papers (2022-02-07T19:30:36Z)
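
The Gegenbauer construction is specific to that paper, but the general random-features recipe it builds on can be sketched with the classic random Fourier features of Rahimi and Recht; all names below are illustrative, and this approximates an RBF kernel, not the GZK class.

```python
# Sketch: random Fourier features, z(x)^T z(y) ~ k(x - y) for shift-invariant k.
import numpy as np

def rff_features(X, num_features, ell=1.0, seed=0):
    """Random Fourier features approximating an RBF kernel with lengthscale ell."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / ell, size=(d, num_features))  # spectral samples
    b = rng.uniform(0, 2 * np.pi, size=num_features)         # random phases
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

X = np.random.randn(5, 3)
Z = rff_features(X, num_features=2000)
approx = Z @ Z.T   # ~ exp(-||x - y||^2 / (2 ell^2)) entrywise
```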
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
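
The underlying objects are easy to sketch: the empirical kernel mean embedding of a sample and the MMD distance between two embeddings. The paper's sum-of-squares density optimization is not reproduced here.

```python
# Sketch: empirical kernel mean embeddings and the (biased) squared MMD.
import numpy as np

def rbf(X, Z, ell=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def mmd2(X, Y, ell=1.0):
    """||mu_P - mu_Q||^2 in the RKHS, estimated from samples X ~ P, Y ~ Q."""
    return (rbf(X, X, ell).mean()
            - 2 * rbf(X, Y, ell).mean()
            + rbf(Y, Y, ell).mean())

X = np.random.randn(200, 2)
Y = np.random.randn(200, 2) + 1.0
print(mmd2(X, Y))  # large when P and Q differ; ~0 when they match
```

When the kernel is characteristic (as the summary assumes), the embedding mu_P determines P uniquely, so MMD is a proper metric on distributions.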
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Advanced Stationary and Non-Stationary Kernel Designs for Domain-Aware Gaussian Processes [0.0]
We propose advanced kernel designs that only allow functions with certain desirable characteristics to be elements of the reproducing kernel Hilbert space (RKHS).
We show the impact of these advanced kernel designs on Gaussian processes using several synthetic and two scientific data sets.
arXiv Detail & Related papers (2021-02-05T22:07:56Z)
- Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings; a sketch of the construction follows.
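
On a graph, the PDE characterization reduces to a spectral filter of the graph Laplacian, K = Phi diag((2*nu/kappa^2 + lambda_i)^(-nu)) Phi^T over Laplacian eigenpairs. A minimal sketch under that formula, with normalization and parameter names as assumptions:

```python
# Sketch: graph Matern covariance from the graph Laplacian's eigendecomposition.
import numpy as np

def graph_matern_kernel(L, nu=1.0, kappa=1.0):
    """Matern covariance over graph nodes, built from the Laplacian L."""
    evals, evecs = np.linalg.eigh(L)                # Laplacian eigenpairs
    spec = (2.0 * nu / kappa**2 + evals) ** (-nu)   # Matern spectral filter
    return (evecs * spec) @ evecs.T                 # Phi diag(spec) Phi^T

# Path graph on 4 nodes: L = D - A
A = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], float)
L = np.diag(A.sum(1)) - A
K = graph_matern_kernel(L, nu=1.5, kappa=2.0)
```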
arXiv Detail & Related papers (2020-10-29T13:08:07Z)
- Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z)
- Sparse Gaussian Processes via Parametric Families of Compactly-supported Kernels [0.6091702876917279]
We propose a method for deriving parametric families of kernel functions with compact support.
The parameters of this family of kernels can be learned from data using maximum likelihood estimation.
We show that these approximations incur minimal error over the exact models when modeling data drawn directly from a target GP.
arXiv Detail & Related papers (2020-06-05T20:44:09Z)
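
Compact support is what yields the sparsity: pairs of points separated by more than the support radius contribute exact zeros to the Gram matrix. A minimal sketch using the classic Wendland phi_{3,1} kernel as a stand-in (the paper derives its own parametric families, not reproduced here):

```python
# Sketch: a compactly supported kernel gives a sparse Gram matrix.
import numpy as np

def wendland31(r, support=1.0):
    """Wendland C^2 kernel, positive definite in up to 3 dims: (1-s)^4_+ (4s+1)."""
    s = np.abs(r) / support
    return np.clip(1.0 - s, 0.0, None) ** 4 * (4.0 * s + 1.0)

x = np.linspace(0, 10, 500)[:, None]
r = np.abs(x - x.T)                    # pairwise distances
K = wendland31(r, support=1.5)
print(f"Gram sparsity: {(K == 0).mean():.0%} exact zeros")
```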