Orthonormal Expansions for Translation-Invariant Kernels
- URL: http://arxiv.org/abs/2206.08648v4
- Date: Tue, 30 Apr 2024 13:55:10 GMT
- Title: Orthonormal Expansions for Translation-Invariant Kernels
- Authors: Filip Tronarp, Toni Karvonen
- Abstract summary: We derive explicit expansions on the real line for (i) Matérn kernels of all half-integer orders in terms of associated Laguerre functions, (ii) the Cauchy kernel in terms of rational functions, and (iii) the Gaussian kernel in terms of Hermite functions.
- Score: 8.646318875448644
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a general Fourier analytic technique for constructing orthonormal basis expansions of translation-invariant kernels from orthonormal bases of $\mathscr{L}_2(\mathbb{R})$. This allows us to derive explicit expansions on the real line for (i) Matérn kernels of all half-integer orders in terms of associated Laguerre functions, (ii) the Cauchy kernel in terms of rational functions, and (iii) the Gaussian kernel in terms of Hermite functions.
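A minimal numerical sketch of the recipe (Python with NumPy/SciPy; the frequency grid, truncation level, and spectral normalisation below are illustrative assumptions, not the paper's closed-form expansion): write $k(x-y) = \langle h_x, h_y\rangle$ for $h_x(\omega) = \sqrt{p(\omega)}\,e^{-i\omega x}$, where $p$ is the spectral density, expand $h_x$ in the orthonormal Hermite basis of $\mathscr{L}_2(\mathbb{R})$, and recover the kernel from the coefficients via Parseval's identity.

```python
# Numerical sketch (not the paper's closed-form result): expand
# h_x(w) = sqrt(p(w)) * exp(-i*w*x) in the orthonormal Hermite basis of
# L^2(R); Parseval then gives k(x - y) = sum_n c_n(x) * conj(c_n(y)).
import numpy as np
from math import pi, factorial
from scipy.special import eval_hermite

def hermite_function(n, w):
    """Orthonormal Hermite function psi_n(w) on the real line."""
    norm = 1.0 / np.sqrt(2.0**n * factorial(n) * np.sqrt(pi))
    return norm * eval_hermite(n, w) * np.exp(-w**2 / 2.0)

# Gaussian kernel k(r) = exp(-r^2/2), spectral density normalised so that
# k(r) = \int p(w) exp(i*w*r) dw.
k = lambda r: np.exp(-r**2 / 2.0)
p = lambda w: np.exp(-w**2 / 2.0) / np.sqrt(2.0 * pi)

w = np.linspace(-12.0, 12.0, 4001)   # frequency grid (illustrative choice)
dw = w[1] - w[0]
x = np.linspace(-2.0, 2.0, 9)        # evaluation points

def coeff(n):
    """c_n(x) = \int sqrt(p(w)) psi_n(w) exp(-i*w*x) dw, Riemann-sum quadrature."""
    integrand = np.sqrt(p(w)) * hermite_function(n, w) * np.exp(-1j * np.outer(x, w))
    return integrand.sum(axis=-1) * dw

N = 25                                                   # truncation level (assumption)
C = np.stack([coeff(n) for n in range(N)])               # shape (N, len(x))
K_series = np.einsum("ni,nj->ij", C, C.conj()).real      # truncated expansion
K_exact = k(x[:, None] - x[None, :])
print(np.abs(K_series - K_exact).max())                  # small truncation/quadrature error
```

The same numerical check applies with the Matérn or Cauchy spectral density in place of `p`; those are the cases where the Laguerre and rational-function expansions of the paper enter.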
Related papers
- Gaussian kernel expansion with basis functions uniformly bounded in $\mathcal{L}_{\infty}$ [0.6138671548064355]
Kernel expansions are a topic of considerable interest in machine learning.
Recent work in the literature has derived some of these results by assuming uniformly bounded basis functions in $\mathcal{L}_{\infty}$.
Our main result is the construction on $\mathbb{R}^2$ of a Gaussian kernel expansion with weights in $\ell_p$ for any $p>1$.
arXiv Detail & Related papers (2024-10-02T10:10:30Z)
- High-Dimensional Kernel Methods under Covariate Shift: Data-Dependent Implicit Regularization [83.06112052443233]
This paper studies kernel ridge regression in high dimensions under covariate shifts.
By a bias-variance decomposition, we theoretically demonstrate that the re-weighting strategy allows for decreasing the variance.
For the bias, we analyze regularization at arbitrary or well-chosen scales, showing that the bias can behave very differently under different regularization scales.
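As a hedged illustration of the re-weighting strategy, here is a generic importance-weighted kernel ridge regression sketch; the Gaussian kernel, the assumed-known densities, and the regularization level are placeholders, not the paper's estimator or scale choices.

```python
# Sketch of importance-weighted kernel ridge regression: training losses
# are re-weighted by w_i ~ p_test(x_i) / p_train(x_i) to correct for
# covariate shift.  Not the paper's estimator; kernel and lambda are
# illustrative.
import numpy as np

rng = np.random.default_rng(0)
f_true = lambda x: np.sin(3.0 * x)

x_train = rng.normal(0.0, 1.0, size=200)             # p_train = N(0, 1)
y_train = f_true(x_train) + 0.1 * rng.standard_normal(200)
x_test = rng.normal(1.0, 0.5, size=100)              # p_test = N(1, 0.25)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Likelihood-ratio weights (densities assumed known here).
w = gauss_pdf(x_train, 1.0, 0.5) / gauss_pdf(x_train, 0.0, 1.0)

def kernel(a, b, ell=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

lam = 1e-2
K = kernel(x_train, x_train)
W = np.diag(w)
# Weighted KRR solves min_f sum_i w_i (y_i - f(x_i))^2 + lam * ||f||_H^2;
# in dual form: alpha = (W K + lam I)^{-1} W y.
alpha = np.linalg.solve(W @ K + lam * np.eye(len(x_train)), w * y_train)
y_pred = kernel(x_test, x_train) @ alpha
print(np.mean((y_pred - f_true(x_test)) ** 2))       # test MSE
```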
arXiv Detail & Related papers (2024-06-05T12:03:27Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Random Gegenbauer Features for Scalable Kernel Methods [11.370390549286757]
We propose efficient random features for approximating a new and rich class of kernel functions that we refer to as Generalized Zonal Kernels (GZK).
Our proposed GZK family generalizes the zonal kernels by introducing factors in their Gegenbauer series expansion.
We show that our proposed features outperform recent kernel approximation methods.
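A hedged sketch of the underlying Gegenbauer series: a truncated zonal kernel $k(x, y) = \sum_\ell b_\ell C_\ell^{\alpha}(\langle x, y\rangle)$ evaluated on the unit sphere. The coefficients $b_\ell$ and the dimension below are illustrative choices, not the GZK factors or the proposed random features.

```python
# Zonal-kernel Gegenbauer series on the unit sphere S^{d-1}:
# k(x, y) = sum_l b_l * C_l^alpha(<x, y>), with alpha = (d - 2) / 2.
# The coefficients b_l are illustrative, not the GZK construction.
import numpy as np
from scipy.special import eval_gegenbauer

d = 4                       # ambient dimension (assumption)
alpha = (d - 2) / 2.0
L = 20                      # truncation level (assumption)
b = 0.5 ** np.arange(L)     # nonnegative, summable coefficients

def zonal_kernel(X, Y):
    """Truncated Gegenbauer expansion evaluated at pairwise inner products."""
    T = X @ Y.T                                   # <x_i, y_j> in [-1, 1]
    return sum(b[l] * eval_gegenbauer(l, alpha, T) for l in range(L))

rng = np.random.default_rng(0)
X = rng.standard_normal((5, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)     # project onto the sphere
K = zonal_kernel(X, X)
# Nonnegative Gegenbauer coefficients give a PSD zonal kernel (Schoenberg),
# so the smallest eigenvalue should be nonnegative up to rounding.
print(np.linalg.eigvalsh(K).min())
```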
arXiv Detail & Related papers (2022-02-07T19:30:36Z)
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
- Generalization Properties of Stochastic Optimizers via Trajectory Analysis [48.38493838310503]
We show that both the Fernique-Talagrand functional and the local power law are predictive of generalization performance.
arXiv Detail & Related papers (2021-08-02T10:58:32Z)
- Hida-Matérn Kernel [8.594140167290098]
We present the class of Hida-Matérn kernels, which is the canonical family of covariance functions over the entire space of stationary Gauss-Markov processes.
We show how to represent such processes as state space models using only the kernel and its derivatives.
We also show how exploiting special properties of the state space representation enables improved numerical stability in addition to further reductions of computational complexity.
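For context, a minimal sketch of the classical Matérn-3/2 state-space representation (in the style of Hartikainen and Särkkä), which the Hida-Matérn family generalizes; the hyperparameters below are illustrative, and this is not the paper's general construction.

```python
# Matern-3/2 GP as a linear SDE dx = F x dt + L dW with stationary
# covariance Pinf; the GP is the first state component.  Verifies that
# the state-space lag covariance reproduces the Matern-3/2 kernel.
import numpy as np
from scipy.linalg import expm

sigma2, ell = 1.0, 0.7          # illustrative hyperparameters
lam = np.sqrt(3.0) / ell

F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])
Pinf = np.diag([sigma2, lam**2 * sigma2])

def k_state_space(tau):
    """Stationary cross-covariance E[x(t + tau) x(t)^T][0, 0] for tau >= 0."""
    return (expm(F * tau) @ Pinf)[0, 0]

def k_matern32(tau):
    return sigma2 * (1.0 + lam * tau) * np.exp(-lam * tau)

for tau in [0.0, 0.3, 1.0, 2.5]:
    print(tau, k_state_space(tau), k_matern32(tau))   # the two should agree
```

The computational point of the summary above is that such state-space forms admit linear-time Kalman-filter-style inference instead of cubic-cost kernel matrix factorizations.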
arXiv Detail & Related papers (2021-07-15T03:25:10Z)
- A probabilistic Taylor expansion with Gaussian processes [8.840147522046651]
We study a class of Gaussian processes for which the posterior mean, for a particular choice of data, replicates a truncated Taylor expansion of any order.
We discuss and prove some results on maximum likelihood estimation of parameters of Taylor kernels.
arXiv Detail & Related papers (2021-02-01T14:36:34Z)
- Curvature-Dependant Global Convergence Rates for Optimization on Manifolds of Bounded Geometry [6.85316573653194]
We give curvature-dependant convergence rates for weakly convex functions defined on a manifold of 1-bounded geometry.
We compute these bounds explicitly for some manifolds commonly used in the optimization literature.
We present self-contained proofs of fully general bounds on the norm of the differential of the exponential map.
arXiv Detail & Related papers (2020-08-06T08:30:35Z)
- Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
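A hedged sketch of this spectral construction on the simplest compact manifold, the circle $S^1$, where the Laplacian eigenvalues are $n^2$ and the eigenfunctions are Fourier modes; the smoothness $\nu$, length-scale parameter $\kappa$, and truncation below are illustrative assumptions.

```python
# Matern-style kernel on the circle S^1 built from the Laplacian spectrum:
# k ~ sum_n (2 nu / kappa^2 + lam_n)^{-nu - d/2} phi_n(x) phi_n(x'), which
# collapses to a cosine series in the angle difference.  Parameters are
# illustrative, not the paper's recommended settings.
import numpy as np

nu, kappa, n_max = 1.5, 0.5, 500
d = 1                                     # dimension of S^1

n = np.arange(0, n_max + 1)
spec = (2.0 * nu / kappa**2 + n.astype(float) ** 2) ** (-nu - d / 2.0)
spec[1:] *= 2.0                           # cos and sin pairs for each n >= 1

def k_circle(delta):
    """Stationary kernel on S^1 as a function of the angle difference."""
    vals = spec @ np.cos(np.outer(n, np.atleast_1d(delta)))
    return vals / spec.sum()              # normalise so that k(0) = 1

print(k_circle(np.array([0.0, 0.5, np.pi])))  # decays with geodesic distance
```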
arXiv Detail & Related papers (2020-06-17T21:05:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.