An approach to the Gaussian RBF kernels via Fock spaces
- URL: http://arxiv.org/abs/2210.14167v1
- Date: Tue, 25 Oct 2022 16:59:00 GMT
- Title: An approach to the Gaussian RBF kernels via Fock spaces
- Authors: Daniel Alpay, Fabrizio Colombo, Kamal Diki, Irene Sabadini
- Abstract summary: We use methods from the Fock space and Segal-Bargmann theories to prove several results on the Gaussian RBF kernel.
We show how the RBF kernels can be related to some of the most widely used operators in quantum mechanics and time-frequency analysis.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We use methods from the Fock space and Segal-Bargmann theories to prove
several results on the Gaussian RBF kernel in complex analysis. The latter is
one of the most widely used kernels in modern machine learning kernel methods, and
in support vector machine (SVM) classification algorithms. Complex analysis
techniques allow us to consider several notions linked to the RBF kernels like
the feature space and the feature map, using the so-called Segal-Bargmann
transform. We also show how the RBF kernels can be related to some of the most
widely used operators in quantum mechanics and time-frequency analysis;
specifically, we prove the connections of such kernels with the creation,
annihilation, Fourier, translation, modulation and Weyl operators. For the
Weyl operators, we also study a semigroup property.
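For reference, the Gaussian RBF kernel discussed above can be sketched as follows. The bandwidth name `sigma` and the factor-of-2 convention in the exponent are assumptions; the paper may normalize the kernel differently.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel: k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    # Pairwise squared Euclidean distances between rows of X and Y.
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

X = np.array([[0.0, 0.0], [1.0, 0.0]])
K = rbf_kernel(X, X, sigma=1.0)
# K is a symmetric positive semi-definite Gram matrix with unit diagonal.
```

The feature map realized by the Segal-Bargmann transform is infinite-dimensional; the Gram matrix above is the finite object one actually computes in kernel methods.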
Related papers
- Fast Evaluation of Additive Kernels: Feature Arrangement, Fourier Methods, and Kernel Derivatives [0.5735035463793009]
We present a technique based on the non-equispaced fast Fourier transform (NFFT) with rigorous error analysis.
We show that this approach is also well suited to allow the approximation of the matrix that arises when the kernel is differentiated.
We illustrate the performance of the additive kernel scheme with fast matrix vector products on a number of data sets.
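A minimal illustration of the additive kernel structure whose matrix-vector products the NFFT method accelerates. This naive O(n^2 d) baseline assumes a sum of 1-D Gaussian terms; it is not the paper's NFFT implementation, which approximates each 1-D term in near-linear time.

```python
import numpy as np

def additive_rbf_mvp(X, v, sigma=1.0):
    """Naive matrix-vector product K @ v for an additive RBF kernel:
    k(x, y) = sum_d exp(-(x_d - y_d)^2 / (2 * sigma^2))."""
    n, d = X.shape
    out = np.zeros(n)
    for j in range(d):  # one 1-D Gaussian kernel per coordinate
        diff = X[:, j][:, None] - X[:, j][None, :]
        out += np.exp(-diff**2 / (2.0 * sigma**2)) @ v
    return out
```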
arXiv Detail & Related papers (2024-04-26T11:50:16Z) - An appointment with Reproducing Kernel Hilbert Space generated by Generalized Gaussian RBF as $L^2-$measure [3.9931474959554496]
The Generalized Gaussian Radial Basis Function (RBF) Kernels are the most-often-employed kernels in artificial intelligence and machine learning routines.
This manuscript demonstrates the application of the Generalized Gaussian RBF kernel to these machine learning routines, together with comparisons against other commonly used kernel functions.
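A sketch of one common parameterization of a generalized Gaussian RBF kernel; the exact form and normalization used in the paper may differ, and the names `sigma` and `beta` are illustrative.

```python
import numpy as np

def generalized_gaussian_rbf(x, y, sigma=1.0, beta=2.0):
    """Generalized Gaussian RBF: exp(-(||x - y|| / sigma)^beta).
    beta = 2 recovers a standard Gaussian RBF kernel (up to the
    factor-of-2 convention in the exponent); beta = 1 gives a
    Laplacian-type kernel."""
    r = np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    return np.exp(-((r / sigma) ** beta))
```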
arXiv Detail & Related papers (2023-12-17T12:02:10Z) - Higher-order topological kernels via quantum computation [68.8204255655161]
Topological data analysis (TDA) has emerged as a powerful tool for extracting meaningful insights from complex data.
We propose a quantum approach to defining Betti kernels, which is based on constructing Betti curves with increasing order.
arXiv Detail & Related papers (2023-07-14T14:48:52Z) - On the Sublinear Regret of GP-UCB [58.25014663727544]
We show that the Gaussian Process Upper Confidence Bound (GP-UCB) algorithm enjoys nearly optimal regret rates.
Our improvements rely on a key technical contribution -- regularizing kernel ridge estimators in proportion to the smoothness of the underlying kernel.
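The kernel ridge estimator that the regularization argument concerns can be sketched as follows. The `lam * n` scaling is one common convention and an assumption here; the paper's point is that `lam` should be chosen in proportion to the smoothness of the underlying kernel.

```python
import numpy as np

def kernel_ridge_predict(K_train, y, K_test_train, lam=1e-2):
    """Kernel ridge regression:
    alpha = (K + lam * n * I)^{-1} y,  predictions = K_test_train @ alpha."""
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + lam * n * np.eye(n), y)
    return K_test_train @ alpha
```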
arXiv Detail & Related papers (2023-07-14T13:56:11Z) - Kernelized Cumulants: Beyond Kernel Mean Embeddings [11.448622437140022]
We extend cumulants to reproducing kernel Hilbert spaces (RKHS) using tools from tensor algebras.
We argue that going beyond degree one has several advantages and can be achieved with the same computational complexity and minimal overhead.
arXiv Detail & Related papers (2023-01-29T15:31:06Z) - Learning "best" kernels from data in Gaussian process regression. With application to aerodynamics [0.4588028371034406]
We introduce algorithms to select/design kernels in Gaussian process regression/kriging surrogate modeling techniques.
A first class of algorithms is Kernel Flows, which was introduced in the context of classification in machine learning.
A second class of algorithms is called spectral kernel ridge regression, and aims at selecting a "best" kernel such that the norm of the function to be approximated is minimal.
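The "minimal norm" criterion can be illustrated with the standard identity that the minimum-norm interpolant of data (X, y) in the RKHS of a kernel matrix K has squared norm y^T K^{-1} y. This is a schematic version only; the paper's spectral kernel ridge regression procedure is more involved.

```python
import numpy as np

def rkhs_norm_sq(K, y, jitter=1e-8):
    """Squared RKHS norm of the minimum-norm interpolant of (X, y):
    ||f||_H^2 = y^T K^{-1} y (jitter added for numerical stability)."""
    n = K.shape[0]
    return float(y @ np.linalg.solve(K + jitter * np.eye(n), y))

# Compare candidate bandwidths by the RKHS norm of their interpolants.
X = np.linspace(0.0, 1.0, 8)[:, None]
y = np.sin(2.0 * np.pi * X[:, 0])
norms = {s: rkhs_norm_sq(np.exp(-((X - X.T) ** 2) / (2.0 * s**2)), y)
         for s in (0.1, 0.5, 1.0)}
```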
arXiv Detail & Related papers (2022-06-03T07:50:54Z) - Revisiting Memory Efficient Kernel Approximation: An Indefinite Learning Perspective [0.8594140167290097]
Matrix approximations are a key element in large-scale machine learning approaches.
We extend MEKA to be applicable not only for shift-invariant kernels but also for non-stationary kernels.
We present a Lanczos-based estimation of a spectrum shift to develop a stable positive semi-definite MEKA approximation.
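The spectrum-shift idea can be sketched as follows. Here the smallest eigenvalue is computed with a full eigendecomposition for clarity, whereas the paper estimates it with a cheaper Lanczos iteration.

```python
import numpy as np

def shift_to_psd(K):
    """Shift a symmetric kernel matrix so its smallest eigenvalue is >= 0:
    K + |lambda_min| * I whenever lambda_min < 0. Lanczos-based methods
    estimate lambda_min without a full eigendecomposition."""
    K = 0.5 * (K + K.T)  # enforce exact symmetry
    lam_min = np.linalg.eigvalsh(K)[0]
    if lam_min < 0.0:
        K = K + (-lam_min) * np.eye(K.shape[0])
    return K
```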
arXiv Detail & Related papers (2021-12-18T10:01:34Z) - Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z) - Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes.
We also extend the generalization from the Matérn to the widely-used squared exponential process.
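For reference, the Euclidean Matérn kernel with nu = 3/2, in one common convention; `lengthscale` is a free parameter. As nu grows, the Matérn family approaches the squared exponential kernel mentioned above.

```python
import numpy as np

def matern32(r, lengthscale=1.0):
    """Matérn kernel with nu = 3/2 on Euclidean distance r:
    k(r) = (1 + sqrt(3) * r / l) * exp(-sqrt(3) * r / l)."""
    a = np.sqrt(3.0) * np.asarray(r, dtype=float) / lengthscale
    return (1.0 + a) * np.exp(-a)
```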
arXiv Detail & Related papers (2020-06-17T21:05:42Z) - Exact representations of many body interactions with RBM neural networks [77.34726150561087]
We exploit the representation power of RBMs to provide an exact decomposition of many-body contact interactions into one-body operators.
This construction generalizes the well-known Hirsch transform used for the Hubbard model to more complicated theories such as Pionless EFT in nuclear physics.
arXiv Detail & Related papers (2020-05-07T15:59:29Z)
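The Hirsch transform mentioned above admits a short numerical check of the standard discrete Hubbard-Stratonovich identity for the on-site interaction; the value of dtau*U below is an arbitrary illustrative choice.

```python
import numpy as np

# Hirsch's discrete Hubbard-Stratonovich transform for the on-site term:
#   exp(-dtau*U*(n_up*n_dn - (n_up + n_dn)/2))
#     = (1/2) * sum_{s = +-1} exp(lam * s * (n_up - n_dn)),
# with cosh(lam) = exp(dtau*U/2). Check it over all occupations in {0, 1}:
dtau_U = 0.7
lam = np.arccosh(np.exp(dtau_U / 2.0))
for n_up in (0, 1):
    for n_dn in (0, 1):
        lhs = np.exp(-dtau_U * (n_up * n_dn - 0.5 * (n_up + n_dn)))
        rhs = 0.5 * sum(np.exp(lam * s * (n_up - n_dn)) for s in (+1, -1))
        assert abs(lhs - rhs) < 1e-12
```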
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.