Unified Native Spaces in Kernel Methods
- URL: http://arxiv.org/abs/2501.01825v1
- Date: Fri, 03 Jan 2025 14:17:41 GMT
- Title: Unified Native Spaces in Kernel Methods
- Authors: Xavier Emery, Emilio Porcu, Moreno Bevilacqua
- Abstract summary: This paper unifies a wealth of well-known kernels into a single class that encompasses them as special cases.
As a by-product, we infer the Sobolev spaces that are associated with existing classes of kernels.
- Abstract: There exists a plethora of parametric models for positive definite kernels, and their use is ubiquitous in disciplines as diverse as statistics, machine learning, numerical analysis, and approximation theory. Usually, the kernel parameters index certain features of an associated process. Amongst those features, smoothness (in the sense of Sobolev spaces, mean square differentiability, and fractal dimensions), compact or global supports, and negative dependencies (hole effects) are of interest to several theoretical and applied disciplines. This paper unifies a wealth of well-known kernels into a single parametric class that encompasses them as special cases, attained either by exact parameterization or through parametric asymptotics. We furthermore characterize the Sobolev space that is norm equivalent to the RKHS associated with the new kernel. As a by-product, we infer the Sobolev spaces that are associated with existing classes of kernels. We illustrate the main properties of the new class, show how this class can switch from compact to global supports, and provide special cases for which the kernel attains negative values over nontrivial intervals. Hence, the proposed kernel is the reproducing kernel of a very rich Hilbert space that contains many special cases, including the celebrated Matérn and Wendland kernels, as well as their aliases with hole effects.
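
For concreteness, here is a minimal sketch of two special cases named in the abstract, under standard textbook parameterizations (the paper's unifying class itself is not reproduced here): the globally supported Matérn kernel, whose parameter nu governs smoothness, and the compactly supported Wendland kernel phi_{3,1}.

```python
import numpy as np
from scipy.special import gamma, kv

def matern(r, nu=1.5, ell=1.0, sigma2=1.0):
    """Matern kernel: nu controls mean-square smoothness; global support."""
    scaled = np.sqrt(2.0 * nu) * np.asarray(r, dtype=float) / ell
    with np.errstate(invalid="ignore", divide="ignore"):
        k = sigma2 * (2.0 ** (1.0 - nu) / gamma(nu)) * scaled**nu * kv(nu, scaled)
    return np.where(scaled == 0.0, sigma2, k)  # kv is singular at 0; the limit is sigma2

def wendland_31(r, ell=1.0, sigma2=1.0):
    """Wendland phi_{3,1} kernel: C^2 smooth, support [0, ell], valid on R^d for d <= 3."""
    t = np.asarray(r, dtype=float) / ell
    return sigma2 * np.maximum(1.0 - t, 0.0) ** 4 * (4.0 * t + 1.0)

r = np.linspace(0.0, 2.0, 5)
print(matern(r, nu=0.5))  # nu = 1/2 recovers the exponential kernel exp(-r / ell)
print(wendland_31(r))     # identically zero once r >= ell
```

As nu grows, the Matérn kernel approaches the Gaussian kernel, a standard instance of the parametric asymptotics the abstract alludes to.
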
Related papers
- On the Approximation of Kernel functions
The paper addresses approximations of the kernel itself.
For the Gauss kernel on the unit cube, the paper establishes an upper bound on the associated eigenfunctions.
This improvement supports low-rank approximation methods such as the Nyström method.
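
To unpack the Nyström reference, here is a hedged sketch of the standard landmark-based low-rank approximation K ~ K_nm K_mm^+ K_mn; the Gaussian kernel and uniform landmark sampling are illustrative choices, not taken from the paper.

```python
import numpy as np

def gauss_kernel(X, Y, gamma_=1.0):
    """Gaussian (RBF) kernel matrix k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_ * d2)

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))                     # points in the unit cube
idx = rng.choice(len(X), size=50, replace=False)   # m = 50 landmarks

K_nm = gauss_kernel(X, X[idx])
K_mm = gauss_kernel(X[idx], X[idx])
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T    # rank-m Nystrom approximation

K = gauss_kernel(X, X)
print(np.linalg.norm(K - K_approx) / np.linalg.norm(K))  # relative error
```
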
arXiv Detail & Related papers (2024-03-11T13:50:07Z)
- Kernelized Cumulants: Beyond Kernel Mean Embeddings
We extend cumulants to reproducing kernel Hilbert spaces (RKHS) using tools from tensor algebras.
We argue that going beyond degree one (the kernel mean embedding) has several advantages and can be achieved with the same computational complexity and minimal overhead.
arXiv Detail & Related papers (2023-01-29T15:31:06Z)
- Interpolation with the polynomial kernels
Polynomial kernels are widely used in machine learning and are one of the default choices for building kernel-based regression models.
However, they are rarely considered in numerical analysis because they lack strict positive definiteness.
This paper is devoted to establishing initial results for the study of these kernels and their related algorithms.
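
A brief sketch of why the lack of strict positive definiteness matters in practice, using the standard polynomial kernel k(x, y) = (1 + x.y)^degree: the Gram matrix can be singular, which forces a least-squares fallback. This is illustrative only, not the paper's algorithm.

```python
import numpy as np

def poly_kernel(X, Y, degree=3):
    """Polynomial kernel k(x, y) = (1 + <x, y>)^degree; PSD but not strictly PD."""
    return (1.0 + X @ Y.T) ** degree

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(20, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2     # target values to interpolate

K = poly_kernel(X, X)
# The RKHS of this kernel is finite-dimensional (here dim 10 < 20 points),
# so K is singular: use least squares instead of a plain solve.
alpha, *_ = np.linalg.lstsq(K, y, rcond=None)

X_test = rng.uniform(-1, 1, size=(5, 2))
y_pred = poly_kernel(X_test, X) @ alpha
print(y_pred)
```
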
arXiv Detail & Related papers (2022-12-15T08:30:23Z)
- Local Sample-weighted Multiple Kernel Clustering with Consensus Discriminative Graph
Multiple kernel clustering (MKC) aims to achieve optimal information fusion from a set of base kernels.
This paper proposes a novel local sample-weighted multiple kernel clustering model.
Experimental results demonstrate that our LSWMKC possesses better local manifold representation and outperforms existing kernel-based and graph-based clustering algorithms.
arXiv Detail & Related papers (2022-07-05T05:00:38Z)
- Geometry-aware Bayesian Optimization in Robotics using Riemannian Matérn Kernels
We show how to implement geometry-aware kernels for Bayesian optimization.
This technique can be used for control parameter tuning, parametric policy adaptation, and structure design in robotics.
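
For intuition, here is a minimal sketch of one well-known geometry-aware construction this line of work builds on (a truncated Laplacian spectral sum in the style of Borovitskiy et al.; the normalization and truncation level are ad hoc assumptions): a Matérn-type kernel on the circle S^1, whose Laplacian eigenpairs are cos(n theta), sin(n theta) with eigenvalue n^2.

```python
import numpy as np

def matern_circle(theta1, theta2, nu=1.5, kappa=1.0, n_max=50):
    """Matern-type kernel on S^1 via a truncated Laplacian spectral sum.
    Spectral weights follow the Matern form (2 nu / kappa^2 + lambda)^-(nu + d/2)
    with d = 1 and lambda = n^2."""
    n = np.arange(n_max + 1)
    weights = (2.0 * nu / kappa ** 2 + n ** 2) ** (-(nu + 0.5))
    diff = np.subtract.outer(np.atleast_1d(theta1), np.atleast_1d(theta2))
    k = np.tensordot(weights, np.cos(np.multiply.outer(n, diff)), axes=1)
    return k / weights.sum()    # normalize so that k = 1 at zero geodesic distance

theta = np.linspace(0.0, 2.0 * np.pi, 5)
print(matern_circle(theta, 0.0).ravel())  # decays with geodesic distance on the circle
```
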
arXiv Detail & Related papers (2021-11-02T09:47:22Z)
- Hida-Matérn Kernel
We present the class of Hida-Matérn kernels, which is the canonical family of covariance functions over the entire space of stationary Gauss-Markov processes.
We show how to represent such processes as state space models using only the kernel and its derivatives.
We also show how exploiting special properties of the state space representation enables improved numerical stability in addition to further reductions of computational complexity.
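
To illustrate the state-space idea with the simplest relative (a hedged sketch of the classical Matérn-3/2 state-space form, not the Hida-Matérn class itself): the process and its derivative form a two-dimensional Markov state, which is what enables O(n) Kalman-filter inference.

```python
import numpy as np
from scipy.linalg import expm

def matern32_ssm(ell=1.0, sigma2=1.0, dt=0.1):
    """State-space form of the Matern-3/2 GP: state s = (f, f')."""
    lam = np.sqrt(3.0) / ell
    F = np.array([[0.0, 1.0],
                  [-lam ** 2, -2.0 * lam]])       # drift of the SDE ds = F s dt + L dW
    P_inf = np.diag([sigma2, lam ** 2 * sigma2])  # stationary state covariance
    A = expm(F * dt)                              # discrete-time transition matrix
    Q = P_inf - A @ P_inf @ A.T                   # matched discrete-time process noise
    return A, Q, P_inf

A, Q, P_inf = matern32_ssm()
rng = np.random.default_rng(2)
s = rng.multivariate_normal(np.zeros(2), P_inf)   # start from stationarity
path = []
for _ in range(100):
    s = A @ s + rng.multivariate_normal(np.zeros(2), Q)
    path.append(s[0])                             # f(t_k) is the first state component
print(np.var(path))  # roughly sigma2 = 1, the stationary marginal variance
```
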
arXiv Detail & Related papers (2021-07-15T03:25:10Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings
Kernel mean embeddings represent probability measures by their infinite-dimensional features in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
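
A hedged sketch of the object in question: a kernel sum-of-squares model p(x) = Phi(x)^T A Phi(x), with Phi(x) = (k(x, x_1), ..., k(x, x_n)), which is nonnegative whenever A is positive semi-definite. The normalization into a probability density and the fitting of A are omitted; all names here are illustrative.

```python
import numpy as np

def rbf_features(x, centers, gamma_=2.0):
    """Phi(x) = (k(x, x_1), ..., k(x, x_n)) for a Gaussian kernel."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_ * d2)

rng = np.random.default_rng(3)
centers = rng.normal(size=(10, 1))         # the anchor points x_i of the model
B = rng.normal(size=(10, 10))
A = B @ B.T                                # any PSD matrix A gives a nonnegative model
x = np.linspace(-3, 3, 7)[:, None]

Phi = rbf_features(x, centers)             # shape (7, 10)
p = np.einsum('ij,jk,ik->i', Phi, A, Phi)  # p(x) = Phi(x)^T A Phi(x) >= 0
print(p)  # all entries nonnegative; a normalizing constant would make p a density
```
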
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- Taming Nonconvexity in Kernel Feature Selection---Favorable Properties of the Laplace Kernel
A challenge is to establish a suitable objective function for kernel-based feature selection.
The available gradient-based algorithms are only able to guarantee convergence to local minima.
arXiv Detail & Related papers (2021-06-17T11:05:48Z)
- Reproducing Kernel Hilbert Space, Mercer's Theorem, Eigenfunctions, Nyström Method, and Use of Kernels in Machine Learning: Tutorial and Survey
We start with reviewing the history of kernels in functional analysis and machine learning.
We introduce types of use of kernels in machine learning including kernel methods, kernel learning by semi-definite programming, Hilbert-Schmidt independence criterion, maximum mean discrepancy, kernel mean embedding, and kernel dimensionality reduction.
This paper can be useful for various fields of science including machine learning, dimensionality reduction, functional analysis in mathematics, and mathematical physics in quantum mechanics.
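
Among the tools listed above, the Hilbert-Schmidt independence criterion admits a particularly compact estimator; a hedged sketch of the standard biased estimator HSIC = tr(K H L H) / n^2 with Gaussian kernels follows.

```python
import numpy as np

def gram(X, gamma_=1.0):
    """Gaussian kernel Gram matrix."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_ * d2)

def hsic(X, Y, gamma_=1.0):
    """Biased HSIC estimator: (1/n^2) tr(K H L H), H the centering matrix."""
    n = len(X)
    K, L = gram(X, gamma_), gram(Y, gamma_)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 1))
print(hsic(X, X ** 2))                      # dependent pair: noticeably positive
print(hsic(X, rng.normal(size=(200, 1))))   # independent pair: near zero
```
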
arXiv Detail & Related papers (2021-06-15T21:29:12Z)
- The role of feature space in atomistic learning
Physically inspired descriptors play a key role in the application of machine-learning techniques to atomistic simulations.
We introduce a framework to compare different sets of descriptors, and different ways of transforming them by means of metrics and kernels.
We compare representations built in terms of n-body correlations of the atom density, quantitatively assessing the information loss associated with the use of low-order features.
arXiv Detail & Related papers (2020-09-06T14:12:09Z)
- Learning Deep Kernels for Non-Parametric Two-Sample Tests
We propose a class of kernel-based two-sample tests, which aim to determine whether two sets of samples are drawn from the same distribution.
Our tests are constructed from kernels parameterized by deep neural nets, trained to maximize test power.
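
These tests build on the maximum mean discrepancy; here is a hedged sketch of the standard unbiased MMD^2 estimator, with the paper's trained deep kernel replaced by a fixed Gaussian kernel for illustration.

```python
import numpy as np

def gauss_k(X, Y, gamma_=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_ * d2)

def mmd2_unbiased(X, Y, gamma_=0.5):
    """Unbiased estimate of MMD^2 between samples X ~ P and Y ~ Q."""
    m, n = len(X), len(Y)
    Kxx, Kyy, Kxy = gauss_k(X, X, gamma_), gauss_k(Y, Y, gamma_), gauss_k(X, Y, gamma_)
    # Drop the diagonal terms of Kxx and Kyy for unbiasedness.
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * Kxy.mean()

rng = np.random.default_rng(5)
X = rng.normal(0.0, 1.0, size=(300, 2))
print(mmd2_unbiased(X, rng.normal(0.0, 1.0, size=(300, 2))))  # same dist: near 0
print(mmd2_unbiased(X, rng.normal(0.5, 1.0, size=(300, 2))))  # shifted: positive
```
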
arXiv Detail & Related papers (2020-02-21T03:54:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and accepts no responsibility for any consequences of its use.