Absolute integrability of Mercer kernels is only sufficient for RKHS stability
- URL: http://arxiv.org/abs/2305.01411v1
- Date: Tue, 2 May 2023 13:35:48 GMT
- Title: Absolute integrability of Mercer kernels is only sufficient for RKHS stability
- Authors: Mauro Bisiacco and Gianluigi Pillonetto
- Abstract summary: Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces in one-to-one correspondence with positive definite maps called kernels.
Stable RKHSs contain only absolutely integrable impulse responses over the positive real line.
- Score: 1.52292571922932
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces in
one-to-one correspondence with positive definite maps called kernels. They are
widely employed in machine learning to reconstruct unknown functions from
sparse and noisy data. In the last two decades, a subclass known as stable
RKHSs has also been introduced in the setting of linear system identification.
Stable RKHSs contain only absolutely integrable impulse responses over the
positive real line. Hence, they can be adopted as hypothesis spaces to estimate
linear, time-invariant and BIBO stable dynamic systems from input-output data.
Necessary and sufficient conditions for RKHS stability are available in the
literature and it is known that kernel absolute integrability implies
stability. Working in discrete time, we proved in a recent work that this
latter condition is only sufficient, i.e., not necessary. Working in continuous
time, the purpose of this note is to prove that the same result also holds for
Mercer kernels.
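For reference, the conditions at play can be summarized as follows (a brief recap in standard stable-RKHS notation, paraphrased from this literature rather than quoted verbatim from the note):
- Stability: the RKHS $\mathcal{H}_K$ induced by a kernel $K$ over the positive real line is called stable when $\mathcal{H}_K \subset L^1$, i.e. every impulse response $f \in \mathcal{H}_K$ satisfies $\int_0^{+\infty} |f(t)|\,dt < \infty$.
- Sufficient condition (the one proved here to be non-necessary): $\int_0^{+\infty}\int_0^{+\infty} |K(s,t)|\,ds\,dt < \infty$ implies that $\mathcal{H}_K$ is stable.
- Necessary and sufficient condition: $\mathcal{H}_K$ is stable if and only if $\int_0^{+\infty} \big|\int_0^{+\infty} K(s,t)\,u(t)\,dt\big|\,ds < \infty$ for every $u \in L^\infty$; the stability-test paper listed below shows that it suffices to check test functions $u$ assuming only the values $\pm 1$.
As a minimal numerical illustration of the sufficient condition (the kernel choice, the parameter beta and the code are ours, not taken from the note), the sketch below checks absolute integrability of the first-order stable spline (TC) kernel $K(s,t) = e^{-\beta \max(s,t)}$, whose double integral equals $2/\beta^2$ in closed form:

    import numpy as np
    from scipy.integrate import dblquad

    beta = 1.5

    def kernel(s, t):
        # First-order stable spline (TC) kernel on the positive real line.
        return np.exp(-beta * max(s, t))

    # dblquad integrates func(y, x) over x in [a, b] and y in [gfun(x), hfun(x)];
    # the infinite limits are handled by the underlying quad routine.
    value, abs_err = dblquad(lambda s, t: abs(kernel(s, t)), 0, np.inf, 0, np.inf)

    print(f"numerical double integral : {value:.6f} (estimated error {abs_err:.1e})")
    print(f"closed-form value 2/beta^2: {2 / beta ** 2:.6f}")

A finite value certifies absolute integrability and hence, by the sufficient condition above, stability of the induced RKHS; the point of the note is that the converse fails, since a Mercer kernel can induce a stable RKHS even though this double integral diverges.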
Related papers
- Convergence Conditions of Online Regularized Statistical Learning in Reproducing Kernel Hilbert Space With Non-Stationary Data [4.5692679976952215]
We study the convergence of regularized learning algorithms in the reproducing kernel Hilbert space (RKHS) with dependent and non-stationary online data streams.
For independent and non-identically distributed data streams, the algorithm achieves mean square consistency.
arXiv Detail & Related papers (2024-04-04T05:35:59Z) - Lipschitz and Hölder Continuity in Reproducing Kernel Hilbert Spaces [1.3053649021965603]
Lipschitz and Hölder continuity are important regularity properties, with many applications in machine learning, statistics, numerical analysis and pure mathematics.
We provide several sufficient conditions as well as an in-depth investigation of reproducing kernels inducing prescribed Lipschitz or Hölder continuity.
arXiv Detail & Related papers (2023-10-27T11:56:43Z) - On the stability test for reproducing kernel Hilbert spaces [1.52292571922932]
Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces where all the evaluation functionals are linear and bounded.
We show that the stability test can be reduced to the study of the kernel operator over test functions which assume only the values $\pm 1$.
arXiv Detail & Related papers (2023-05-01T14:40:23Z) - KCRL: Krasovskii-Constrained Reinforcement Learning with Guaranteed
Stability in Nonlinear Dynamical Systems [66.9461097311667]
We propose a model-based reinforcement learning framework with formal stability guarantees.
The proposed method learns the system dynamics up to a confidence interval using a feature representation.
We show that KCRL is guaranteed to learn a stabilizing policy in a finite number of interactions with the underlying unknown system.
arXiv Detail & Related papers (2022-06-03T17:27:04Z) - Experimental Design for Linear Functionals in Reproducing Kernel Hilbert
Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z) - Meta-Learning Hypothesis Spaces for Sequential Decision-making [79.73213540203389]
We propose to meta-learn a kernel from offline data (Meta-KeL).
Under mild conditions, we guarantee that our estimated RKHS yields valid confidence sets.
We also empirically evaluate the effectiveness of our approach on a Bayesian optimization task.
arXiv Detail & Related papers (2022-02-01T17:46:51Z) - Open Problem: Tight Online Confidence Intervals for RKHS Elements [57.363123214464764]
We formalize the question of online confidence intervals in the RKHS setting and overview the existing results.
It is unclear whether the suboptimal regret bound is a fundamental shortcoming of these algorithms or an artifact of the proof.
arXiv Detail & Related papers (2021-10-28T22:36:20Z) - Strong Uniform Consistency with Rates for Kernel Density Estimators with
General Kernels on Manifolds [11.927892660941643]
We show how to handle kernel density estimation with intricate kernels not designed by the user.
The isotropic kernels considered in this paper are different from the kernels in the Vapnik-Chervonenkis class that are frequently considered in the statistics community.
arXiv Detail & Related papers (2020-07-13T14:36:06Z) - Metrizing Weak Convergence with Maximum Mean Discrepancies [88.54422104669078]
This paper characterizes the maximum mean discrepancies (MMD) that metrize the weak convergence of probability measures for a wide class of kernels.
We prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k metrizes the weak convergence of probability measures if and only if k is continuous and integrally strictly positive definite.
arXiv Detail & Related papers (2020-06-16T15:49:33Z) - Fine-Grained Analysis of Stability and Generalization for Stochastic
Gradient Descent [55.85456985750134]
We introduce a new stability measure called on-average model stability, for which we develop novel bounds controlled by the risks of SGD iterates.
This yields generalization bounds depending on the behavior of the best model, and leads to the first known fast bounds in the low-noise setting.
To the best of our knowledge, this gives the first known stability and generalization bounds for SGD even with non-differentiable loss functions.
arXiv Detail & Related papers (2020-06-15T06:30:19Z) - Mathematical foundations of stable RKHSs [1.52292571922932]
Reproducing kernel Hilbert spaces (RKHSs) are key spaces for machine learning that are becoming popular also for linear system identification.
In this paper we provide new structural properties of stable RKHSs.
arXiv Detail & Related papers (2020-05-06T17:25:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.