On the stability test for reproducing kernel Hilbert spaces
- URL: http://arxiv.org/abs/2305.02213v1
- Date: Mon, 1 May 2023 14:40:23 GMT
- Title: On the stability test for reproducing kernel Hilbert spaces
- Authors: Mauro Bisiacco and Gianluigi Pillonetto
- Abstract summary: Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces where all the evaluation functionals are linear and bounded.
We show that the stability test can be reduced to the study of the kernel operator over test functions which assume only the values $\pm 1$.
- Score: 1.52292571922932
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces where
all the evaluation functionals are linear and bounded. They are in one-to-one
correspondence with positive definite maps called kernels. Stable RKHSs enjoy
the additional property of containing only absolutely integrable functions.
Necessary and sufficient conditions for RKHS stability are known in the
literature: the integral operator induced by the kernel must be bounded as a map
from $\mathcal{L}_{\infty}$, the space of essentially bounded (test) functions, to
$\mathcal{L}_1$, the space of absolutely integrable functions.
Considering Mercer (continuous) kernels in continuous-time and the entire
discrete-time class, we show that the stability test can be reduced to the
study of the kernel operator over test functions which assume (almost
everywhere) only the values $\pm 1$. They represent the same functions needed
to investigate stability of any single element in the RKHS. In this way, the
RKHS stability test becomes an elegant generalization of a straightforward
result concerning Bounded-Input Bounded-Output (BIBO) stability of a single
linear time-invariant system.
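The discrete-time case of this reduction can be probed numerically. Below is a minimal sketch (not code from the paper), assuming an illustrative rank-one kernel $K(t,s) = h(t)h(s)$ built from a BIBO-stable impulse response and a finite truncation horizon; all names and parameter values are hypothetical choices made for the example.

```python
import numpy as np

# Illustrative assumptions (not from the paper): a discrete-time, rank-one kernel
# K(t, s) = h(t) h(s) built from a BIBO-stable impulse response h(t) = a^t,
# truncated at a finite horizon N so the infinite sums can be approximated.
N = 200
a = 0.8
t = np.arange(N)
h = a ** t
K = np.outer(h, h)

def l1_norm_of_Ku(K, u):
    """Approximate ||K u||_1 = sum_t |sum_s K(t, s) u(s)| for a test function u."""
    return np.abs(K @ u).sum()

# The stability test asks whether ||K u||_1 stays bounded over all test functions
# with |u| <= 1; the paper's result says it suffices to probe u taking only the
# values +/-1.  For this rank-one kernel the worst sign pattern is u = sign(h),
# which reproduces the classical BIBO quantity (sum_t |h(t)|)^2.
u_worst = np.sign(h)
print("||K u_worst||_1       =", l1_norm_of_Ku(K, u_worst))
print("(sum_t |h(t)|)^2      =", np.abs(h).sum() ** 2)

# Random +/-1 test functions never exceed the worst-case value.
rng = np.random.default_rng(0)
rand_vals = [l1_norm_of_Ku(K, rng.choice([-1.0, 1.0], size=N)) for _ in range(1000)]
print("max over random signs =", max(rand_vals))
```

For this rank-one kernel the worst-case sign pattern is available in closed form, which recovers the classical BIBO criterion; for general kernels it is not, and the point of the result above is that the $\mathcal{L}_{\infty} \to \mathcal{L}_1$ boundedness test still only needs to be checked over $\pm 1$-valued test functions.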
Related papers
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Hilbert's projective metric for functions of bounded growth and exponential convergence of Sinkhorn's algorithm [1.6317061277457001]
We study versions of Hilbert's projective metric for spaces of integrable functions of bounded growth.
We show that kernel integral operators are contractions with respect to suitable specifications of such metrics.
As an application to entropic optimal transport, we show exponential convergence of Sinkhorn's algorithm in settings where the marginal distributions have sufficiently light tails.
arXiv Detail & Related papers (2023-11-07T14:53:23Z) - Lipschitz and Hölder Continuity in Reproducing Kernel Hilbert Spaces [1.3053649021965603]
Lipschitz and Hölder continuity are important regularity properties, with many applications in machine learning, statistics, numerical analysis and pure mathematics.
We provide several sufficient conditions as well as an in-depth investigation of reproducing kernels inducing prescribed Lipschitz or Hölder continuity (a small numerical sketch of the generic reproducing-property bound behind such results appears after this list).
arXiv Detail & Related papers (2023-10-27T11:56:43Z) - Absolute integrability of Mercer kernels is only sufficient for RKHS stability [1.52292571922932]
Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces in one-to-one correspondence with positive definite maps called kernels.
Stable RKHSs contain only absolutely integrable impulse responses over the positive real line.
arXiv Detail & Related papers (2023-05-02T13:35:48Z) - Recursive Estimation of Conditional Kernel Mean Embeddings [0.0]
Kernel mean embeddings map probability distributions to elements of a reproducing kernel Hilbert space (RKHS).
We present a new algorithm to estimate the conditional kernel mean map in a Hilbert space valued $L^2$ space, that is, in a Bochner space.
arXiv Detail & Related papers (2023-02-12T16:55:58Z) - Experimental Design for Linear Functionals in Reproducing Kernel Hilbert Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z) - Open Problem: Tight Online Confidence Intervals for RKHS Elements [57.363123214464764]
We formalize the question of online confidence intervals in the RKHS setting and overview the existing results.
It is unclear whether the suboptimal regret bound is a fundamental shortcoming of these algorithms or an artifact of the proof.
arXiv Detail & Related papers (2021-10-28T22:36:20Z) - Optimal policy evaluation using kernel-based temporal difference methods [78.83926562536791]
We use reproducing kernel Hilbert spaces for estimating the value function of an infinite-horizon discounted Markov reward process (MRP).
We derive a non-asymptotic upper bound on the error with explicit dependence on the eigenvalues of the associated kernel operator.
We prove minimax lower bounds over sub-classes of MRPs.
arXiv Detail & Related papers (2021-09-24T14:48:20Z) - Metrizing Weak Convergence with Maximum Mean Discrepancies [88.54422104669078]
This paper characterizes the maximum mean discrepancies (MMD) that metrize the weak convergence of probability measures for a wide class of kernels.
We prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k metrizes the weak convergence of probability measures if and only if k is continuous.
arXiv Detail & Related papers (2020-06-16T15:49:33Z) - Fine-Grained Analysis of Stability and Generalization for Stochastic Gradient Descent [55.85456985750134]
We introduce a new stability measure called on-average model stability, for which we develop novel bounds controlled by the risks of SGD iterates.
This yields generalization bounds depending on the behavior of the best model, and leads to the first known fast bounds in the low-noise setting.
To the best of our knowledge, this gives the first known stability and generalization bounds for SGD even with non-differentiable loss functions.
arXiv Detail & Related papers (2020-06-15T06:30:19Z) - Mathematical foundations of stable RKHSs [1.52292571922932]
Reproducing kernel Hilbert spaces (RKHSs) are key spaces for machine learning that are becoming popular also for linear system identification.
In this paper we provide new structural properties of stable RKHSs.
arXiv Detail & Related papers (2020-05-06T17:25:23Z)
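As referenced in the Lipschitz and Hölder continuity entry above, the generic bound behind such regularity results follows from the reproducing property: $|f(x)-f(y)| \le \|f\|_{\mathcal{H}} \, \|k(x,\cdot)-k(y,\cdot)\|_{\mathcal{H}}$. The snippet below is a hedged numerical check of this standard inequality for the Gaussian kernel, not the specific conditions derived in that paper; the kernel width, expansion centers, and coefficients are illustrative assumptions.

```python
import numpy as np

sigma = 1.0  # assumed kernel width

def k(x, y):
    """Gaussian kernel on the real line: k(x, y) = exp(-(x - y)^2 / (2 sigma^2))."""
    return np.exp(-(x - y) ** 2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(1)
centers = rng.normal(size=10)   # expansion points x_i (illustrative)
alpha = rng.normal(size=10)     # coefficients of f = sum_i alpha_i k(x_i, .)

# ||f||_H^2 = alpha^T K alpha, with K the Gram matrix of the centers.
K = k(centers[:, None], centers[None, :])
f_norm = np.sqrt(alpha @ K @ alpha)

def f(x):
    """Evaluate f at a scalar point x via the kernel expansion."""
    return np.sum(alpha * k(centers, x))

# Reproducing-property bound: |f(x) - f(y)| <= ||f||_H * sqrt(2 - 2 k(x, y)),
# since ||k(x,.) - k(y,.)||_H^2 = k(x,x) - 2 k(x,y) + k(y,y) = 2 - 2 k(x,y) here.
for _ in range(5):
    x, y = rng.normal(size=2)
    lhs = abs(f(x) - f(y))
    rhs = f_norm * np.sqrt(2.0 - 2.0 * k(x, y))
    print(f"|f(x) - f(y)| = {lhs:.4f} <= bound = {rhs:.4f}")
```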