Lipschitz and Hölder Continuity in Reproducing Kernel Hilbert Spaces
- URL: http://arxiv.org/abs/2310.18078v1
- Date: Fri, 27 Oct 2023 11:56:43 GMT
- Title: Lipschitz and Hölder Continuity in Reproducing Kernel Hilbert Spaces
- Authors: Christian Fiedler
- Abstract summary: Lipschitz and Hölder continuity are important regularity properties, with many applications in machine learning, statistics, numerical analysis and pure mathematics.
We provide several sufficient conditions as well as an in-depth investigation of reproducing kernels inducing prescribed Lipschitz or Hölder continuity.
- Score: 1.3053649021965603
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Reproducing kernel Hilbert spaces (RKHSs) are very important function spaces,
playing an important role in machine learning, statistics, numerical analysis
and pure mathematics. Since Lipschitz and Hölder continuity are important
regularity properties, with many applications in interpolation, approximation
and optimization problems, in this work we investigate these continuity notions
in RKHSs. We provide several sufficient conditions as well as an in-depth
investigation of reproducing kernels inducing prescribed Lipschitz or Hölder
continuity. Apart from new results, we also collect related known results from
the literature, making the present work also a convenient reference on this
topic.
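The mechanism behind such results can be illustrated with the standard RKHS bound $|f(x)-f(y)| \le \|f\|_{\mathcal{H}} \, d_k(x,y)$, where $d_k(x,y)^2 = k(x,x) - 2k(x,y) + k(y,y)$ is the kernel metric: regularity of the kernel transfers to every function in the RKHS. Below is a minimal numerical check of this bound (not code from the paper), using a Gaussian kernel and an arbitrary finite kernel expansion; the kernel, length scale, and sample sizes are illustrative choices.

```python
# Numerical check of the standard RKHS bound
#   |f(x) - f(y)| <= ||f||_H * d_k(x, y),
#   d_k(x, y)^2 = k(x,x) - 2 k(x,y) + k(y,y),
# which is how kernel regularity induces Lipschitz/Hoelder regularity
# of every f in the RKHS.
import numpy as np

rng = np.random.default_rng(0)

def k(x, y, ell=0.7):
    """Gaussian (RBF) kernel; ell is an illustrative length scale."""
    return np.exp(-(x - y) ** 2 / (2 * ell ** 2))

# Build f = sum_i alpha_i k(x_i, .), a generic element of the RKHS.
centers = rng.normal(size=8)
alpha = rng.normal(size=8)
K = k(centers[:, None], centers[None, :])
f_norm = np.sqrt(alpha @ K @ alpha)      # ||f||_H^2 = alpha^T K alpha

def f(x):
    return alpha @ k(centers, x)

# The bound must hold for every pair of inputs.
for _ in range(1000):
    x, y = rng.normal(size=2)
    d_k = np.sqrt(max(k(x, x) - 2 * k(x, y) + k(y, y), 0.0))
    assert abs(f(x) - f(y)) <= f_norm * d_k + 1e-9
print("bound verified on 1000 random pairs")
```

The bound follows from Cauchy-Schwarz applied to the reproducing property, $f(x) - f(y) = \langle f, k(x,\cdot) - k(y,\cdot)\rangle$.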
Related papers
- On the Sublinear Regret of GP-UCB [58.25014663727544]
We show that the Gaussian Process Upper Confidence Bound (GP-UCB) algorithm enjoys nearly optimal regret rates.
Our improvements rely on a key technical contribution -- regularizing kernel ridge estimators in proportion to the smoothness of the underlying kernel.
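The kernel ridge estimator the summary refers to, in its plain (non-adaptive) form, is $\hat f(x) = k(x, X)(K + n\lambda I)^{-1} y$; the paper's contribution concerns tuning $\lambda$ to the kernel's smoothness. A minimal sketch of the plain estimator follows; the Gaussian kernel, length scale, and regularization level are illustrative choices, not the paper's.

```python
# Minimal kernel ridge regression sketch (plain KRR, not the paper's
# smoothness-proportional variant):
#   f_hat(x) = k(x, X) (K + n * lam * I)^{-1} y
import numpy as np

rng = np.random.default_rng(1)

def rbf(a, b, ell=0.2):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

# Noisy samples of a smooth target function.
X = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=40)

lam = 1e-3                               # illustrative regularization level
K = rbf(X, X)
coef = np.linalg.solve(K + len(X) * lam * np.eye(len(X)), y)

def f_hat(x):
    return rbf(np.atleast_1d(x), X) @ coef

# The regularized fit should track the noiseless target up to the noise level.
err = np.max(np.abs(f_hat(X) - np.sin(2 * np.pi * X)))
print(f"max error on training grid: {err:.3f}")
```

Larger $\lambda$ shrinks the estimator toward zero (more bias, less variance); the regret analysis in the paper hinges on how this trade-off is balanced against the kernel's smoothness.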
arXiv Detail & Related papers (2023-07-14T13:56:11Z) - Absolute integrability of Mercer kernels is only sufficient for RKHS
stability [1.52292571922932]
Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces in one-to-one correspondence with positive definite maps called kernels.
Stable RKHSs contain only absolutely integrable impulse responses over the positive real line.
arXiv Detail & Related papers (2023-05-02T13:35:48Z) - On the stability test for reproducing kernel Hilbert spaces [1.52292571922932]
Reproducing kernel Hilbert spaces (RKHSs) are special Hilbert spaces where all the evaluation functionals are linear and bounded.
We show that the stability test can be reduced to the study of the kernel operator over test functions which assume only the values $\pm 1$.
arXiv Detail & Related papers (2023-05-01T14:40:23Z) - Kernelized Cumulants: Beyond Kernel Mean Embeddings [11.448622437140022]
We extend cumulants to reproducing kernel Hilbert spaces (RKHS) using tools from tensor algebras.
We argue that going beyond degree one has several advantages and can be achieved with the same computational complexity and minimal overhead.
arXiv Detail & Related papers (2023-01-29T15:31:06Z) - Continuous percolation in a Hilbert space for a large system of qubits [58.720142291102135]
The percolation transition is defined through the appearance of the infinite cluster.
We show that the exponentially increasing dimensionality of the Hilbert space makes its covering by finite-size hyperspheres inefficient.
Our approach to the percolation transition in compact metric spaces may prove useful for its rigorous treatment in other contexts.
arXiv Detail & Related papers (2022-10-15T13:53:21Z) - Experimental Design for Linear Functionals in Reproducing Kernel Hilbert
Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z) - Meta-Learning Hypothesis Spaces for Sequential Decision-making [79.73213540203389]
We propose to meta-learn a kernel from offline data (Meta-KeL)
Under mild conditions, we guarantee that our estimated RKHS yields valid confidence sets.
We also empirically evaluate the effectiveness of our approach on a Bayesian optimization task.
arXiv Detail & Related papers (2022-02-01T17:46:51Z) - Open Problem: Tight Online Confidence Intervals for RKHS Elements [57.363123214464764]
We formalize the question of online confidence intervals in the RKHS setting and overview the existing results.
It is unclear whether the suboptimal regret bound is a fundamental shortcoming of these algorithms or an artifact of the proof.
arXiv Detail & Related papers (2021-10-28T22:36:20Z) - Hida-Matérn Kernel [8.594140167290098]
We present the class of Hida-Matérn kernels, which is the canonical family of covariance functions over the entire space of stationary Gauss-Markov Processes.
We show how to represent such processes as state space models using only the kernel and its derivatives.
We also show how exploiting special properties of the state space representation enables improved numerical stability in addition to further reductions of computational complexity.
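The state-space idea can be illustrated with the simplest Matérn kernel ($\nu = 1/2$, the exponential/OU covariance), which on a regular grid is realized exactly by a scalar AR(1) recursion; this is a textbook special case, not the paper's Hida-Matérn construction, and all hyperparameters below are illustrative.

```python
# State-space representation of the Matern-1/2 (exponential/OU) kernel:
# the recursion x_{t+1} = phi * x_t + w_t, w_t ~ N(0, q), reproduces the
# kernel's covariance exactly on a grid of step dt.
import numpy as np

sigma2, ell, dt = 1.3, 0.8, 0.05         # illustrative hyperparameters

phi = np.exp(-dt / ell)                  # AR(1) state transition
q = sigma2 * (1 - phi ** 2)              # process-noise variance

# Stationary covariance of the recursion: Cov(x_t, x_{t+k}) = phi**k * P,
# with P = q / (1 - phi**2) (equals sigma2 by construction).
P = q / (1 - phi ** 2)
lags = np.arange(50)
cov_state_space = P * phi ** lags

# Matern-1/2 kernel at the same lags: k(tau) = sigma2 * exp(-tau / ell).
cov_kernel = sigma2 * np.exp(-lags * dt / ell)

assert np.allclose(cov_state_space, cov_kernel)
print("AR(1) state-space covariance matches the Matern-1/2 kernel")
```

The computational payoff is that inference over the recursion is linear in the number of time points, versus cubic for a naive Gram-matrix approach.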
arXiv Detail & Related papers (2021-07-15T03:25:10Z) - On Function Approximation in Reinforcement Learning: Optimism in the
Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the function.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z) - Analysis via Orthonormal Systems in Reproducing Kernel Hilbert
$C^*$-Modules and Applications [12.117553807794382]
We propose a novel data analysis framework with reproducing kernel Hilbert $C^*$-modules (RKHMs).
We show the theoretical validity of the construction of orthonormal systems in Hilbert $C^*$-modules, and derive concrete procedures for orthonormalization in RKHMs.
We apply these results to generalize, via RKHMs, kernel principal component analysis and the analysis of dynamical systems with Perron-Frobenius operators.
arXiv Detail & Related papers (2020-03-02T10:01:14Z)
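A simplified analogue of the orthonormalization just mentioned (in an ordinary RKHS rather than an RKHM, which is my own simplification, not the paper's procedure) is Gram-Schmidt applied to kernel sections $k(x_i, \cdot)$, where every inner product reduces to a kernel evaluation via $\langle k(x_i,\cdot), k(x_j,\cdot)\rangle = k(x_i, x_j)$:

```python
# Gram-Schmidt orthonormalization of kernel sections k(x_i, .) in a plain
# RKHS: each function is represented by a coefficient vector c over the
# sections, so inner products become c1 @ K @ c2 for the Gram matrix K.
import numpy as np

def rbf(a, b, ell=1.0):
    return np.exp(-(a - b) ** 2 / (2 * ell ** 2))

x = np.array([0.0, 0.5, 1.5, 3.0])       # distinct points -> K positive definite
K = rbf(x[:, None], x[None, :])          # Gram matrix of the sections

basis = []
for i in range(len(x)):
    c = np.zeros(len(x)); c[i] = 1.0     # start from k(x_i, .)
    for b in basis:
        c = c - (b @ K @ c) * b          # subtract projections onto earlier basis
    c = c / np.sqrt(c @ K @ c)           # normalize in the RKHS norm
    basis.append(c)

# Orthonormality check: B K B^T should be the identity matrix.
B = np.array(basis)
assert np.allclose(B @ K @ B.T, np.eye(len(x)), atol=1e-8)
print("orthonormal system constructed")
```

In the RKHM setting of the paper, the scalar inner products above are replaced by $C^*$-algebra-valued ones, which is what makes the construction nontrivial.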
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.