Upper Bounds for Learning in Reproducing Kernel Hilbert Spaces for Non IID Samples
- URL: http://arxiv.org/abs/2410.08361v2
- Date: Thu, 02 Jan 2025 15:47:34 GMT
- Title: Upper Bounds for Learning in Reproducing Kernel Hilbert Spaces for Non IID Samples
- Authors: Priyanka Roy, Susanne Saminger-Platz
- Abstract summary: We study a Markov chain-based gradient algorithm in general Hilbert spaces, aiming to approximate the optimal solution of a quadratic loss function.
We extend these results to an online regularized learning algorithm in reproducing kernel Hilbert spaces.
- Score: 1.1510009152620668
- Abstract: In this paper, we study a Markov chain-based stochastic gradient algorithm in general Hilbert spaces, aiming to approximate the optimal solution of a quadratic loss function. We establish probabilistic upper bounds on its convergence. We further extend these results to an online regularized learning algorithm in reproducing kernel Hilbert spaces, where the samples are drawn along a Markov chain trajectory and are hence of the non-i.i.d. type.
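A minimal sketch of the second setting, assuming a Gaussian kernel, a synthetic AR(1)-type Markov chain as the sampling process, and illustrative step-size and regularization choices (none of the constants are the paper's): the online regularized iteration $f_{t+1} = (1 - \eta_t\lambda) f_t - \eta_t (f_t(x_t) - y_t) k(x_t, \cdot)$, with $f_t$ stored through its expansion coefficients on the visited points.

```python
import numpy as np

rng = np.random.default_rng(0)

def k(x, xp, s=0.5):
    """Gaussian kernel (assumed for illustration)."""
    return np.exp(-(x - xp) ** 2 / (2 * s ** 2))

f_star = lambda x: np.sin(3 * x)            # unknown regression function

# Samples drawn along a Markov chain trajectory (AR(1) walk) -> non-i.i.d.
X, x = [], 0.0
for t in range(500):
    x = 0.8 * x + 0.2 * rng.normal()
    X.append(x)
X = np.array(X)
Y = f_star(X) + 0.1 * rng.normal(size=X.size)

# Online regularized learning in the RKHS:
#   f_{t+1} = (1 - eta_t * lam) f_t - eta_t * (f_t(x_t) - y_t) k(x_t, .)
coef = np.zeros(X.size)                     # expansion coefficients of f_t
for t in range(X.size):
    eta, lam = 1.0 / np.sqrt(t + 1), 0.01
    pred = coef[:t] @ k(X[:t], X[t])        # f_t(x_t)
    coef[:t] *= (1 - eta * lam)             # regularization shrinkage
    coef[t] = -eta * (pred - Y[t])          # new expansion term

grid = np.linspace(-1, 1, 5)
print("target:", np.round(f_star(grid), 2))
print("fit:   ", np.round([coef @ k(X, g) for g in grid], 2))
```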
Related papers
- Mirror Descent on Reproducing Kernel Banach Spaces [12.716091600034543]
This paper addresses a learning problem on Banach spaces endowed with a reproducing kernel.
We propose an algorithm that employs gradient steps in the dual space of the Banach space using the reproducing kernel.
To instantiate this algorithm in practice, we introduce a novel family of RKBSs with $p$-norm.
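The RKBS machinery does not fit in a few lines, but the core mechanism (gradient steps taken in the dual space, with the primal iterate recovered through the norm-squared mirror map) can be sketched in finite dimensions. The toy least-squares objective and all constants below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def link(v, r):
    """Gradient of (1/2)||.||_r^2, the norm-squared mirror map."""
    nrm = np.linalg.norm(v, ord=r)
    if nrm == 0:
        return np.zeros_like(v)
    return nrm ** (2 - r) * np.sign(v) * np.abs(v) ** (r - 1)

p = 1.5                         # primal norm exponent, 1 < p <= 2
q = p / (p - 1)                 # dual exponent, 1/p + 1/q = 1

rng = np.random.default_rng(1)
A = rng.normal(size=(20, 5))    # toy least-squares problem (illustrative)
b = rng.normal(size=20)

theta = np.zeros(5)             # dual-space iterate
for t in range(5000):
    w = link(theta, q)          # recover primal point via the mirror map
    grad = A.T @ (A @ w - b)    # gradient of (1/2)||Aw - b||^2
    theta -= 0.005 * grad       # the gradient step lives in the dual space

print("residual norm:", np.linalg.norm(A @ link(theta, q) - b))
```

For $p = 2$ the link map is the identity and the scheme reduces to plain gradient descent.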
arXiv Detail & Related papers (2024-11-18T02:18:32Z) - Semi-Supervised Laplace Learning on Stiefel Manifolds [48.3427853588646]
We develop the Sequential Subspace framework for graph-based semi-supervised learning at low label rates.
We show that our methods perform well at both extremely low and high label rates.
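As a point of reference, the classical graph-based semi-supervised step this line of work builds on is Laplace learning: propagate a handful of labels by solving a Laplacian system. A minimal sketch on a synthetic two-cluster graph (the graph, labels, and thresholds are invented; the paper's Stiefel-manifold and Sequential Subspace components are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100                                       # two Gaussian clusters in the plane
X = np.vstack([rng.normal(0, 0.3, (n // 2, 2)),
               rng.normal(2, 0.3, (n // 2, 2))])
y = np.r_[np.zeros(n // 2), np.ones(n // 2)]

D2 = ((X[:, None] - X[None]) ** 2).sum(-1)    # pairwise squared distances
W = np.exp(-D2 / 0.2) * (D2 < 0.5)            # thresholded similarity graph
np.fill_diagonal(W, 0)
L = np.diag(W.sum(1)) - W                     # unnormalized graph Laplacian

lab = np.array([0, n - 1])                    # low label rate: 2 labeled nodes
unl = np.setdiff1d(np.arange(n), lab)
# Harmonic (Laplace learning) solution:  L_uu u = -L_ul y_l
L_uu = L[np.ix_(unl, unl)] + 1e-8 * np.eye(unl.size)   # tiny ridge for safety
u = np.linalg.solve(L_uu, -L[np.ix_(unl, lab)] @ y[lab])
print("accuracy with 2 labels:", ((u > 0.5) == y[unl]).mean())
```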
arXiv Detail & Related papers (2023-07-31T20:19:36Z) - Decentralized Online Learning for Random Inverse Problems Over Graphs [6.423798607256407]
We establish the stability of the algorithm in Hilbert spaces with $L_2$-bounded martingale difference terms.
We show that if the network graph is connected and the sequence of forward operators satisfies the infinite-dimensional spatio-temporal persistence of excitation condition, then the estimates of all nodes are mean-square consistent.
We propose a decentralized online learning algorithm in RKHS based on non-stationary online data.
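The paper works in RKHS with random forward operators; a finite-dimensional consensus-plus-innovation analogue conveys the shape of the update. The ring graph, operators, and gains below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
d, nodes = 5, 6
x_true = rng.normal(size=d)                   # common unknown to reconstruct

adj = np.zeros((nodes, nodes))                # connected ring graph
for i in range(nodes):
    adj[i, (i + 1) % nodes] = adj[i, (i - 1) % nodes] = 1

est = np.zeros((nodes, d))
for t in range(5000):
    eta = 1.0 / (t + 10)                      # decaying gain
    new = est.copy()
    for i in range(nodes):
        H = rng.normal(size=(1, d))           # random forward operator at node i
        y = H @ x_true + 0.05 * rng.normal()  # noisy local measurement
        innovation = (H.T @ (y - H @ est[i])).ravel()
        consensus = adj[i] @ (est - est[i])   # pull toward the neighbours
        new[i] = est[i] + eta * (innovation + consensus)
    est = new

print("max node error:", np.abs(est - x_true).max())
```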
arXiv Detail & Related papers (2023-03-20T08:37:08Z) - On the Benefits of Large Learning Rates for Kernel Methods [110.03020563291788]
We show that this phenomenon (the benefit of large learning rates) can be precisely characterized in the context of kernel methods.
We consider the minimization of a quadratic objective in a separable Hilbert space, and show that with early stopping, the choice of learning rate influences the spectral decomposition of the obtained solution.
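The mechanism is visible in a few lines: on a quadratic, gradient descent decouples across eigendirections, and early stopping acts as the spectral filter $1 - (1 - \eta\lambda_i)^t$ on each eigencomponent, so the learning rate decides which eigendirections are fit first. A toy numpy illustration with an arbitrary spectrum:

```python
import numpy as np

# Quadratic objective in the Hessian eigenbasis: f(w) = (1/2) w'Hw - b'w
lam = np.array([1.0, 0.1, 0.01])       # eigenvalues (spectrum) of H
b = np.ones(3)
w_star = b / lam                       # minimizer, coordinate-wise

for eta in (0.5, 1.9):                 # small vs. large learning rate
    w = np.zeros(3)
    for t in range(50):                # early stopping after 50 iterations
        w -= eta * (lam * w - b)       # GD decouples across eigendirections
    # after t steps: w_i = (1 - (1 - eta*lam_i)^t) w*_i, a spectral filter
    print(f"eta={eta}: remaining per-mode error",
          np.round(np.abs(w - w_star) / w_star, 3))
```

With the larger step, the small-eigenvalue modes are markedly better fit after the same number of iterations.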
arXiv Detail & Related papers (2022-02-28T13:01:04Z) - Polynomial convergence of iterations of certain random operators in Hilbert space [2.732936573198251]
We study the convergence of a random iterative sequence of a family of operators on infinite-dimensional spaces, inspired by the SGD algorithm.
We demonstrate that its convergence rate depends on the initial state, while randomness plays a role only in the choice of the best constant factor.
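A toy instance of such a random iteration, assuming SGD-inspired operators $T_i(x) = x - \eta\, a_i a_i^\top (x - x^*)$ with a common fixed point (in finite dimensions the decay is geometric; the paper's polynomial rates are an infinite-dimensional phenomenon):

```python
import numpy as np

rng = np.random.default_rng(5)
d = 6
a_rows = rng.normal(size=(40, d))          # directions defining the operator family
x_star = rng.normal(size=d)                # common fixed point
x = x_star + 10 * rng.normal(size=d)       # initial state

# T_i(x) = x - eta * a_i a_i^T (x - x_star): an SGD/Kaczmarz-type step for
# the consistent linear system a_i^T x = a_i^T x_star.
eta = 0.02
for t in range(1, 2001):
    a = a_rows[rng.integers(len(a_rows))]  # randomness: which operator is applied
    x = x - eta * a * (a @ (x - x_star))
    if t % 500 == 0:
        print(t, np.linalg.norm(x - x_star))
```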
arXiv Detail & Related papers (2022-02-04T17:48:29Z) - Optimal policy evaluation using kernel-based temporal difference methods [78.83926562536791]
We use reproducing kernel Hilbert spaces for estimating the value function of an infinite-horizon discounted Markov reward process (MRP).
We derive a non-asymptotic upper bound on the error with explicit dependence on the eigenvalues of the associated kernel operator.
We prove minimax lower bounds over sub-classes of MRPs.
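A hedged finite-dimensional sketch: one-hot features stand in for the kernel feature map, which reduces the estimator to classical regularized LSTD on a sampled trajectory. The MRP, features, and regularization below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
S, gamma, lam = 5, 0.9, 1e-3
P = rng.dirichlet(np.ones(S), size=S)         # random transition matrix
r = rng.uniform(size=S)                       # state rewards
v_true = np.linalg.solve(np.eye(S) - gamma * P, r)

s, states, rewards, nexts = 0, [], [], []     # roll out one trajectory
for _ in range(20000):
    s_next = rng.choice(S, p=P[s])
    states.append(s); rewards.append(r[s]); nexts.append(s_next)
    s = s_next

phi = np.eye(S)                               # one-hot features as kernel stand-in
F, Fn, R = phi[states], phi[nexts], np.array(rewards)
A = F.T @ (F - gamma * Fn) / len(R)           # regularized LSTD system
b = F.T @ R / len(R)
v_hat = phi @ np.linalg.solve(A + lam * np.eye(S), b)
print(np.round(np.c_[v_true, v_hat], 3))      # columns: true vs. estimated values
```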
arXiv Detail & Related papers (2021-09-24T14:48:20Z) - High-probability Bounds for Non-Convex Stochastic Optimization with Heavy Tails [55.561406656549686]
We consider non-convex stochastic optimization using first-order algorithms for which the gradient estimates may have heavy tails.
We show that a combination of gradient clipping, momentum, and normalized gradient descent converges to critical points with high probability at the best-known iteration complexity for smooth losses.
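A sketch of that combination on a toy smooth loss with Student-$t$ gradient noise (heavy tails); the loss, clipping level, and schedules are assumptions, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)

def stochastic_grad(w):
    """Gradient of (1/2)||w - 3||^2 plus heavy-tailed Student-t noise."""
    return (w - 3.0) + rng.standard_t(df=1.5, size=w.shape)  # infinite variance

w, m = np.zeros(2), np.zeros(2)
beta, clip = 0.99, 10.0
for t in range(5000):
    g = stochastic_grad(w)
    g *= min(1.0, clip / (np.linalg.norm(g) + 1e-12))   # clip the heavy tails
    m = beta * m + (1 - beta) * g                       # momentum averaging
    eta = 0.5 / np.sqrt(t + 1)
    w -= eta * m / (np.linalg.norm(m) + 1e-12)          # normalized descent step

print("final iterate:", np.round(w, 2), "(minimizer is (3, 3))")
```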
arXiv Detail & Related papers (2021-06-28T00:17:01Z) - Spectral clustering under degree heterogeneity: a case for the random walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
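A quick simulation of that claim, assuming a two-community degree-corrected block model with invented parameters; the embedding uses the eigenvectors of $D^{-1}A$, which are exactly those of the random walk Laplacian $L_{rw} = I - D^{-1}A$:

```python
import numpy as np

rng = np.random.default_rng(8)
n, z = 200, np.repeat([0, 1], 100)             # two communities
B = np.array([[0.9, 0.1], [0.1, 0.9]])         # block connection probabilities
theta = rng.uniform(0.2, 1.0, n)               # heterogeneous degree propensities
Pmat = 0.3 * np.outer(theta, theta) * B[np.ix_(z, z)]
A = np.triu((rng.uniform(size=(n, n)) < Pmat).astype(float), 1)
A = A + A.T                                    # symmetric adjacency, no self-loops

deg = np.maximum(A.sum(1), 1)                  # guard isolated nodes
vals, vecs = np.linalg.eig(A / deg[:, None])   # eigenvectors of D^{-1} A
emb = vecs[:, np.argsort(-vals.real)[:2]].real # 2-dim spectral embedding

for c in (0, 1):                               # each community hugs one point
    print(f"community {c}: mean {np.round(emb[z == c].mean(0), 3)}, "
          f"std {np.round(emb[z == c].std(0), 3)}")
```

Despite the heterogeneous degrees, the within-community spread of the embedding is small relative to the separation between the two community means.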
arXiv Detail & Related papers (2021-05-03T16:36:27Z) - Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling [110.88857917726276]
We provide a new convergence analysis of stochastic gradient Langevin dynamics (SGLD) for sampling from a class of distributions that can be non-log-concave.
At the core of our approach is a novel conductance analysis of SGLD using an auxiliary time-reversible Markov Chain.
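A minimal SGLD loop on a non-log-concave target, here a 1-d two-component Gaussian mixture with synthetic gradient noise (step size and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(9)

def grad_U(x):
    """Gradient of U(x) = -log( N(x; -2, 1) + N(x; +2, 1) ), non-log-concave."""
    w0 = np.exp(-0.5 * (x + 2) ** 2)
    w1 = np.exp(-0.5 * (x - 2) ** 2)
    return ((x + 2) * w0 + (x - 2) * w1) / (w0 + w1)

eta, x, samples = 0.05, 0.0, []
for t in range(50000):
    g = grad_U(x) + 0.5 * rng.normal()                 # stochastic gradient
    x = x - eta * g + np.sqrt(2 * eta) * rng.normal()  # SGLD update
    if t > 1000:                                       # discard burn-in
        samples.append(x)

# The target mixture has mean 0 and std ~ sqrt(5) ~ 2.24
print("sample mean/std:", np.mean(samples), np.std(samples))
```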
arXiv Detail & Related papers (2020-10-19T15:23:18Z) - Early stopping and polynomial smoothing in regression with reproducing kernels [2.0411082897313984]
We study the problem of early stopping for iterative learning algorithms in a reproducing kernel Hilbert space (RKHS).
We present a data-driven rule to perform early stopping without a validation set that is based on the so-called minimum discrepancy principle.
The proposed rule is proved to be minimax-optimal over different types of kernel spaces.
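A simplified version of such a rule for kernel gradient descent, assuming the noise level $\sigma^2$ is known rather than estimated (the paper's rule is fully data-driven): stop at the first iteration whose empirical squared residual falls below $\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(10)
n, sigma = 60, 0.3
X = np.sort(rng.uniform(-1, 1, n))
Y = np.sin(np.pi * X) + sigma * rng.normal(size=n)
K = np.exp(-(X[:, None] - X[None]) ** 2 / 0.1)    # Gaussian kernel matrix

eta = 0.5 * n / np.linalg.eigvalsh(K).max()       # safe step size
f = np.zeros(n)
# Minimum-discrepancy-style rule: stop once the empirical squared residual
# drops to the noise level (assumed known here; the paper estimates it).
for t in range(1, 5001):
    f += (eta / n) * K @ (Y - f)                  # kernel gradient descent
    if np.mean((Y - f) ** 2) <= sigma ** 2:
        break

print(f"stopped at t={t}")
print("error vs. truth:", np.mean((f - np.sin(np.pi * X)) ** 2))
```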
arXiv Detail & Related papers (2020-07-14T05:27:18Z) - Stochastic Gradient Descent in Hilbert Scales: Smoothness, Preconditioning and Earlier Stopping [0.0]
We consider least squares learning in reproducing kernel Hilbert spaces (RKHSs) and extend the classical SGD analysis to a learning setting in Hilbert scales.
We show that even for well-specified models, violation of a traditional benchmark smoothness assumption has a tremendous effect on the learning rate.
In addition, we show that for misspecified models, preconditioning in an appropriate Hilbert scale helps to reduce the number of iterations.
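A finite-dimensional caricature of the preconditioning effect, with an invented polynomially decaying spectrum: running gradient descent with the preconditioner $P = H^{-a}$ (a power along the scale) flattens the spectrum and cuts the iteration count.

```python
import numpy as np

d = 50
lam = 1.0 / np.arange(1, d + 1) ** 2      # polynomially decaying spectrum
H = np.diag(lam)                          # Hessian in its eigenbasis
w_star = np.ones(d)
b = H @ w_star

def steps_to_tol(P, tol=1e-3, max_t=200000):
    """Preconditioned GD  w += eta * P (b - H w); count steps to tolerance."""
    eta = 1.0 / np.max(np.diag(P) * lam)  # 1 / largest eigenvalue of P H
    w = np.zeros(d)
    for t in range(1, max_t + 1):
        w = w + eta * P @ (b - H @ w)
        if np.linalg.norm(w - w_star) < tol:
            return t
    return max_t

a = 0.5                                   # exponent along the Hilbert scale
print("plain GD steps:        ", steps_to_tol(np.eye(d)))
print("preconditioned (H^-a): ", steps_to_tol(np.diag(lam ** -a)))
```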
arXiv Detail & Related papers (2020-06-18T20:22:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.