Analysis of Regularized Least Squares in Reproducing Kernel Krein Spaces
- URL: http://arxiv.org/abs/2006.01073v2
- Date: Tue, 24 Nov 2020 19:18:06 GMT
- Title: Analysis of Regularized Least Squares in Reproducing Kernel Krein Spaces
- Authors: Fanghui Liu, Lei Shi, Xiaolin Huang, Jie Yang and Johan A.K. Suykens
- Abstract summary: We study the properties of regularized least squares with indefinite kernels in kernel Krein spaces.
We derive learning rates in kernel Krein spaces that are the same as those in kernel Hilbert spaces.
- Score: 35.091152823482105
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we study the asymptotic properties of regularized least
squares with indefinite kernels in reproducing kernel Krein spaces (RKKS). By
introducing a bounded hyper-sphere constraint into this non-convex regularized
risk minimization problem, we theoretically demonstrate that this problem has a
globally optimal solution with a closed form on the sphere, which makes
approximation analysis feasible in RKKS. For the original regularizer
induced by the indefinite inner product, we modify traditional error
decomposition techniques, prove convergence results for the introduced
hypothesis error based on matrix perturbation theory, and derive learning rates
for this regularized regression problem in RKKS. Under some conditions, the
derived learning rates in RKKS are the same as those in reproducing kernel
Hilbert spaces (RKHS); this is the first work on approximation analysis of
regularized learning algorithms in RKKS.
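To make the setup concrete, the sketch below (not the authors' algorithm) instantiates regularized least squares with a toy indefinite kernel, here a weighted difference of two Gaussian kernels chosen purely for illustration, and computes a stationary point of the reduced objective (1/n)||K a - y||^2 + lam * a^T K a obtained from the representer form f = sum_i a_i k(., x_i). The hyper-sphere constraint that the paper uses to guarantee a globally optimal closed-form solution is not implemented here.

```python
import numpy as np

def indefinite_kernel(X1, X2, s_narrow=1.0, s_wide=3.0, w=0.6):
    """Toy indefinite kernel: a Gaussian minus a scaled wider Gaussian.
    Placeholder choice for illustration only; not from the paper."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * s_narrow**2)) - w * np.exp(-d2 / (2 * s_wide**2))

def rkks_rls_stationary_point(X, y, lam=1e-2):
    """Stationary point of (1/n)||K a - y||^2 + lam * a^T K a.
    Setting the gradient (2/n) K [(K + n*lam*I) a - y] to zero gives
    a = (K + n*lam*I)^{-1} y, formally the same as kernel ridge regression,
    but with an indefinite K the objective is non-convex and this point
    need not be a global minimum (the paper's hyper-sphere constraint is
    what restores a global, closed-form optimum)."""
    n = len(X)
    K = indefinite_kernel(X, X)
    # Assumes K + n*lam*I is non-singular; with an indefinite K it can
    # have negative eigenvalues, unlike the positive-definite RKHS case.
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return alpha, K

# Usage on synthetic 1-D data.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
alpha, K = rkks_rls_stationary_point(X, y, lam=1e-2)
X_test = np.linspace(-3.0, 3.0, 5).reshape(-1, 1)
print(indefinite_kernel(X_test, X) @ alpha)  # predictions at test points
```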
Related papers
- Learning Analysis of Kernel Ridgeless Regression with Asymmetric Kernel Learning [33.34053480377887]
This paper enhances kernel ridgeless regression with Locally-Adaptive-Bandwidths (LAB) RBF kernels.
For the first time, we demonstrate that functions learned from LAB RBF kernels belong to an integral space of Reproducing Kernel Hilbert Spaces (RKHSs).
arXiv Detail & Related papers (2024-06-03T15:28:12Z)
- Characterizing Overfitting in Kernel Ridgeless Regression Through the Eigenspectrum [6.749750044497731]
We prove the phenomena of tempered overfitting and catastrophic overfitting under the sub-Gaussian design assumption.
We also identify that the independence of the features plays an important role in guaranteeing tempered overfitting.
arXiv Detail & Related papers (2024-02-02T10:36:53Z)
- Weakly Convex Regularisers for Inverse Problems: Convergence of Critical Points and Primal-Dual Optimisation [12.455342327482223]
We present a generalised formulation of convergent regularisation in terms of critical points.
We show that this is achieved by a class of weakly convex regularisers.
Applying this theory to learned regularisation, we prove universal approximation for input weakly convex neural networks.
arXiv Detail & Related papers (2024-02-01T22:54:45Z)
- A Theoretical Analysis of the Test Error of Finite-Rank Kernel Ridge Regression [23.156642467474995]
Finite-rank kernels naturally appear in several machine learning problems, e.g., when fine-tuning a pre-trained deep neural network's last layer to adapt it to a novel task.
We address this gap by deriving sharp non-asymptotic upper and lower bounds for the test error of any finite-rank KRR.
Our bounds are tighter than previously derived bounds on finite-rank KRR, and unlike comparable results, they also remain valid for any regularization parameters.
arXiv Detail & Related papers (2023-10-02T08:52:29Z)
- Sampling Error Analysis in Quantum Krylov Subspace Diagonalization [1.3108652488669736]
We present a nonasymptotic theoretical framework to assess the relationship between sampling noise and its effects on eigenvalues.
We also propose an optimal solution to cope with large condition numbers by eliminating the ill-conditioned bases.
arXiv Detail & Related papers (2023-07-30T17:22:35Z)
- On the Sublinear Regret of GP-UCB [58.25014663727544]
We show that the Gaussian Process Upper Confidence Bound (GP-UCB) algorithm enjoys nearly optimal regret rates.
Our improvements rely on a key technical contribution -- regularizing kernel ridge estimators in proportion to the smoothness of the underlying kernel.
arXiv Detail & Related papers (2023-07-14T13:56:11Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z)
- Beyond Tikhonov: Faster Learning with Self-Concordant Losses via Iterative Regularization [120.31448970413298]
We extend the theory of Tikhonov regularization to generalized self-concordant (GSC) loss functions.
We show that fast and optimal rates can be achieved for GSC losses by using the iterated Tikhonov regularization scheme (a generic sketch of this scheme appears after this list).
arXiv Detail & Related papers (2021-06-16T15:25:41Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
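For reference, the "Beyond Tikhonov" entry above mentions the iterated Tikhonov regularization scheme; below is a minimal sketch of the classical least-squares version of that scheme for kernel regression. The paper's actual contribution, its extension to generalized self-concordant losses, is not reproduced here, and the Gaussian kernel and parameter values are placeholders.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    # Plain RBF kernel; sigma is a placeholder bandwidth.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def iterated_tikhonov(K, y, lam=1e-2, n_iter=3):
    """Classical iterated Tikhonov for kernel least squares:
    alpha_{t+1} = alpha_t + (K + n*lam*I)^{-1} (y - K @ alpha_t),
    equivalently alpha_{t+1} = (K + n*lam*I)^{-1} (y + n*lam*alpha_t).
    n_iter = 1 recovers ordinary kernel ridge regression; additional
    passes raise the qualification of the regularization scheme."""
    n = K.shape[0]
    A = K + n * lam * np.eye(n)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        alpha = alpha + np.linalg.solve(A, y - K @ alpha)
    return alpha

# Usage on synthetic data.
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(50, 1))
y = np.cos(2 * X[:, 0]) + 0.1 * rng.standard_normal(50)
K = gaussian_kernel(X, X)
alpha = iterated_tikhonov(K, y, lam=1e-3, n_iter=3)
print(gaussian_kernel(X[:5], X) @ alpha)  # fitted values at first 5 points
```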
This list is automatically generated from the titles and abstracts of the papers listed on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.