Analysis via Orthonormal Systems in Reproducing Kernel Hilbert
$C^*$-Modules and Applications
- URL: http://arxiv.org/abs/2003.00738v1
- Date: Mon, 2 Mar 2020 10:01:14 GMT
- Title: Analysis via Orthonormal Systems in Reproducing Kernel Hilbert
$C^*$-Modules and Applications
- Authors: Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi
Katsura, Yoshinobu Kawahara
- Abstract summary: We propose a novel data analysis framework with reproducing kernel Hilbert $C*$-module (RKHM)
We show the theoretical validity for the construction of orthonormal systems in Hilbert $C*$-modules, and derive concrete procedures for orthonormalization in RKHMs.
We apply those to generalize with RKHM kernel principal component analysis and the analysis of dynamical systems with Perron-Frobenius operators.
- Score: 12.117553807794382
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Kernel methods have been among the most popular techniques in machine
learning, where learning tasks are solved using the property of reproducing
kernel Hilbert space (RKHS). In this paper, we propose a novel data analysis
framework with reproducing kernel Hilbert $C^*$-module (RKHM), a
generalization of RKHS distinct from vector-valued RKHS (vv-RKHS). Analysis with RKHMs
enables us to deal with structures among variables more explicitly than
vv-RKHS does. We establish the theoretical validity of constructing orthonormal
systems in Hilbert $C^*$-modules, and derive concrete orthonormalization
procedures for RKHMs that preserve these theoretical properties in numerical
computations. Moreover, we apply these results to generalize kernel principal
component analysis and the analysis of dynamical systems with Perron-Frobenius
operators to the RKHM setting. The empirical performance of our methods is also
investigated using synthetic and real-world data.
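As a point of reference for the RKHM generalization above, classical (scalar-valued, RKHS) kernel PCA, the special case the paper generalizes, can be sketched in a few lines. This is an illustrative NumPy sketch under a standard RBF-kernel assumption, not the paper's $C^*$-module algorithm; all function and parameter names are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_pca(X, n_components=2, gamma=1.0):
    # Classical RKHS kernel PCA: eigendecompose the centered Gram matrix.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    J = np.eye(n) - np.ones((n, n)) / n      # centering in feature space
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Normalize so each principal direction has unit norm in feature space.
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Kc @ alphas                        # projections of training points
```

The RKHM version replaces the scalar Gram matrix by a $C^*$-algebra-valued one and the scalar eigendecomposition by orthonormalization in the module, which is exactly where the paper's orthonormal-system construction enters.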
Related papers
- Convolutional Filtering with RKHS Algebras [110.06688302593349]
We develop a theory of convolutional signal processing and neural networks for Reproducing Kernel Hilbert Spaces (RKHS)
We show that any RKHS allows the formal definition of multiple algebraic convolutional models.
We present a set of numerical experiments on real data in which wireless coverage is predicted from measurements captured by unmanned aerial vehicles.
arXiv Detail & Related papers (2024-11-02T18:53:44Z)
- Learning Analysis of Kernel Ridgeless Regression with Asymmetric Kernel Learning [33.34053480377887]
This paper enhances kernel ridgeless regression with Locally-Adaptive-Bandwidths (LAB) RBF kernels.
For the first time, we demonstrate that functions learned from LAB RBF kernels belong to an integral space of Reproducing Kernel Hilbert Spaces (RKHSs).
arXiv Detail & Related papers (2024-06-03T15:28:12Z)
- Deep Learning with Kernels through RKHM and the Perron-Frobenius Operator [14.877070496733966]
Reproducing kernel Hilbert $C^*$-module (RKHM) is a generalization of reproducing kernel Hilbert space (RKHS) by means of $C^*$-algebra.
We derive a new Rademacher generalization bound in this setting and provide a theoretical interpretation of benign overfitting by means of Perron-Frobenius operators.
arXiv Detail & Related papers (2023-05-23T01:38:41Z)
- Learning in RKHM: a $C^*$-Algebraic Twist for Kernel Machines [13.23700804428796]
Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years.
We provide a new twist by generalizing supervised learning in RKHS and vvRKHS to reproducing kernel Hilbert $C^*$-module (RKHM).
We show how to construct effective positive-definite kernels from the perspective of $C^*$-algebra.
arXiv Detail & Related papers (2022-10-21T10:23:54Z)
- Experimental Design for Linear Functionals in Reproducing Kernel Hilbert Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z)
- Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z)
- Optimal policy evaluation using kernel-based temporal difference methods [78.83926562536791]
We use reproducing kernel Hilbert spaces for estimating the value function of an infinite-horizon discounted Markov reward process (MRP).
We derive a non-asymptotic upper bound on the error with explicit dependence on the eigenvalues of the associated kernel operator.
We prove minimax lower bounds over sub-classes of MRPs.
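The kernel temporal-difference idea can be illustrated with a minimal kernel LSTD-style solver: represent the value function as a kernel expansion over observed states and solve the sampled Bellman equations in closed form. This is a generic sketch under standard assumptions (RBF kernel, sampled transitions), not the specific estimator analyzed in that paper; all names are illustrative.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_lstd(S, S_next, r, discount=0.9, bw=1.0, reg=1e-6):
    # Fit V(s) = sum_i alpha_i k(s, s_i) to the sampled Bellman equations
    #   V(s_t) = r_t + discount * V(s_{t+1}).
    K = rbf(S, S, bw)         # k(s_t, s_i) at current states
    Kn = rbf(S_next, S, bw)   # k(s_{t+1}, s_i) at successor states
    alpha = np.linalg.solve(K - discount * Kn + reg * np.eye(len(S)), r)
    return lambda s: rbf(np.atleast_2d(s), S, bw) @ alpha
```

For a trivial MRP with constant reward 1 and self-transitions, the fitted value at any observed state should be close to 1/(1 - discount).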
arXiv Detail & Related papers (2021-09-24T14:48:20Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Estimating Koopman operators for nonlinear dynamical systems: a nonparametric approach [77.77696851397539]
The Koopman operator is a mathematical tool that allows for a linear description of non-linear systems.
In this paper we capture its core essence as a dual version of the same framework and incorporate it into the kernel framework.
We establish a strong link between kernel methods and Koopman operators, leading to the estimation of the latter through Kernel functions.
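A minimal kernel-EDMD-style sketch illustrates how Koopman eigenvalues can be estimated through kernel functions from snapshot pairs; the kernel choice, regularization, and names here are assumptions for illustration, not that paper's exact procedure.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_koopman_eigs(X, Y, gamma=1.0, reg=1e-8):
    # X holds states x_t, Y their successors x_{t+1}; the Koopman operator
    # is represented on the span of kernel sections at the points in X.
    G = rbf(X, X, gamma)             # Gram matrix <phi(x_i), phi(x_j)>
    A = rbf(X, Y, gamma)             # cross-Gram <phi(x_i), phi(y_j)>
    # Regularized least-squares estimate of the finite-rank Koopman matrix.
    K_hat = np.linalg.solve(G + reg * np.eye(len(X)), A)
    return np.linalg.eigvals(K_hat)  # approximate Koopman eigenvalues
```

For identity dynamics (Y equal to X), the estimated eigenvalues are the shrunken Gram eigenvalues and stay inside the unit disk, matching the linear-description intuition above.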
arXiv Detail & Related papers (2021-03-25T11:08:26Z)
- Reproducing kernel Hilbert C*-module and kernel mean embeddings [12.268585269921404]
We propose a novel data analysis framework with reproducing kernel Hilbert $C^*$-module (RKHM) and kernel mean embedding (KME) in RKHM.
We develop a branch of theory for RKHM to apply to data analysis, including the representer theorem, and the injectivity and universality of the proposed KME.
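The classical scalar-RKHS kernel mean embedding that the RKHM-valued KME generalizes can be sketched via the (squared) maximum mean discrepancy between two samples. An illustrative sketch under an RBF-kernel assumption, not the RKHM construction itself:

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=1.0):
    # Squared RKHS distance between the empirical kernel mean embeddings
    # of two samples: ||mu_X - mu_Y||^2_H, expanded into Gram-matrix means.
    return (rbf(X, X, gamma).mean()
            - 2 * rbf(X, Y, gamma).mean()
            + rbf(Y, Y, gamma).mean())
```

Injectivity of the embedding (for a characteristic kernel) is what makes a zero MMD imply equal distributions; the RKHM paper extends this picture to $C^*$-algebra-valued embeddings.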
arXiv Detail & Related papers (2021-01-27T14:02:18Z)
- Mathematical foundations of stable RKHSs [1.52292571922932]
Reproducing kernel Hilbert spaces (RKHSs) are key function spaces for machine learning that are also becoming popular in linear system identification.
In this paper we provide new structural properties of stable RKHSs.
arXiv Detail & Related papers (2020-05-06T17:25:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.