Reproducing kernel Hilbert C*-module and kernel mean embeddings
- URL: http://arxiv.org/abs/2101.11410v1
- Date: Wed, 27 Jan 2021 14:02:18 GMT
- Title: Reproducing kernel Hilbert C*-module and kernel mean embeddings
- Authors: Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura, Takeshi
Katsura, and Yoshinobu Kawahara
- Abstract summary: We propose a novel data analysis framework with reproducing kernel Hilbert $C^*$-module (RKHM) and kernel mean embedding (KME) in RKHM.
We develop a body of theory for applying RKHM to data analysis, including the representer theorem and the injectivity and universality of the proposed KME.
- Score: 12.268585269921404
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kernel methods have been among the most popular techniques in machine
learning, where learning tasks are solved using the property of reproducing
kernel Hilbert space (RKHS). In this paper, we propose a novel data analysis
framework with reproducing kernel Hilbert $C^*$-module (RKHM) and kernel mean
embedding (KME) in RKHM. Since RKHM contains richer information than RKHS or
vector-valued RKHS (vv RKHS), analysis with RKHM enables us to capture and
extract structural properties in multivariate data, functional data and other
structured data. We develop a body of theory for RKHM needed in data
analysis, including the representer theorem and the injectivity and
universality of the proposed KME. We also show that RKHM generalizes RKHS and vv
RKHS. Then, we provide concrete procedures for employing RKHM and the proposed
KME to data analysis.
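For context on the classical construction the paper generalizes: in an ordinary RKHS, the kernel mean embedding of a distribution is estimated by averaging the feature maps of samples, and the distance between two embeddings is the maximum mean discrepancy (MMD). Below is a minimal NumPy sketch of this standard RKHS-level construction (not the paper's RKHM version); the Gaussian kernel, bandwidth, and sample sizes are illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gram matrix k(x_i, y_j) = exp(-||x_i - y_j||^2 / (2 sigma^2))."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Squared MMD between the empirical KMEs of samples X and Y:
    ||mu_X - mu_Y||^2 = mean k(X,X) - 2 mean k(X,Y) + mean k(Y,Y)."""
    return (gaussian_kernel(X, X, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))   # samples from P
Y = rng.normal(0.5, 1.0, size=(200, 2))   # samples from Q
print(mmd2(X, Y))  # > 0 when P != Q, up to sampling noise
```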
Related papers
- Convolutional Filtering with RKHS Algebras [110.06688302593349]
We develop a theory of convolutional signal processing and neural networks for Reproducing Kernel Hilbert Spaces (RKHS).
We show that any RKHS allows the formal definition of multiple algebraic convolutional models.
We present a set of numerical experiments on real data in which wireless coverage is predicted from measurements captured by unmanned aerial vehicles.
arXiv Detail & Related papers (2024-11-02T18:53:44Z)
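As a rough illustration of the algebraic viewpoint in this paper, a convolutional filter can be realized as a polynomial in a shift operator. The sketch below uses a kernel Gram matrix over sample points as the shift, which is one simple instantiation chosen here for illustration, not the paper's construction; the filter coefficients and bandwidth are arbitrary.

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    """RBF Gram matrix over the sample points X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma**2))

def polynomial_filter(S, coeffs, x):
    """Apply h(S) x = sum_k coeffs[k] S^k x, i.e. the algebraic
    convolution of the signal x by the filter h in the shift S."""
    out = np.zeros_like(x)
    Sk_x = x.copy()
    for c in coeffs:
        out += c * Sk_x
        Sk_x = S @ Sk_x
    return out

rng = np.random.default_rng(1)
pts = rng.uniform(size=(50, 2))        # sample locations
S = rbf_gram(pts, sigma=0.3)           # kernel-induced shift operator
S /= np.linalg.norm(S, 2)              # normalize spectrum for stability
signal = rng.normal(size=50)
filtered = polynomial_filter(S, [0.5, 0.3, 0.2], signal)
```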
- Compressed Online Learning of Conditional Mean Embedding [11.720101697635148]
The conditional mean embedding (CME) encodes Markovian kernels through their actions on probability distributions.
We present an algorithm to learn the CME incrementally from data via operator-valued gradient descent.
arXiv Detail & Related papers (2024-05-13T02:18:49Z)
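For reference, the standard batch (kernel ridge) estimator of the conditional mean embedding, which the paper's compressed online scheme is designed to replace, looks as follows. The RBF kernel, regularization value, and toy regression data are choices made for this sketch.

```python
import numpy as np

def rbf(X, Y, sigma=1.0):
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma**2))

class CME:
    """Batch ridge estimator of the conditional mean embedding.

    Given pairs (x_i, y_i), E[f(Y) | X = x] is approximated by
    f(y)^T (K_X + n*lam*I)^{-1} k_X(x), the standard kernel ridge
    form of the CME (not the paper's compressed online update).
    """
    def __init__(self, X, Y, sigma=1.0, lam=1e-3):
        self.X, self.Y, self.sigma = X, Y, sigma
        n = len(X)
        self.W = np.linalg.solve(rbf(X, X, sigma) + n * lam * np.eye(n),
                                 np.eye(n))

    def conditional_mean(self, f_of_Y, x):
        """Estimate E[f(Y) | X = x] from the vector f_of_Y = f(y_i)."""
        beta = self.W @ rbf(self.X, x[None, :], self.sigma)[:, 0]
        return f_of_Y @ beta

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, size=(300, 1))
Y = np.sin(X) + 0.1 * rng.normal(size=X.shape)   # Y | X = sin(X) + noise
cme = CME(X, Y)
print(cme.conditional_mean(Y[:, 0], np.array([1.0])))  # ~ sin(1) ~ 0.84
```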
- STEERING: Stein Information Directed Exploration for Model-Based Reinforcement Learning [111.75423966239092]
We propose an exploration incentive in terms of the integral probability metric (IPM) between a current estimate of the transition model and the unknown optimal one.
Based on the kernelized Stein discrepancy (KSD), we develop a novel algorithm, STEERING: STEin information dirEcted exploration for model-based Reinforcement learnING.
arXiv Detail & Related papers (2023-01-28T00:49:28Z)
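The kernelized Stein discrepancy underlying STEERING can be estimated from samples given only the score function of the target density. Below is a standard V-statistic KSD estimator with an RBF kernel, a textbook construction rather than the STEERING algorithm itself; the bandwidth and the Gaussian sanity check are choices made here.

```python
import numpy as np

def ksd_squared(X, score, sigma=1.0):
    """V-statistic estimate of the squared kernelized Stein discrepancy
    between samples X and a density p, given only the score function
    s(x) = grad log p(x), using the RBF kernel."""
    n, d = X.shape
    S = score(X)                                  # (n, d) scores at samples
    diff = X[:, None, :] - X[None, :, :]          # x_i - x_j
    sq = (diff ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma**2))
    grad_x = -diff / sigma**2 * K[..., None]      # grad_x k(x_i, x_j)
    grad_y = -grad_x                              # grad_y k(x_i, x_j)
    trace = (d / sigma**2 - sq / sigma**4) * K    # tr grad_x grad_y k
    u = ((S[:, None, :] * S[None, :, :]).sum(-1) * K   # Stein kernel u_p
         + (S[:, None, :] * grad_y).sum(-1)
         + (S[None, :, :] * grad_x).sum(-1)
         + trace)
    return u.mean()

# Sanity check against a standard normal, whose score is s(x) = -x.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
print(ksd_squared(X, score=lambda X: -X))  # close to 0 for a matched p
```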
- Learning in RKHM: a $C^*$-Algebraic Twist for Kernel Machines [13.23700804428796]
Supervised learning in reproducing kernel Hilbert space (RKHS) and vector-valued RKHS (vvRKHS) has been investigated for more than 30 years.
We provide a new twist by generalizing supervised learning in RKHS and vvRKHS to reproducing kernel Hilbert $C^*$-module (RKHM).
We show how to construct effective positive-definite kernels by considering the perspective of $C^*$-algebra.
arXiv Detail & Related papers (2022-10-21T10:23:54Z)
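One elementary way to obtain a positive-definite kernel taking values in a $C^*$-algebra (here, real d x d matrices) is the separable construction k(x, y) = g(x, y) A, with g a scalar positive-definite kernel and A a fixed PSD matrix. This is a simple textbook recipe, not the paper's more general $C^*$-algebraic constructions; the sketch checks that the resulting block Gram matrix is PSD.

```python
import numpy as np

def scalar_rbf(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma**2))

def matrix_valued_kernel(x, y, A, sigma=1.0):
    """Separable matrix-valued kernel k(x, y) = g(x, y) * A, where g is
    a scalar PD kernel and A is a fixed PSD matrix (the C*-algebra here
    being d x d real matrices)."""
    return scalar_rbf(x, y, sigma) * A

def block_gram(X, A, sigma=1.0):
    """Assemble the block Gram matrix [k(x_i, x_j)]_{ij}; separability
    makes it a Kronecker product of PSD factors, hence PSD."""
    n, d = len(X), A.shape[0]
    G = np.zeros((n * d, n * d))
    for i in range(n):
        for j in range(n):
            G[i*d:(i+1)*d, j*d:(j+1)*d] = matrix_valued_kernel(
                X[i], X[j], A, sigma)
    return G

rng = np.random.default_rng(4)
B = rng.normal(size=(3, 3))
A = B @ B.T                          # PSD "algebra-valued" amplitude
X = rng.normal(size=(10, 2))
G = block_gram(X, A)
print(np.linalg.eigvalsh(G).min() >= -1e-10)  # PSD check: True
```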
- Meta-Learning Hypothesis Spaces for Sequential Decision-making [79.73213540203389]
We propose to meta-learn a kernel from offline data (Meta-KeL).
Under mild conditions, we guarantee that our estimated RKHS yields valid confidence sets.
We also empirically evaluate the effectiveness of our approach on a Bayesian optimization task.
arXiv Detail & Related papers (2022-02-01T17:46:51Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data.
However, the geometric operations involved are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
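For a Euclidean baseline of the method being extended: with a GP prior and an RBF kernel, the Bayesian quadrature posterior mean of an integral against a standard normal has closed-form kernel-mean weights. The sketch below is this vanilla one-dimensional case, not the Riemannian version developed in the paper; the lengthscale, jitter, and test integrand are choices made here.

```python
import numpy as np

def bq_integral(x, f_x, ell=1.0, noise=1e-6):
    """Bayesian quadrature estimate of E_{x ~ N(0,1)}[f(x)].

    GP prior on f with RBF kernel k(a, b) = exp(-(a-b)^2 / (2 ell^2));
    the posterior mean of the integral is z^T (K + noise*I)^{-1} f(x),
    where z_i = int k(x, x_i) dN(x; 0, 1) has the closed form
    sqrt(ell^2/(ell^2+1)) exp(-x_i^2 / (2 (ell^2+1))).
    """
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * ell**2))
    z = np.sqrt(ell**2 / (ell**2 + 1)) * np.exp(-x**2 / (2 * (ell**2 + 1)))
    weights = np.linalg.solve(K + noise * np.eye(len(x)), z)
    return weights @ f_x

# E[x^2] under N(0,1) is exactly 1; BQ recovers it from few evaluations.
x = np.linspace(-3, 3, 15)
print(bq_integral(x, x**2))   # ~ 1.0
```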
- Kernel learning approaches for summarising and combining posterior similarity matrices [68.8204255655161]
We build upon the notion of the posterior similarity matrix (PSM) in order to suggest new approaches for summarising the output of MCMC algorithms for Bayesian clustering models.
A key contribution of our work is the observation that PSMs are positive semi-definite, and hence can be used to define probabilistically-motivated kernel matrices.
arXiv Detail & Related papers (2020-09-27T14:16:14Z)
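The positive semi-definiteness observation is easy to verify directly: each MCMC draw's co-clustering matrix is a sum of indicator outer products, so their average, the PSM, is PSD and can serve as a kernel matrix. A small sketch with synthetic label draws (the toy "MCMC output" below is random, purely for illustration):

```python
import numpy as np

def posterior_similarity_matrix(labels):
    """PSM[i, j] = fraction of MCMC draws in which items i and j share
    a cluster. Each draw's co-clustering matrix is a sum of indicator
    outer products, hence PSD, and so is their average."""
    draws, n = labels.shape
    psm = np.zeros((n, n))
    for z in labels:
        psm += (z[:, None] == z[None, :])
    return psm / draws

# Toy "MCMC output": 100 draws of cluster labels for 6 items.
rng = np.random.default_rng(5)
labels = rng.integers(0, 2, size=(100, 6))
psm = posterior_similarity_matrix(labels)
print(np.linalg.eigvalsh(psm).min() >= -1e-10)   # PSD: True
```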
- Kernel Mean Embeddings of Von Neumann-Algebra-Valued Measures [13.000268576445018]
Kernel mean embedding (KME) is a powerful tool for analyzing probability measures arising from data.
We generalize KME to embeddings of von Neumann-algebra-valued measures into reproducing kernel Hilbert modules (RKHMs).
arXiv Detail & Related papers (2020-07-29T09:26:39Z)
- A Kernel-Based Approach to Non-Stationary Reinforcement Learning in Metric Spaces [53.47210316424326]
KeRNS is an algorithm for episodic reinforcement learning in non-stationary Markov Decision Processes.
We prove a regret bound that scales with the covering dimension of the state-action space and the total variation of the MDP with time.
arXiv Detail & Related papers (2020-07-09T21:37:13Z)
- A Mean-Field Theory for Learning the Schönberg Measure of Radial Basis Functions [13.503048325896174]
We learn the distribution in the Schönberg integral representation of the radial basis functions from training samples.
We prove that in the scaling limits, the empirical measure of the Langevin particles converges to the law of a reflected Itô diffusion-drift process.
arXiv Detail & Related papers (2020-06-23T21:04:48Z)
- Analysis via Orthonormal Systems in Reproducing Kernel Hilbert $C^*$-Modules and Applications [12.117553807794382]
We propose a novel data analysis framework with reproducing kernel Hilbert $C^*$-module (RKHM).
We show the theoretical validity of the construction of orthonormal systems in Hilbert $C^*$-modules, and derive concrete procedures for orthonormalization in RKHMs.
We apply these results to generalize kernel principal component analysis to RKHMs and to analyze dynamical systems with Perron-Frobenius operators.
arXiv Detail & Related papers (2020-03-02T10:01:14Z)
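In the scalar RKHS case, the orthonormalization procedure has a compact linear-algebra form: the Gram matrix collects the inner products of the kernel sections k(., x_i), so a Cholesky factorization yields Gram-Schmidt coefficients. The sketch below shows this RKHS analogue; the paper's contribution is the more delicate $C^*$-module version, where scalar Cholesky factors must be replaced by algebra-valued analogues.

```python
import numpy as np

def orthonormalize_features(G):
    """Orthonormalize the kernel sections k(., x_i) in an RKHS.

    With Gram matrix G[i, j] = <k(., x_i), k(., x_j)> = k(x_i, x_j),
    the Cholesky factorization G = L L^T gives coefficients C = L^{-1}
    such that e_i = sum_j C[i, j] k(., x_j) satisfy <e_i, e_j> = delta_ij,
    since C G C^T = L^{-1} (L L^T) L^{-T} = I.
    """
    L = np.linalg.cholesky(G)
    return np.linalg.solve(L, np.eye(G.shape[0]))

rng = np.random.default_rng(6)
X = rng.normal(size=(8, 2))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
G = np.exp(-sq / 2) + 1e-8 * np.eye(8)    # RBF Gram matrix + jitter
C = orthonormalize_features(G)
print(np.allclose(C @ G @ C.T, np.eye(8), atol=1e-6))  # True
```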
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.