Kernel Mean Embeddings of Von Neumann-Algebra-Valued Measures
- URL: http://arxiv.org/abs/2007.14698v1
- Date: Wed, 29 Jul 2020 09:26:39 GMT
- Title: Kernel Mean Embeddings of Von Neumann-Algebra-Valued Measures
- Authors: Yuka Hashimoto, Isao Ishikawa, Masahiro Ikeda, Fuyuta Komura,
Yoshinobu Kawahara
- Abstract summary: Kernel mean embedding (KME) is a powerful tool for analyzing probability measures of data.
We generalize KME to embed von Neumann-algebra-valued measures into reproducing kernel Hilbert modules (RKHMs).
- Score: 13.000268576445018
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Kernel mean embedding (KME) is a powerful tool for analyzing probability
measures of data, where the measures are conventionally embedded into a
reproducing kernel Hilbert space (RKHS). In this paper, we generalize KME to
the embedding of von Neumann-algebra-valued measures into reproducing kernel
Hilbert modules (RKHMs), which provides an inner product and a distance between
von Neumann-algebra-valued measures. Von Neumann-algebra-valued measures can,
for example, encode relations between arbitrary pairs of variables in a
multivariate distribution, or positive operator-valued measures in quantum
mechanics. This allows us to perform probabilistic analyses that explicitly
reflect higher-order interactions among variables, and provides a way of
applying machine-learning frameworks to problems in quantum mechanics. We also
show that the injectivity of the existing KME and the universality of RKHS
generalize to RKHM, which confirms that many useful features of the existing
KME carry over to our generalized KME. Finally, we investigate the empirical
performance of our methods on synthetic and real-world data.
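The classical scalar-valued KME that this paper generalizes embeds a sample as the empirical mean of kernel features, and the RKHS distance between two embeddings is the maximum mean discrepancy (MMD). A minimal NumPy sketch of this baseline, with an illustrative Gaussian kernel and made-up data:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix k(x_i, y_j) = exp(-||x_i - y_j||^2 / (2 sigma^2))
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    # Squared RKHS distance between the empirical mean embeddings of X and Y
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.0, 1.0, size=(200, 2))  # same distribution as X
Z = rng.normal(3.0, 1.0, size=(200, 2))  # shifted distribution
print(mmd2(X, Y))  # small: same underlying distribution
print(mmd2(X, Z))  # larger: different distributions
```

The paper's contribution replaces the scalar inner product here with a von-Neumann-algebra-valued one, so the embedding lives in an RKHM rather than an RKHS.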
Related papers
- Estimating molecular thermal averages with the quantum equation of motion and informationally complete measurements [0.0]
We use the Variational Quantum Eigensolver (VQE) to compute thermal averages of quantum systems.
A drawback of the quantum equation of motion (qEOM) approach is that it requires measuring the expectation values of a large number of observables on the ground state of the system.
In this work we focus on measurements through informationally complete positive operator-valued measures (IC-POVMs) to reduce the measurement overhead.
arXiv Detail & Related papers (2024-06-06T20:02:24Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Adaptive Log-Euclidean Metrics for SPD Matrix Learning [73.12655932115881]
We propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM).
Experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks.
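The Log-Euclidean Metric that ALEMs extend compares SPD matrices by the Frobenius distance of their matrix logarithms. A small NumPy sketch of the plain LEM (illustrative only, not the paper's adaptive variant):

```python
import numpy as np

def spd_log(A):
    # Matrix logarithm of a symmetric positive definite matrix via its
    # eigendecomposition: log(A) = V diag(log w) V^T.
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.T

def lem_distance(A, B):
    # Log-Euclidean distance: ||log(A) - log(B)||_F
    return np.linalg.norm(spd_log(A) - spd_log(B))
```

For example, `lem_distance(np.eye(2), np.e * np.eye(2))` equals sqrt(2), since log(I) = 0 and log(e*I) = I.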
arXiv Detail & Related papers (2023-03-26T18:31:52Z)
- Targeted Separation and Convergence with Kernel Discrepancies [61.973643031360254]
Kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or (ii) control weak convergence to P.
In this article we derive new sufficient and necessary conditions ensuring (i) and (ii).
For MMDs on separable metric spaces, we characterize the kernels that separate Bochner-embeddable measures and introduce simple conditions for separating all measures with unbounded kernels.
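The separation property can be seen numerically: a linear kernel embeds a measure only through its mean, so it cannot separate two distributions that share a mean, while a Gaussian (characteristic) kernel can. An illustrative sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(2000, 1))  # N(0, 1)
Y = rng.normal(0.0, 2.0, size=(2000, 1))  # N(0, 4): same mean, larger variance

def mmd2(K, X, Y):
    # Biased squared MMD estimate under kernel K
    return K(X, X).mean() + K(Y, Y).mean() - 2 * K(X, Y).mean()

def linear(A, B):
    return A @ B.T  # embeds a distribution only through its mean

def gauss(A, B, s=1.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * s * s))

print(mmd2(linear, X, Y))  # near zero: linear kernel cannot tell them apart
print(mmd2(gauss, X, Y))   # clearly positive: Gaussian kernel separates them
```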
arXiv Detail & Related papers (2022-09-26T16:41:16Z)
- Revisiting Memory Efficient Kernel Approximation: An Indefinite Learning Perspective [0.8594140167290097]
Matrix approximations are a key element in large-scale machine learning approaches.
We extend memory-efficient kernel approximation (MEKA) to apply not only to shift-invariant kernels but also to non-stationary kernels.
We present a Lanczos-based estimation of a spectrum shift to develop a stable positive semi-definite MEKA approximation.
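The spectrum-shift idea can be illustrated directly: estimate the most negative eigenvalue of an indefinite symmetric approximation and add it back on the diagonal to restore positive semi-definiteness. A dense-eigensolver sketch (the paper uses a Lanczos estimate of the extremal eigenvalue instead; names here are illustrative):

```python
import numpy as np

def psd_shift(K):
    # Shift an indefinite symmetric matrix K by its most negative
    # eigenvalue so that the result is positive semi-definite.
    # (Dense stand-in for a Lanczos-based extremal-eigenvalue estimate.)
    lam_min = np.linalg.eigvalsh(K).min()
    shift = max(0.0, -lam_min)
    return K + shift * np.eye(K.shape[0])

K = np.array([[1.0, 2.0], [2.0, 1.0]])  # eigenvalues 3 and -1: indefinite
Kp = psd_shift(K)                        # shifted by 1, now PSD
```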
arXiv Detail & Related papers (2021-12-18T10:01:34Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Reproducing kernel Hilbert C*-module and kernel mean embeddings [12.268585269921404]
We propose a novel data analysis framework with reproducing kernel Hilbert $C^*$-modules (RKHMs) and kernel mean embedding (KME) in RKHM.
We develop a branch of theory for RKHM to apply to data analysis, including the representer theorem and the injectivity and universality of the proposed KME.
arXiv Detail & Related papers (2021-01-27T14:02:18Z)
- Minimal informationally complete measurements for probability representation of quantum dynamics [0.0]
We suggest an approach for describing the dynamics of finite-dimensional quantum systems in terms of pseudostochastic maps acting on probability distributions.
A key advantage of the suggested approach is that minimal informationally complete positive operator-valued measures (MIC-POVMs) are easier to construct than their symmetric versions (SIC-POVMs).
We apply the MIC-POVM-based probability representation to the digital quantum computing model.
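A POVM assigns outcome probabilities through the Born rule p_i = Tr(E_i ρ). As a concrete example of an informationally complete POVM, the tetrahedral qubit SIC-POVM has d² = 4 elements (this is the symmetric version the summary contrasts with MIC-POVMs; the sketch below is illustrative):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Tetrahedral (SIC) POVM for a qubit: 4 = d^2 elements summing to the identity.
dirs = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
povm = [(I2 + a * sx + b * sy + c * sz) / 4 for a, b, c in dirs]

def povm_probs(rho, povm):
    # Born rule: p_i = Tr(E_i rho)
    return np.real([np.trace(E @ rho) for E in povm])

rho = np.array([[1, 0], [0, 0]], dtype=complex)  # the pure state |0><0|
p = povm_probs(rho, povm)  # a valid probability distribution over 4 outcomes
```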
arXiv Detail & Related papers (2020-06-22T13:19:23Z)
- Analysis via Orthonormal Systems in Reproducing Kernel Hilbert $C^*$-Modules and Applications [12.117553807794382]
We propose a novel data analysis framework with reproducing kernel Hilbert $C^*$-modules (RKHMs).
We show the theoretical validity of the construction of orthonormal systems in Hilbert $C^*$-modules and derive concrete procedures for orthonormalization in RKHMs.
We apply these results to generalize, within RKHMs, kernel principal component analysis and the analysis of dynamical systems with Perron-Frobenius operators.
arXiv Detail & Related papers (2020-03-02T10:01:14Z)
- Generalized Sliced Distances for Probability Distributions [47.543990188697734]
We introduce a broad family of probability metrics, coined Generalized Sliced Probability Metrics (GSPMs).
GSPMs are rooted in the generalized Radon transform and come with a unique geometric interpretation.
We consider GSPM-based gradient flows for generative modeling applications and show that, under mild assumptions, the gradient flow converges to the global optimum.
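The "sliced" construction that GSPMs generalize projects both sample sets onto random one-dimensional directions, where the Wasserstein distance reduces to sorting, and averages over slices. A Monte Carlo sketch with plain linear slices (equal sample sizes assumed; GSPMs replace the linear projection with a generalized Radon transform):

```python
import numpy as np

def sliced_wasserstein(X, Y, n_proj=100, seed=0):
    # Monte Carlo sliced 1-Wasserstein distance: average the 1D W1
    # distances of the samples projected onto random unit directions.
    # For equal-size 1D samples, W1 is the mean gap of the sorted values.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    total = 0.0
    for _ in range(n_proj):
        theta = rng.normal(size=d)
        theta /= np.linalg.norm(theta)
        total += np.mean(np.abs(np.sort(X @ theta) - np.sort(Y @ theta)))
    return total / n_proj
```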
arXiv Detail & Related papers (2020-02-28T04:18:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.