Kullback-Leibler and Renyi divergences in reproducing kernel Hilbert
space and Gaussian process settings
- URL: http://arxiv.org/abs/2207.08406v1
- Date: Mon, 18 Jul 2022 06:40:46 GMT
- Title: Kullback-Leibler and Renyi divergences in reproducing kernel Hilbert
space and Gaussian process settings
- Authors: Minh Ha Quang
- Abstract summary: We present formulations for regularized Kullback-Leibler and Rényi divergences via the Alpha Log-Determinant (Log-Det) divergences.
For characteristic kernels, the first setting leads to divergences between arbitrary Borel probability measures on a complete, separable metric space.
We show that the Alpha Log-Det divergences are continuous in the Hilbert-Schmidt norm, which enables us to apply laws of large numbers for Hilbert space-valued random variables.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we present formulations for regularized Kullback-Leibler and
R\'enyi divergences via the Alpha Log-Determinant (Log-Det) divergences between
positive Hilbert-Schmidt operators on Hilbert spaces in two different settings,
namely (i) covariance operators and Gaussian measures defined on reproducing
kernel Hilbert spaces (RKHS); and (ii) Gaussian processes with squared
integrable sample paths. For characteristic kernels, the first setting leads to
divergences between arbitrary Borel probability measures on a complete,
separable metric space. We show that the Alpha Log-Det divergences are
continuous in the Hilbert-Schmidt norm, which enables us to apply laws of large
numbers for Hilbert space-valued random variables. As a consequence of this, we
show that, in both settings, the infinite-dimensional divergences can be
consistently and efficiently estimated from their finite-dimensional versions,
using finite-dimensional Gram matrices/Gaussian measures and finite sample
data, with {\it dimension-independent} sample complexities in all cases. RKHS
methodology plays a central role in the theoretical analysis in both settings.
The mathematical formulation is illustrated by numerical experiments.
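As a toy illustration of the finite-dimensional quantities the paper's estimators reduce to (not the paper's regularized Log-Det estimator itself), the Kullback-Leibler divergence between two finite-dimensional Gaussian measures has the standard closed form, sketched below with assumed helper name `gaussian_kl`:

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL(N(mu0, cov0) || N(mu1, cov1)) between d-dimensional Gaussians:
    0.5 * [tr(S1^{-1} S0) + (m1-m0)^T S1^{-1} (m1-m0) - d + ln det S1 - ln det S0]."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    # slogdet is numerically safer than log(det(...)) for large or ill-scaled matrices
    _, logdet0 = np.linalg.slogdet(cov0)
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - d
                  + logdet1 - logdet0)
```

In the infinite-dimensional settings of the paper, plain determinants and inverses are ill-defined, which is precisely why the regularized Log-Det formulation is needed; this sketch only shows the finite-dimensional object being approximated.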
Related papers
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space in which to model functions learned by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z)
- Conditioning of Banach Space Valued Gaussian Random Variables: An Approximation Approach Based on Martingales [8.81121308982678]
We investigate the conditional distributions of two Banach space valued, jointly Gaussian random variables.
We show that their means and covariances are determined by a general finite dimensional approximation scheme based upon a martingale approach.
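In finite dimensions, the conditional means and covariances that such approximation schemes converge to are given by the standard Gaussian conditioning formulas. A minimal sketch (illustrative only; the helper name `condition_gaussian` is assumed, and this is not the paper's martingale scheme):

```python
import numpy as np

def condition_gaussian(mu, cov, idx_obs, x_obs):
    """Condition a joint Gaussian N(mu, cov) on observing coordinates idx_obs = x_obs.

    Returns mean and covariance of the remaining coordinates:
      mu_c  = m1 + S12 S22^{-1} (x_obs - m2)
      cov_c = S11 - S12 S22^{-1} S21
    """
    n = len(mu)
    idx_obs = np.asarray(idx_obs)
    idx_free = np.setdiff1d(np.arange(n), idx_obs)
    S11 = cov[np.ix_(idx_free, idx_free)]
    S12 = cov[np.ix_(idx_free, idx_obs)]
    S22 = cov[np.ix_(idx_obs, idx_obs)]
    gain = S12 @ np.linalg.inv(S22)   # "regression" operator S12 S22^{-1}
    mu_c = mu[idx_free] + gain @ (x_obs - mu[idx_obs])
    cov_c = S11 - gain @ S12.T
    return mu_c, cov_c
```

For a bivariate Gaussian with unit variances and correlation 0.5, observing the second coordinate at 1 gives conditional mean 0.5 and conditional variance 0.75.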
arXiv Detail & Related papers (2024-04-04T13:57:44Z)
- Continuous percolation in a Hilbert space for a large system of qubits [58.720142291102135]
The percolation transition is defined through the appearance of the infinite cluster.
We show that the exponentially increasing dimensionality of the Hilbert space makes its covering by finite-size hyperspheres inefficient.
Our approach to the percolation transition in compact metric spaces may prove useful for its rigorous treatment in other contexts.
arXiv Detail & Related papers (2022-10-15T13:53:21Z)
- Page curves and typical entanglement in linear optics [0.0]
We study entanglement within a set of squeezed modes that have been evolved by a random linear optical unitary.
We prove various results on the typicality of entanglement as measured by the Rényi-2 entropy.
Our main results make use of a symmetry property obeyed by the average and the variance of the entropy that dramatically simplifies the averaging over unitaries.
arXiv Detail & Related papers (2022-09-14T18:00:03Z)
- Experimental Design for Linear Functionals in Reproducing Kernel Hilbert Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z)
- Estimation of Riemannian distances between covariance operators and Gaussian processes [0.7360807642941712]
We study two distances between infinite-dimensional positive definite Hilbert-Schmidt operators.
Results show that both distances converge in the Hilbert-Schmidt norm.
arXiv Detail & Related papers (2021-08-26T09:57:47Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
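The empirical kernel mean embedding behind such methods is easy to sketch: the distance between two embeddings is the (biased) squared maximum mean discrepancy, computable entirely from Gram matrices. A minimal illustration with a Gaussian RBF kernel, which is characteristic (function names here are assumptions, not this paper's API):

```python
import numpy as np

def rbf_gram(X, Y, gamma=1.0):
    """Gaussian RBF Gram matrix k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=1.0):
    """Biased empirical squared MMD between samples X and Y, i.e. the squared
    RKHS distance between their empirical kernel mean embeddings."""
    return (rbf_gram(X, X, gamma).mean()
            - 2 * rbf_gram(X, Y, gamma).mean()
            + rbf_gram(Y, Y, gamma).mean())
```

Because the RBF kernel is characteristic, the population MMD vanishes only when the two distributions coincide, which is what makes mean embeddings usable for optimizing over distributions.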
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- The Connection between Discrete- and Continuous-Time Descriptions of Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of invariance under coarse-graining.
This result explains why combining differencing schemes for derivatives reconstruction and local-in-time inference approaches does not work for time series analysis of second or higher order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z)
- Convergence and finite sample approximations of entropic regularized Wasserstein distances in Gaussian and RKHS settings [0.0]
We study the convergence and finite sample approximations of entropic regularized Wasserstein distances in the Hilbert space setting.
For Gaussian measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn divergence is weaker than convergence in the exact 2-Wasserstein distance.
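The entropic regularization behind the Sinkhorn divergence is simplest to see on discrete measures, where it reduces to alternating scaling iterations. A minimal sketch (this is the generic discrete Sinkhorn algorithm, not the Gaussian closed forms studied in the paper; the function name is an assumption):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=500):
    """Entropic-regularized optimal transport between discrete measures a, b
    with cost matrix C. Alternating scaling on K = exp(-C/eps); returns the
    transport plan P and the transport cost <P, C>."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)   # match column marginals
        u = a / (K @ v)     # match row marginals
    P = u[:, None] * K * v[None, :]
    return P, (P * C).sum()
```

As eps decreases, the regularized cost approaches the exact optimal transport cost, mirroring the convergence questions the paper studies in the infinite-dimensional Gaussian setting.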
arXiv Detail & Related papers (2021-01-05T09:46:58Z)
- Entropic regularization of Wasserstein distance between infinite-dimensional Gaussian measures and Gaussian processes [0.0]
This work studies the entropic regularization formulation of the 2-Wasserstein distance on an infinite-dimensional Hilbert space.
In the infinite-dimensional setting, both the entropic 2-Wasserstein distance and Sinkhorn divergence are Fréchet differentiable, in contrast to the exact 2-Wasserstein distance.
arXiv Detail & Related papers (2020-11-15T10:03:12Z)
- Hilbert-space geometry of random-matrix eigenstates [55.41644538483948]
We discuss the Hilbert-space geometry of eigenstates of parameter-dependent random-matrix ensembles.
Our results give the exact joint distribution function of the Fubini-Study metric and the Berry curvature.
We compare our results to numerical simulations of random-matrix ensembles as well as electrons in a random magnetic field.
arXiv Detail & Related papers (2020-11-06T19:00:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.