Finite sample approximations of exact and entropic Wasserstein distances
between covariance operators and Gaussian processes
- URL: http://arxiv.org/abs/2104.12368v1
- Date: Mon, 26 Apr 2021 06:57:14 GMT
- Authors: Minh Ha Quang
- Abstract summary: We show that the Sinkhorn divergence between two centered Gaussian processes can be consistently and efficiently estimated.
For a fixed regularization parameter, the convergence rates are dimension-independent and of the same order as those for the Hilbert-Schmidt distance.
If at least one of the RKHSs is finite-dimensional, we obtain a dimension-dependent sample complexity for the exact Wasserstein distance between the Gaussian processes.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work studies finite sample approximations of the exact and entropic
regularized Wasserstein distances between centered Gaussian processes and, more
generally, covariance operators of functional random processes. We first show
that these distances/divergences are fully represented by reproducing kernel
Hilbert space (RKHS) covariance and cross-covariance operators associated with
the corresponding covariance functions. Using this representation, we show that
the Sinkhorn divergence between two centered Gaussian processes can be
consistently and efficiently estimated from the divergence between their
corresponding normalized finite-dimensional covariance matrices, or
alternatively, their sample covariance operators. Consequently, this leads to a
consistent and efficient algorithm for estimating the Sinkhorn divergence from
finite samples generated by the two processes. For a fixed regularization
parameter, the convergence rates are dimension-independent and of the
same order as those for the Hilbert-Schmidt distance. If at least one of the
RKHSs is finite-dimensional, we obtain a dimension-dependent sample
complexity for the exact Wasserstein distance between the Gaussian processes.
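As a concrete point of reference, the exact 2-Wasserstein distance between two centered Gaussians N(0, A) and N(0, B) admits the closed form W_2^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}), i.e., the Bures distance between the covariance matrices. The sketch below evaluates this formula numerically and plugs in sample covariances; it is a minimal finite-dimensional illustration only, with arbitrary dimensions and eigenvalue decay, and does not reproduce the paper's RKHS-based construction or its normalized covariance matrices.

```python
import numpy as np
from scipy.linalg import sqrtm

def exact_wasserstein2_centered(A, B):
    """Exact 2-Wasserstein distance between N(0, A) and N(0, B).

    Closed form (Bures distance between covariance matrices):
        W_2^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})
    """
    sqrt_A = sqrtm(A)
    cross = sqrtm(sqrt_A @ B @ sqrt_A)
    w2_sq = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(np.real(w2_sq), 0.0))  # guard tiny negative round-off

# Plug-in estimation from finite samples (illustrative only): draw m samples
# from each centered Gaussian and compare true vs. sample covariances.
rng = np.random.default_rng(0)
d, m = 50, 2000
A = np.diag(1.0 / np.arange(1, d + 1) ** 2)    # eigenvalue decay mimicking
B = np.diag(1.0 / np.arange(1, d + 1) ** 1.5)  # trace-class covariance operators
X = rng.multivariate_normal(np.zeros(d), A, size=m)
Y = rng.multivariate_normal(np.zeros(d), B, size=m)
A_hat = X.T @ X / m  # sample covariance (processes are centered)
B_hat = Y.T @ Y / m
print("population distance:", exact_wasserstein2_centered(A, B))
print("plug-in estimate:   ", exact_wasserstein2_centered(A_hat, B_hat))
```

Increasing d in this toy setup gives a feel for the dimension-dependent sample complexity of the exact distance noted in the abstract.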
Related papers
- In-and-Out: Algorithmic Diffusion for Sampling Convex Bodies [7.70133333709347]
We present a new random walk for uniformly sampling high-dimensional convex bodies.
It achieves state-of-the-art runtime complexity with stronger guarantees on the output.
arXiv Detail & Related papers (2024-05-02T16:15:46Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Kullback-Leibler and Renyi divergences in reproducing kernel Hilbert
space and Gaussian process settings [0.0]
We present formulations for regularized Kullback-Leibler and Rényi divergences via the Alpha Log-Determinant (Log-Det) divergences.
For characteristic kernels, the first setting leads to divergences between arbitrary Borel probability measures on a complete, separable metric space.
We show that the Alpha Log-Det divergences are continuous in the Hilbert-Schmidt norm, which enables us to apply laws of large numbers for Hilbert space-valued random variables.
arXiv Detail & Related papers (2022-07-18T06:40:46Z) - A Stochastic Newton Algorithm for Distributed Convex Optimization [62.20732134991661]
We analyze a Newton algorithm for homogeneous distributed convex optimization, where each machine can calculate gradients of the same population objective.
We show that our method can reduce the number and frequency of required communication rounds compared to existing methods, without hurting performance.
arXiv Detail & Related papers (2021-10-07T17:51:10Z) - Estimation of Riemannian distances between covariance operators and
Gaussian processes [0.7360807642941712]
We study two distances between infinite-dimensional positive definite Hilbert-Schmidt operators.
Results show that both distances converge in the Hilbert-Schmidt norm.
arXiv Detail & Related papers (2021-08-26T09:57:47Z) - Scalable Variational Gaussian Processes via Harmonic Kernel
Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high-fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - The Connection between Discrete- and Continuous-Time Descriptions of
Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of 'invariance under coarse-graining'.
This result explains why combining differencing schemes for derivative reconstruction with local-in-time inference approaches does not work for time series analysis of second- or higher-order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z) - Convergence and finite sample approximations of entropic regularized
Wasserstein distances in Gaussian and RKHS settings [0.0]
We study the convergence and finite sample approximations of entropic regularized Wasserstein distances in the Hilbert space setting.
For Gaussian measures on an infinite-dimensional Hilbert space, convergence in the 2-Sinkhorn divergence is weaker than convergence in the exact 2-Wasserstein distance.
arXiv Detail & Related papers (2021-01-05T09:46:58Z) - Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z) - Faster Wasserstein Distance Estimation with the Sinkhorn Divergence [0.0]
The squared Wasserstein distance is a common quantity for comparing probability distributions in a non-parametric setting.
In this work, we propose instead to estimate it with the Sinkhorn divergence.
We show that, for smooth densities, this estimator has comparable sample complexity but allows higher regularization levels (a minimal sample-based sketch of this divergence follows the list below).
arXiv Detail & Related papers (2020-06-15T06:58:16Z) - The Convergence Indicator: Improved and completely characterized
parameter bounds for actual convergence of Particle Swarm Optimization [68.8204255655161]
We introduce a new convergence indicator that can be used to determine whether the particles will eventually converge to a single point or diverge.
Using this convergence indicator we provide the actual bounds completely characterizing parameter regions that lead to a converging swarm.
arXiv Detail & Related papers (2020-06-06T19:08:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.