Independent Gaussian Distributions Minimize the Kullback-Leibler (KL)
Divergence from Independent Gaussian Distributions
- URL: http://arxiv.org/abs/2011.02560v2
- Date: Thu, 3 Dec 2020 15:54:17 GMT
- Title: Independent Gaussian Distributions Minimize the Kullback-Leibler (KL)
Divergence from Independent Gaussian Distributions
- Authors: Song Fang and Quanyan Zhu
- Abstract summary: This note is on a property of the Kullback-Leibler (KL) divergence.
The primary purpose of this note is for the referencing of papers that need to make use of this property entirely or partially.
- Score: 23.249999313567624
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This short note is on a property of the Kullback-Leibler (KL) divergence
which indicates that independent Gaussian distributions minimize the KL
divergence from given independent Gaussian distributions. The primary purpose
of this note is for the referencing of papers that need to make use of this
property entirely or partially.
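The machinery behind this property is standard: the KL divergence between two univariate Gaussians has a closed form, and for products of independent distributions the KL divergence factorizes into a sum over coordinates. A minimal numerical sketch (the marginal parameters below are arbitrary illustrations, not taken from the note):

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    # Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) ) for univariate Gaussians.
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# For products of independent distributions, KL factorizes coordinatewise:
# KL(prod_i p_i || prod_i q_i) = sum_i KL(p_i || q_i).
p = [(0.0, 1.0), (1.0, 2.0)]   # hypothetical (mean, std) marginals of one distribution
q = [(0.5, 1.5), (0.0, 1.0)]   # hypothetical marginals of the other
total = sum(kl_gauss(m1, s1, m2, s2) for (m1, s1), (m2, s2) in zip(p, q))
```

The factorization is what makes coordinatewise (independent) minimizers natural candidates when the reference distribution is itself a product of Gaussians.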
Related papers
- Statistical and Geometrical properties of regularized Kernel Kullback-Leibler divergence [7.273481485032721]
We study the statistical and geometrical properties of the kernel Kullback-Leibler (KKL) divergence with kernel covariance operators introduced by Bach [2022].
Unlike the classical Kullback-Leibler (KL) divergence, which involves density ratios, the KKL compares probability distributions through covariance operators (embeddings) in a reproducing kernel Hilbert space (RKHS).
This novel divergence hence shares parallel but different aspects with both the standard Kullback-Leibler between probability distributions and kernel embeddings metrics such as the maximum mean discrepancy.
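As a point of comparison, the maximum mean discrepancy mentioned above admits a simple plug-in estimator from samples. A minimal sketch with a Gaussian (RBF) kernel on scalars; the bandwidth `gamma` is an arbitrary choice, not from the paper:

```python
import math

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel on scalars.
    return math.exp(-gamma * (x - y) ** 2)

def mmd_sq(xs, ys, gamma=1.0):
    # Biased (V-statistic) estimate of squared MMD:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y).
    kxx = sum(rbf(a, b, gamma) for a in xs for b in xs) / len(xs) ** 2
    kyy = sum(rbf(a, b, gamma) for a in ys for b in ys) / len(ys) ** 2
    kxy = sum(rbf(a, b, gamma) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2 * kxy
```

Like the KKL, this compares distributions through kernel embeddings rather than density ratios, so it is well defined even when the two distributions have disjoint supports.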
arXiv Detail & Related papers (2024-08-29T14:01:30Z) - A Distributional Analogue to the Successor Representation [54.99439648059807]
This paper contributes a new approach for distributional reinforcement learning.
It elucidates a clean separation of transition structure and reward in the learning process.
As an illustration, we show that it enables zero-shot risk-sensitive policy evaluation.
arXiv Detail & Related papers (2024-02-13T15:35:24Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
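For context, plain AIS (not the Constant Rate variant proposed in the paper) can be sketched in a few lines for a 1D toy problem; the target, geometric path, and step counts below are illustrative assumptions:

```python
import math
import random

def log_f(x):
    # Unnormalized target: exp(-(x - 3)^2 / 2); its true normalizer is sqrt(2*pi).
    return -(x - 3.0) ** 2 / 2.0

def log_p0(x):
    # Normalized initial distribution: standard normal N(0, 1).
    return -x ** 2 / 2.0 - 0.5 * math.log(2.0 * math.pi)

def ais_estimate(n_particles=1000, n_temps=50, seed=0):
    rng = random.Random(seed)
    betas = [k / n_temps for k in range(n_temps + 1)]
    total = 0.0
    for _ in range(n_particles):
        x = rng.gauss(0.0, 1.0)  # exact sample from p0
        log_w = 0.0
        for b_prev, b in zip(betas, betas[1:]):
            # Accumulate the log importance weight along the geometric path
            # p_b(x) proportional to p0(x)^(1 - b) * f(x)^b.
            log_w += (b - b_prev) * (log_f(x) - log_p0(x))
            # One Metropolis step targeting the current intermediate p_b.
            prop = x + rng.gauss(0.0, 1.0)
            log_ratio = ((1 - b) * (log_p0(prop) - log_p0(x))
                         + b * (log_f(prop) - log_f(x)))
            if math.log(rng.random()) < log_ratio:
                x = prop
        total += math.exp(log_w)
    return total / n_particles  # estimates Z = integral of f, about sqrt(2*pi)
```

The Constant Rate idea in the paper concerns how the temperature schedule (`betas` above, taken uniform here) is chosen; this sketch uses the simplest fixed schedule.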
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over distribution properties such as parameters, symmetry, and modality yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
arXiv Detail & Related papers (2022-04-20T21:25:21Z) - Robust Estimation for Nonparametric Families via Generative Adversarial
Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends these techniques to robust mean estimation, second moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z) - Non-Gaussian Component Analysis via Lattice Basis Reduction [56.98280399449707]
Non-Gaussian Component Analysis (NGCA) is a distribution learning problem.
We provide an efficient algorithm for NGCA in the regime that $A$ is discrete or nearly discrete.
arXiv Detail & Related papers (2021-12-16T18:38:02Z) - A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z) - KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint
Support [27.165565512841656]
We study the gradient flow for a relaxed approximation to the Kullback-Leibler divergence between a moving source and a fixed target distribution.
This approximation, termed the KALE (KL approximate lower-bound estimator), solves a regularized version of the Fenchel dual problem defining the KL over a restricted class of functions.
arXiv Detail & Related papers (2021-06-16T16:37:43Z) - $\alpha$-Geodesical Skew Divergence [5.3556221126231085]
The asymmetric skew divergence smooths one of the distributions by mixing it, to a degree determined by the parameter $\lambda$, with the other distribution.
This divergence approximates the KL divergence without requiring the target distribution to be absolutely continuous with respect to the source distribution.
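The basic (non-geodesical) skew divergence underlying this construction is easy to sketch on discrete distributions; `lam` below plays the role of the mixing parameter $\lambda$:

```python
import math

def kl(p, q):
    # Discrete KL divergence; terms with p_i == 0 contribute nothing.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def skew_divergence(p, q, lam=0.99):
    # Skew divergence: mix q with p before comparing. For lam < 1 the
    # mixture has mass wherever p does, so the divergence stays finite
    # even where q assigns zero probability.
    mix = [lam * qi + (1 - lam) * pi for pi, qi in zip(p, q)]
    return kl(p, mix)

p = [0.5, 0.5, 0.0]
q = [0.5, 0.0, 0.5]  # q is zero where p is positive: plain kl(p, q) blows up
```

Here `kl(p, q)` is undefined (division by zero), while `skew_divergence(p, q)` is finite, illustrating why the mixing step removes the absolute-continuity requirement.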
arXiv Detail & Related papers (2021-03-31T13:27:58Z) - Independent Elliptical Distributions Minimize Their $\mathcal{W}_2$
Wasserstein Distance from Independent Elliptical Distributions with the Same
Density Generator [30.590501280252948]
This note is on a property of the $\mathcal{W}_2$ Wasserstein distance.
It indicates that independent elliptical distributions minimize their $\mathcal{W}_2$ Wasserstein distance from given independent elliptical distributions with the same density generators.
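For the univariate Gaussian special case of elliptical distributions, the $\mathcal{W}_2$ distance has a simple closed form, and squared $\mathcal{W}_2$ tensorizes over independent coordinates. A small sketch under those standard assumptions:

```python
def w2_gauss_sq(mu1, s1, mu2, s2):
    # Squared 2-Wasserstein distance between univariate Gaussians:
    # W2^2( N(mu1, s1^2), N(mu2, s2^2) ) = (mu1 - mu2)^2 + (s1 - s2)^2.
    return (mu1 - mu2) ** 2 + (s1 - s2) ** 2

# For products of independent coordinates the squared distance adds up:
# W2^2(prod_i p_i, prod_i q_i) = sum_i W2^2(p_i, q_i),
# mirroring the coordinatewise structure exploited for the KL property above.
total = w2_gauss_sq(0.0, 1.0, 0.5, 1.5) + w2_gauss_sq(1.0, 2.0, 0.0, 1.0)
```

The parallel to the KL note is the same additivity over independent coordinates, here for $\mathcal{W}_2^2$ instead of KL.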
arXiv Detail & Related papers (2020-12-07T15:52:02Z) - Kullback-Leibler divergence between quantum distributions, and its
upper-bound [1.2183405753834562]
This work presents an upper bound on the value that the Kullback-Leibler (KL) divergence can reach for a class of probability distributions called quantum distributions (QDs).
An upper bound on this entropic divergence is shown to be attainable under the condition that the compared distributions are quantum distributions over the same quantum value, which makes them comparable.
arXiv Detail & Related papers (2020-08-13T14:42:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.