Kullback-Leibler divergence between quantum distributions, and its
upper-bound
- URL: http://arxiv.org/abs/2008.05932v3
- Date: Thu, 10 Dec 2020 12:39:33 GMT
- Title: Kullback-Leibler divergence between quantum distributions, and its
upper-bound
- Authors: Vincenzo Bonnici
- Abstract summary: This work presents an upper bound on the value that the Kullback-Leibler (KL) divergence can reach for a class of probability distributions called quantum distributions (QD).
Retrieving an upper bound for the entropic divergence is shown to be possible under the condition that the compared distributions are quantum distributions over the same quantum value, which makes them comparable.
- Score: 1.2183405753834562
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work presents an upper bound on the value that the Kullback-Leibler
(KL) divergence can reach for a class of probability distributions called quantum
distributions (QD). The aim is to find a distribution $U$ which maximizes the
KL divergence from a given distribution $P$ under the assumption that $P$ and
$U$ have been generated by distributing a given discrete quantity, a quantum.
Quantum distributions naturally represent a wide range of probability
distributions that are used in practical applications. Moreover, such a class
of distributions can be obtained as an approximation of any probability
distribution. Retrieving an upper bound for the entropic divergence is
shown to be possible under the condition that the compared distributions
are quantum distributions over the same quantum value, which makes them
comparable. Entropic divergence thus acquires a more powerful meaning when it
is applied to comparable distributions. This aspect should be taken into
account in future developments of divergences. The theoretical findings are
used for proposing a notion of normalized KL divergence that is empirically
shown to behave differently from already known measures.
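The distributions the abstract describes arise from distributing a discrete quantity (a quantum) over a set of outcomes; the KL divergence is then computed between the resulting normalized distributions. A minimal sketch of that setup (the function names and the example counts are illustrative, not taken from the paper, and this computes the standard KL divergence, not the paper's specific upper bound):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.

    Assumes q[i] > 0 wherever p[i] > 0 (absolute continuity);
    terms with p[i] == 0 contribute zero by convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def quantum_distribution(counts):
    """Normalize integer quanta counts into a probability distribution."""
    total = sum(counts)
    return [c / total for c in counts]

# Two distributions built by distributing the same 10 quanta over 4 bins,
# so both are quantum distributions over the same quantum value.
p = quantum_distribution([4, 3, 2, 1])
q = quantum_distribution([1, 2, 3, 4])
d = kl_divergence(p, q)  # ≈ 0.4564
```

Because both distributions are generated from the same total quantity, they are comparable in the sense the abstract describes, which is the precondition for the paper's bound to apply.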
Related papers
- Efficient quantum loading of probability distributions through Feynman
propagators [2.56711111236449]
We present quantum algorithms for the loading of probability distributions using Hamiltonian simulation for one-dimensional Hamiltonians of the form $\hat{H} = \Delta + V(x)\,\mathbb{I}$.
We consider the potentials $V(x)$ for which the Feynman propagator is known to have an analytically closed form and utilize these Hamiltonians to load probability distributions into quantum states.
arXiv Detail & Related papers (2023-11-22T21:41:58Z) - Computing Marginal and Conditional Divergences between Decomposable
Models with Applications [7.89568731669979]
We propose an approach to compute the exact $\alpha$-$\beta$ divergence between any marginal or conditional distribution of two decomposable models.
We show how our method can be used to analyze distributional changes by first applying it to a benchmark image dataset.
Based on our framework, we propose a novel way to quantify the error in contemporary superconducting quantum computers.
arXiv Detail & Related papers (2023-10-13T14:17:25Z) - Normal quantum channels and Markovian correlated two-qubit quantum
errors [77.34726150561087]
We study general "normally" distributed random unitary transformations.
On the one hand, a normal distribution induces a unital quantum channel.
On the other hand, the diffusive random walk defines a unital quantum process.
arXiv Detail & Related papers (2023-07-25T15:33:28Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Entangled probability distributions [1.2891210250935143]
The concept of an entangled probability distribution of several random variables is introduced.
These probability distributions describe multimode quantum states in the probability representation of quantum mechanics.
An example of an entangled probability distribution is considered.
arXiv Detail & Related papers (2023-02-25T11:43:21Z) - Score-Based Diffusion meets Annealed Importance Sampling [89.92133671626327]
Annealed Importance Sampling remains one of the most effective methods for marginal likelihood estimation.
We leverage recent progress in score-based generative modeling to approximate the optimal extended target distribution for AIS proposals.
arXiv Detail & Related papers (2022-08-16T12:13:29Z) - Robust Learning of Optimal Auctions [84.13356290199603]
We study the problem of learning revenue-optimal multi-bidder auctions from samples when the samples of bidders' valuations can be adversarially corrupted or drawn from distributions that are adversarially perturbed.
We propose new algorithms that can learn a mechanism whose revenue is nearly optimal simultaneously for all "true distributions" that are $\alpha$-close to the original distribution in Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2021-07-13T17:37:21Z) - Parametrization invariant interpretation of priors and posteriors [0.0]
We move away from the idea that "a prior distribution establishes a probability distribution over the parameters of our model" to the idea that "a prior distribution establishes a probability distribution over probability distributions"
Under this mindset, any distribution over probability distributions should be "intrinsic", that is, invariant to the specific parametrization which is selected for the manifold.
arXiv Detail & Related papers (2021-05-18T06:45:05Z) - $\alpha$-Geodesical Skew Divergence [5.3556221126231085]
The asymmetric skew divergence smooths one of the distributions by mixing it, to a degree determined by the parameter $\lambda$, with the other distribution.
Such divergence is an approximation of the KL divergence that does not require the target distribution to be absolutely continuous with respect to the source distribution.
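The mixing step described above can be sketched directly: the second argument of the KL divergence is replaced by a $\lambda$-mixture of the two distributions, so the mixture has support wherever the first distribution does and absolute continuity of the target is no longer required. A minimal sketch, assuming the standard skew-divergence definition $D_\lambda(P \| Q) = \mathrm{KL}(P \,\|\, \lambda Q + (1-\lambda) P)$ (the example vectors are illustrative):

```python
import math

def kl_divergence(p, q):
    """Standard discrete KL divergence; zero-probability terms of P are skipped."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def skew_divergence(p, q, lam):
    """KL from P to the lambda-mixture of Q and P.

    For lam < 1 the mixture keeps mass wherever P has mass,
    so the result stays finite even when Q lacks support.
    """
    mix = [lam * qi + (1 - lam) * pi for pi, qi in zip(p, q)]
    return kl_divergence(p, mix)

p = [0.7, 0.3, 0.0]
q = [0.0, 0.5, 0.5]           # KL(P || Q) itself would be infinite here
s = skew_divergence(p, q, 0.9)  # finite, ≈ 1.471
```

As $\lambda \to 1$ the skew divergence approaches the KL divergence, which is the sense in which it serves as an approximation.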
arXiv Detail & Related papers (2021-03-31T13:27:58Z) - Linear Optimal Transport Embedding: Provable Wasserstein classification
for certain rigid transformations and perturbations [79.23797234241471]
Discriminating between distributions is an important problem in a number of scientific fields.
The Linear Optimal Transportation (LOT) embedding maps the space of distributions into an $L^2$-space.
We demonstrate the benefits of LOT on a number of distribution classification problems.
arXiv Detail & Related papers (2020-08-20T19:09:33Z) - Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO) for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.