Rényi divergence inequalities via interpolation, with applications to
generalised entropic uncertainty relations
- URL: http://arxiv.org/abs/2106.10415v1
- Date: Sat, 19 Jun 2021 04:06:23 GMT
- Title: Rényi divergence inequalities via interpolation, with applications to
generalised entropic uncertainty relations
- Authors: Alexander McKinlay
- Abstract summary: We investigate quantum Rényi entropic quantities, specifically those derived from the 'sandwiched' divergence.
We present Rényi mutual information decomposition rules, a new approach to the Rényi conditional entropy tripartite chain rules and a more general bipartite comparison.
- Score: 91.3755431537592
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate quantum Rényi entropic quantities, specifically those
derived from the 'sandwiched' divergence. This divergence is one of several
proposed Rényi generalisations of the quantum relative entropy. We may define
Rényi generalisations of the quantum conditional entropy and mutual
information in terms of this divergence, from which they inherit many desirable
properties. However, these quantities lack some of the convenient structure of
their Shannon and von Neumann counterparts. We attempt to bridge this gap by
establishing divergence inequalities for valid combinations of Rényi order
which replicate the chain and decomposition rules of Shannon and von Neumann
entropies. Although weaker in general, these inequalities recover equivalence
when the Rényi parameters tend to one.
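For reference, a minimal sketch of the standard definitions (using common notation from the literature, which may differ from the paper's, and noting that several variants of the mutual information appear in the literature): the sandwiched Rényi divergence of order $\alpha$ is
$$\widetilde{D}_\alpha(\rho\,\|\,\sigma) \;=\; \frac{1}{\alpha-1}\,\log \operatorname{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\alpha}\right],$$
from which one typically defines
$$\widetilde{H}_\alpha(A|B)_\rho \;=\; -\min_{\sigma_B}\widetilde{D}_\alpha\!\left(\rho_{AB}\,\|\,\mathbb{1}_A\otimes\sigma_B\right), \qquad \widetilde{I}_\alpha(A{:}B)_\rho \;=\; \min_{\sigma_B}\widetilde{D}_\alpha\!\left(\rho_{AB}\,\|\,\rho_A\otimes\sigma_B\right),$$
with the divergence and both derived quantities recovering their von Neumann counterparts in the limit $\alpha \to 1$.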
To this end we present Rényi mutual information decomposition rules, a new
approach to the Rényi conditional entropy tripartite chain rules and a more
general bipartite comparison. The derivation of these results relies on a novel
complex interpolation approach for general spaces of linear operators.
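For orientation, the von Neumann identities these inequalities are modelled on are
$$H(AB|C) = H(A|BC) + H(B|C), \qquad I(A{:}BC) = I(A{:}B) + I(A{:}C|B);$$
the Rényi analogues hold only as inequalities between quantities of suitably related orders, collapsing back to these equalities as all orders tend to one.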
These new comparisons allow us to employ techniques that until now were only
available for Shannon and von Neumann entropies. We can therefore directly
apply them to the derivation of Rényi entropic uncertainty relations.
Accordingly, we establish a family of Rényi information exclusion relations
and provide further generalisations and improvements to this and other known
relations, including the Rényi bipartite uncertainty relations.
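As an illustration of the type of relation being generalised (standard statements, not the specific forms derived in this paper): for two measurements with maximal overlap $c = \max_{j,k}|\langle x_j|z_k\rangle|^2$, the Maassen-Uffink uncertainty relation and its Rényi-order form read
$$H(X) + H(Z) \;\ge\; \log\frac{1}{c}, \qquad H_\alpha(X) + H_\beta(Z) \;\ge\; \log\frac{1}{c} \quad \text{for } \tfrac{1}{\alpha}+\tfrac{1}{\beta}=2,$$
while information exclusion relations instead place upper bounds on sums of mutual informations such as $I(X{:}B) + I(Z{:}B)$.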
Related papers
- Nonparametric Partial Disentanglement via Mechanism Sparsity: Sparse
Actions, Interventions and Sparse Temporal Dependencies [58.179981892921056]
This work introduces a novel principle for disentanglement we call mechanism sparsity regularization.
We propose a representation learning method that induces disentanglement by simultaneously learning the latent factors.
We show that the latent factors can be recovered by regularizing the learned causal graph to be sparse.
arXiv Detail & Related papers (2024-01-10T02:38:21Z)
- Quantum Rényi and $f$-divergences from integral representations [11.74020933567308]
Smooth Csiszár $f$-divergences can be expressed as integrals over so-called hockey stick divergences.
We find that the Rényi divergences defined via our new quantum $f$-divergences are not additive in general.
We derive various inequalities, including new reverse Pinsker inequalities with applications in differential privacy.
arXiv Detail & Related papers (2023-06-21T15:39:38Z)
- An entropic uncertainty principle for mixed states [0.0]
We provide a family of generalizations of the entropic uncertainty principle.
Results can be used to certify entanglement between trusted parties, or to bound the entanglement of a system with an untrusted environment.
arXiv Detail & Related papers (2023-03-20T18:31:53Z) - On the Importance of Gradient Norm in PAC-Bayesian Bounds [92.82627080794491]
We propose a new generalization bound that exploits the contractivity of the log-Sobolev inequalities.
We empirically analyze the effect of this new loss-gradient norm term on different neural architectures.
arXiv Detail & Related papers (2022-10-12T12:49:20Z)
- Function-space regularized Rényi divergences [6.221019624345409]
We propose a new family of regularized Rényi divergences parametrized by a variational function space.
We prove several properties of these new divergences, showing that they interpolate between the classical Rényi divergences and IPMs.
We show that the proposed regularized Rényi divergences inherit features from IPMs such as the ability to compare distributions that are not absolutely continuous.
arXiv Detail & Related papers (2022-10-10T19:18:04Z)
- On the Kullback-Leibler divergence between pairwise isotropic Gaussian-Markov random fields [93.35534658875731]
We derive expressions for the Kullback-Leibler divergence between two pairwise isotropic Gaussian-Markov random fields.
The proposed equation allows the development of novel similarity measures in image processing and machine learning applications.
arXiv Detail & Related papers (2022-03-24T16:37:24Z)
- Interpolation can hurt robust generalization even when there is no noise [76.3492338989419]
We show that avoiding interpolation through ridge regularization can significantly improve robust generalization even in the absence of noise.
We prove this phenomenon for the robust risk of both linear regression and classification and hence provide the first theoretical result on robust overfitting.
arXiv Detail & Related papers (2021-08-05T23:04:15Z)
- An Online Learning Approach to Interpolation and Extrapolation in Domain Generalization [53.592597682854944]
We recast generalization over sub-groups as an online game between a player minimizing risk and an adversary presenting new test distributions.
We show that ERM is provably minimax-optimal for both tasks.
arXiv Detail & Related papers (2021-02-25T19:06:48Z)
- Variational approach to relative entropies (with application to QFT) [0.0]
We define a new divergence of von Neumann algebras using a variational expression that is similar in nature to Kosaki's formula for the relative entropy.
Our divergence satisfies the usual desirable properties, upper bounds the sandwiched Rényi entropy and reduces to the fidelity in a limit.
arXiv Detail & Related papers (2020-09-10T17:41:28Z)
- Entropy and relative entropy from information-theoretic principles [24.74754293747645]
We find that every relative entropy must lie between the Rényi divergences of order $0$ and $\infty$.
Our main result is a one-to-one correspondence between entropies and relative entropies.
arXiv Detail & Related papers (2020-06-19T14:50:44Z)