Entropy and relative entropy from information-theoretic principles
- URL: http://arxiv.org/abs/2006.11164v2
- Date: Wed, 5 May 2021 16:20:21 GMT
- Title: Entropy and relative entropy from information-theoretic principles
- Authors: Gilad Gour, Marco Tomamichel
- Abstract summary: We find that every relative entropy must lie between the R\'enyi divergences of order $0$ and $\infty$.
Our main result is a one-to-one correspondence between entropies and relative entropies.
- Score: 24.74754293747645
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce an axiomatic approach to entropies and relative entropies that
relies only on minimal information-theoretic axioms, namely monotonicity under
mixing and data-processing as well as additivity for product distributions. We
find that these axioms induce sufficient structure to establish continuity in
the interior of the probability simplex and meaningful upper and lower bounds,
e.g., we find that every relative entropy must lie between the R\'enyi
divergences of order $0$ and $\infty$. We further show simple conditions for
positive definiteness of such relative entropies and a characterisation in terms
of a variant of relative trumping. Our main result is a one-to-one
correspondence between entropies and relative entropies.
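For concreteness, the endpoints of this bound are the standard R\'enyi divergences of order $0$ and $\infty$; for probability distributions $P$ and $Q$ these take the usual form (standard definitions quoted for orientation, not taken from the paper itself),
$$ D_0(P\|Q) = -\log \sum_{x:\, P(x)>0} Q(x), \qquad D_\infty(P\|Q) = \log \max_{x:\, P(x)>0} \frac{P(x)}{Q(x)}, $$
so the result says that any relative entropy $\mathbb{D}$ satisfying the axioms obeys $D_0(P\|Q) \le \mathbb{D}(P\|Q) \le D_\infty(P\|Q)$.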
Related papers
- Tight relations and equivalences between smooth relative entropies [12.699007098398805]
We show that the hypothesis testing relative entropy is equivalent to a variant of the smooth max-relative entropy based on the information spectrum divergence.
We also introduce a modified proof technique based on matrix geometric means and a tightened gentle measurement lemma.
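As background for this equivalence, the two quantities involved are commonly defined as follows (standard definitions, stated for orientation; the paper's specific variant built on the information spectrum divergence may differ in details): the hypothesis testing relative entropy is
$$ D_H^\varepsilon(\rho\|\sigma) = -\log \min\big\{ \operatorname{Tr}[\Lambda\sigma] : 0 \le \Lambda \le \mathbb{1},\ \operatorname{Tr}[\Lambda\rho] \ge 1-\varepsilon \big\}, $$
and the smooth max-relative entropy is $D_{\max}^\varepsilon(\rho\|\sigma) = \min_{\tilde\rho \in \mathcal{B}^\varepsilon(\rho)} \log\min\{\lambda : \tilde\rho \le \lambda\sigma\}$, with $\mathcal{B}^\varepsilon(\rho)$ an $\varepsilon$-ball of states around $\rho$.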
arXiv Detail & Related papers (2025-01-21T19:00:05Z) - The Limits of Pure Exploration in POMDPs: When the Observation Entropy is Enough [40.82741665804367]
We study a simple approach of maximizing the entropy over observations in place of the true latent states.
We show how knowledge of the latter can be exploited to compute a regularization of the observation entropy that improves performance in a principled way.
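Spelled out, this amounts to replacing the (unobservable) latent-state entropy objective by an observation-level surrogate; in generic notation, with $d_\pi(s)$ and $d_\pi(o)$ the state and observation visitation distributions under policy $\pi$ (a hedged paraphrase of the setup, not the paper's exact formulation),
$$ \max_\pi\, H\big(d_\pi(s)\big) \;\longrightarrow\; \max_\pi\, H\big(d_\pi(o)\big) = -\sum_o d_\pi(o) \log d_\pi(o). $$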
arXiv Detail & Related papers (2024-06-18T17:00:13Z) - Fidelity-Based Smooth Min-Relative Entropy: Properties and Applications [5.211732144306638]
We show that the fidelity-based smooth min-relative entropy satisfies several basic properties, including the data-processing inequality.
We also show how the fidelity-based smooth min-relative entropy provides one-shot bounds for operational tasks in general resource theories.
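For reference, the data-processing inequality mentioned here is the standard monotonicity requirement for a divergence $\mathbf{D}$ under any quantum channel $\mathcal{N}$ (generic statement, not specific to the fidelity-based quantity):
$$ \mathbf{D}(\rho\|\sigma) \;\ge\; \mathbf{D}\big(\mathcal{N}(\rho)\,\big\|\,\mathcal{N}(\sigma)\big). $$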
arXiv Detail & Related papers (2023-05-10T03:09:20Z) - Tight Exponential Analysis for Smoothing the Max-Relative Entropy and
for Quantum Privacy Amplification [56.61325554836984]
The max-relative entropy together with its smoothed version is a basic tool in quantum information theory.
We derive the exact exponent for the decay of the small modification of the quantum state in smoothing the max-relative entropy based on purified distance.
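Both ingredients have standard definitions (quoted for orientation): the max-relative entropy is $D_{\max}(\rho\|\sigma) = \log\min\{\lambda : \rho \le \lambda\sigma\}$, the purified distance is $P(\rho,\sigma) = \sqrt{1 - F(\rho,\sigma)^2}$ with $F$ the fidelity, and smoothing minimises $D_{\max}(\tilde\rho\|\sigma)$ over states $\tilde\rho$ with $P(\tilde\rho,\rho) \le \varepsilon$.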
arXiv Detail & Related papers (2021-11-01T16:35:41Z) - R\'enyi divergence inequalities via interpolation, with applications to
generalised entropic uncertainty relations [91.3755431537592]
We investigate quantum R\'enyi entropic quantities, specifically those derived from the 'sandwiched' divergence.
We present R\'enyi mutual information decomposition rules, a new approach to tripartite chain rules for the R\'enyi conditional entropy, and a more general bipartite comparison.
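The 'sandwiched' divergence referred to here has the standard form (quoted as background, not from this abstract)
$$ \widetilde{D}_\alpha(\rho\|\sigma) = \frac{1}{\alpha-1} \log \operatorname{Tr}\!\Big[ \big( \sigma^{\frac{1-\alpha}{2\alpha}}\, \rho\, \sigma^{\frac{1-\alpha}{2\alpha}} \big)^{\alpha} \Big], $$
which recovers the Umegaki relative entropy in the limit $\alpha \to 1$.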
arXiv Detail & Related papers (2021-06-19T04:06:23Z) - Maximum Entropy Reinforcement Learning with Mixture Policies [54.291331971813364]
We construct a tractable approximation of the mixture entropy using MaxEnt algorithms.
We show that it is closely related to the sum of marginal entropies.
We derive an algorithmic variant of Soft Actor-Critic (SAC) to the mixture policy case and evaluate it on a series of continuous control tasks.
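The connection to the sum of marginal entropies can be seen from the standard bounds on the entropy of a mixture $p = \sum_k w_k\, p_k$ (a generic fact, not the paper's specific approximation):
$$ \sum_k w_k\, H(p_k) \;\le\; H\Big(\sum_k w_k\, p_k\Big) \;\le\; \sum_k w_k\, H(p_k) + H(w), $$
where $H(w)$ is the entropy of the mixing weights.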
arXiv Detail & Related papers (2021-03-18T11:23:39Z) - Wehrl entropy, entropic uncertainty relations and entanglement [0.0]
We show that the Wehrl-Lieb inequality is closer to equality than the usual Bialynicki-Birula and Mycielski entropic uncertainty relation almost everywhere.
We show how a Wehrl mutual information can be used to obtain a measurable perfect witness for pure state bipartite entanglement.
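For context, the Wehrl entropy of a state $\rho$ is the differential entropy of its Husimi function $Q_\rho(\alpha) = \langle\alpha|\rho|\alpha\rangle$ over coherent states $|\alpha\rangle$ (standard definition),
$$ S_W(\rho) = -\int \frac{\mathrm{d}^2\alpha}{\pi}\, Q_\rho(\alpha) \ln Q_\rho(\alpha), $$
and the Wehrl-Lieb inequality states that $S_W(\rho) \ge 1$, with equality exactly for coherent states.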
arXiv Detail & Related papers (2021-03-12T12:12:55Z) - Action Redundancy in Reinforcement Learning [54.291331971813364]
We show that transition entropy can be described by two terms; namely, model-dependent transition entropy and action redundancy.
Our results suggest that action redundancy is a fundamental problem in reinforcement learning.
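One standard identity that illustrates such a two-term split is the chain rule for the transition entropy under a policy (an illustrative decomposition only; the paper's precise definitions of the two terms may differ):
$$ H(S' \mid S) = H(S' \mid S, A) + I(S' ; A \mid S), $$
separating a part fixed by the environment dynamics from a part measuring how much the chosen action actually influences the next state.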
arXiv Detail & Related papers (2021-02-22T19:47:26Z) - Catalytic Transformations of Pure Entangled States [62.997667081978825]
Entanglement entropy quantifies the quantum entanglement of a pure state via the von Neumann entropy of its reduced state.
The relation between entanglement entropy and entanglement distillation has been known only for the asymptotic setting, and the meaning of entanglement entropy in the single-copy regime has so far remained open.
Our results imply that entanglement entropy quantifies the amount of entanglement available in a bipartite pure state to be used for quantum information processing, giving the results an operational meaning also in the single-copy setup.
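Concretely, for a bipartite pure state $|\psi\rangle_{AB}$ the entanglement entropy is (standard definition)
$$ E\big(|\psi\rangle_{AB}\big) = S(\rho_A) = -\operatorname{Tr}[\rho_A \log \rho_A], \qquad \rho_A = \operatorname{Tr}_B\, |\psi\rangle\langle\psi|_{AB}. $$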
arXiv Detail & Related papers (2021-02-22T16:05:01Z) - The variance of relative surprisal as single-shot quantifier [0.0]
We show that the variance of the (relative) surprisal gives sufficient conditions for approximate state-transitions between pairs of quantum states in the single-shot setting.
We further derive a simple and physically appealing axiomatic single-shot characterization of (relative) entropy.
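The quantity in the title is commonly defined as the relative entropy variance (standard definition, stated for orientation),
$$ V(\rho\|\sigma) = \operatorname{Tr}\!\big[\rho\,(\log\rho - \log\sigma)^2\big] - D(\rho\|\sigma)^2, \qquad D(\rho\|\sigma) = \operatorname{Tr}\big[\rho\,(\log\rho - \log\sigma)\big], $$
i.e. the variance of the relative surprisal $\log\rho - \log\sigma$ evaluated in the state $\rho$.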
arXiv Detail & Related papers (2020-09-17T16:06:54Z) - Variational approach to relative entropies (with application to QFT) [0.0]
We define a new divergence of von Neumann algebras using a variational expression that is similar in nature to Kosaki's formula for the relative entropy.
Our divergence satisfies the usual desirable properties, upper bounds the sandwiched R\'enyi relative entropy, and reduces to the fidelity in a limit.
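For comparison, the best-known classical variational expression of this kind is the Donsker-Varadhan formula (a classical analogue quoted for orientation; Kosaki's formula and the divergence introduced in the paper are variational expressions in the von Neumann algebra setting),
$$ D(P\|Q) = \sup_{f} \Big\{ \mathbb{E}_P[f] - \log \mathbb{E}_Q\big[e^{f}\big] \Big\}. $$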
arXiv Detail & Related papers (2020-09-10T17:41:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.