Which entropy for general physical theories?
- URL: http://arxiv.org/abs/2302.01651v3
- Date: Thu, 23 May 2024 14:00:26 GMT
- Title: Which entropy for general physical theories?
- Authors: Paolo Perinotti, Alessandro Tosini, Leonardo Vaglini
- Abstract summary: We address the problem of quantifying the information content of a source for an arbitrary information theory.
The functions that solve this problem in classical and quantum theory are Shannon's and von Neumann's entropy, respectively.
In a general information theory there are three different functions that extend the notion of entropy, and this opens the question as to whether any of them can universally play the role of the quantifier for the information content.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We address the problem of quantifying the information content of a source for an arbitrary information theory, where the information content is defined in terms of the asymptotic achievable compression rate. The functions that solve this problem in classical and quantum theory are Shannon's and von Neumann's entropy, respectively. However, in a general information theory there are three different functions that extend the notion of entropy, and this opens the question as to whether any of them can universally play the role of the quantifier for the information content. Here we answer the question in the negative, by evaluating the information content as well as the various entropic functions in a toy theory called Bilocal Classical Theory.
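As a concrete illustration of the two quantifiers named in the abstract (an illustrative aside, not code from the paper): the sketch below computes the Shannon entropy of a probability vector and the von Neumann entropy of a density matrix, and checks that the two coincide on diagonal, i.e. classical, states.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho):
    """von Neumann entropy S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    return shannon_entropy(evals)

# A classical (diagonal) state: the two entropies coincide.
p = [0.5, 0.25, 0.25]
rho_diag = np.diag(p)
assert abs(shannon_entropy(p) - von_neumann_entropy(rho_diag)) < 1e-9  # both equal 1.5

# A pure quantum state carries zero entropy: S(|+><+|) = 0.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)
print(von_neumann_entropy(rho_pure))  # ~0.0
```

Both functions reduce compression-rate statements to eigenvalue spectra; the paper's point is that no single such function plays this role in every general probabilistic theory.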
Related papers
- Uncertainty relations in terms of generalized entropies derived from information diagrams
Inequalities between entropies and the index of coincidence form a long-standing direction of research in classical information theory.
This paper is devoted to entropic uncertainty relations derived from information diagrams.
arXiv Detail & Related papers (2023-05-29T10:41:28Z)
- Chain Rules for Rényi Information Combining
Bounds on information combining are a fundamental tool in coding theory.
This work provides new information combining bounds for the Arimoto Renyi entropy.
In the second part, we generalize the chain rule to the quantum setting and show how it allows us to generalize known results and conjectures.
arXiv Detail & Related papers (2023-05-04T06:47:50Z)
- One-shot and asymptotic classical capacity in general physical theories
We consider the hypothesis testing relative entropy and the one-shot classical capacity, that is, the optimal rate of classical information transmission.
Applying these two bounds, we prove the equivalence between classical capacity and hypothesis testing relative entropy in any general physical theory.
arXiv Detail & Related papers (2023-03-07T18:52:17Z)
- Is there a finite complete set of monotones in any quantum resource theory?
We show that there does not exist a finite set of resource monotones which completely determines all state transformations.
We show that totally ordered theories allow for free transformations between all pure states.
arXiv Detail & Related papers (2022-12-05T18:28:36Z)
- Inevitability of knowing less than nothing
In the classical world, entropy and conditional entropy take only non-negative values.
We introduce a physically motivated framework for defining quantum conditional entropy.
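A minimal numerical check of this sign difference (an illustrative sketch, not code from the paper): for a Bell pair, the quantum conditional entropy S(A|B) = S(AB) - S(B) equals -1, a value impossible for any classical joint distribution.

```python
import numpy as np

def entropy(rho):
    """von Neumann entropy S(rho) = -Tr(rho log2 rho) in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]  # discard numerically-zero eigenvalues
    return float(-np.sum(ev * np.log2(ev)))

def partial_trace_A(rho_ab):
    """Trace out the first qubit of a two-qubit density matrix."""
    r = rho_ab.reshape(2, 2, 2, 2)          # indices (a, b, a', b')
    return np.trace(r, axis1=0, axis2=2)    # sum over a = a'

# Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(phi, phi)
rho_b = partial_trace_A(rho_ab)

S_ab = entropy(rho_ab)   # 0: the global state is pure
S_b = entropy(rho_b)     # 1: the marginal is maximally mixed
print(S_ab - S_b)        # conditional entropy S(A|B) = -1
```

Classically, conditioning can never increase uncertainty below zero; the negative value here is a signature of entanglement.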
arXiv Detail & Related papers (2022-08-30T17:44:17Z)
- Shannon theory beyond quantum: information content of a source
We extend the definition of information content to operational probabilistic theories.
We prove relevant properties such as subadditivity, and the relation between the purity and the information content of a state.
arXiv Detail & Related papers (2021-12-23T16:36:06Z)
- Entropy, Divergence, and Majorization in Classical and Quantum Thermodynamics
It has been revealed that the thermodynamics of out-of-equilibrium systems has a rich information-theoretic structure in both the classical and quantum regimes.
The main purpose of this book is to clarify how information theory works behind thermodynamics and to shed modern light on it.
arXiv Detail & Related papers (2020-07-20T09:50:27Z)
- Emergence of classical behavior in the early universe
Three notions are often assumed to be essentially equivalent, representing different facets of the same phenomenon.
We analyze them in general Friedmann-Lemaître-Robertson-Walker space-times through the lens of geometric structures on the classical phase space.
The analysis shows that: (i) inflation does not play an essential role; classical behavior can emerge much more generally; (ii) the three notions are conceptually distinct; classicality can emerge in one sense but not in another.
arXiv Detail & Related papers (2020-04-22T16:38:25Z)
- From Thermodynamic Sufficiency to Information Causality
We derive information causality from the monotonicity of divergence.
We conjecture that, under very weak regularity conditions, it can be used to deduce the complex Hilbert space formalism of quantum theory.
arXiv Detail & Related papers (2020-02-07T16:53:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.