Introduction to Logical Entropy and its Relationship to Shannon Entropy
- URL: http://arxiv.org/abs/2112.01966v1
- Date: Fri, 3 Dec 2021 15:16:46 GMT
- Title: Introduction to Logical Entropy and its Relationship to Shannon Entropy
- Authors: David Ellerman
- Abstract summary: Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity.
It is the (normalized) quantitative measure of the distinctions of a partition on a set--just as the Boole-Laplace logical probability is the normalized quantitative measure of the elements of a subset of a set.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We live in the information age. Claude Shannon, as the father of the
information age, gave us a theory of communications that quantified an "amount
of information," but, as he pointed out, "no concept of information itself was
defined." Logical entropy provides that definition. Logical entropy is the
natural measure of the notion of information based on distinctions,
differences, distinguishability, and diversity. It is the (normalized)
quantitative measure of the distinctions of a partition on a set--just as the
Boole-Laplace logical probability is the normalized quantitative measure of the
elements of a subset of a set. And partitions and subsets are mathematically
dual concepts--so the logic of partitions is dual in that sense to the usual
Boolean logic of subsets, and hence the name "logical entropy." The logical
entropy of a partition has a simple interpretation as the probability that a
distinction or dit (elements in different blocks) is obtained in two
independent draws from the underlying set. The Shannon entropy is shown to also
be based on this notion of information-as-distinctions; it is the average
minimum number of binary partitions (bits) that need to be joined to make all
the same distinctions of the given partition. Hence all the concepts of simple,
joint, conditional, and mutual logical entropy can be transformed into the
corresponding concepts of Shannon entropy by a uniform non-linear dit-bit
transform. And finally logical entropy linearizes naturally to the
corresponding quantum concept. The quantum logical entropy of an observable
applied to a state is the probability that two different eigenvalues are
obtained in two independent projective measurements of that observable on that
state.
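The abstract's two-draw definition and the dit-bit transform can be checked directly with a minimal sketch; the four-element set and two-block partition below are illustrative choices, not taken from the paper:

```python
import math

def logical_entropy(blocks, n):
    """h(pi) = 1 - sum_B p_B^2: the probability that two independent
    equiprobable draws from the n-element set fall in different blocks."""
    return 1 - sum((len(b) / n) ** 2 for b in blocks)

def shannon_entropy(blocks, n):
    """H(pi) = sum_B p_B * log2(1/p_B) over the blocks of the partition."""
    return sum((len(b) / n) * math.log2(n / len(b)) for b in blocks)

# Partition of {0,1,2,3} into {0,1} and {2,3}: block probabilities 1/2, 1/2.
blocks = [{0, 1}, {2, 3}]
h = logical_entropy(blocks, 4)   # 1 - (1/4 + 1/4) = 0.5
H = shannon_entropy(blocks, 4)   # 1 bit

# Sanity check of the two-draw interpretation: count ordered pairs (i, j)
# that are distinctions (land in different blocks) out of all n^2 pairs.
dits = sum(
    1
    for i in range(4)
    for j in range(4)
    if not any(i in b and j in b for b in blocks)
)
assert dits / 16 == h  # 8 of 16 ordered pairs are distinctions
```

The dit-bit transform mentioned in the abstract replaces each factor $(1-p)$ in $h = \sum_B p_B(1-p_B)$ by $\log(1/p)$, term by term, yielding $H = \sum_B p_B \log(1/p_B)$.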
Keywords: logical entropy, Shannon entropy, partitions, MaxEntropy, quantum
logical entropy, von Neumann entropy
Related papers
- Generalized Quantum Stein's Lemma and Second Law of Quantum Resource Theories [47.02222405817297]
A fundamental question in quantum information theory is whether an analogous second law can be formulated to characterize the convertibility of resources for quantum information processing by a single function.
In 2008, a promising formulation was proposed via the generalized quantum Stein's lemma, linking resource convertibility to the optimal performance of a variant of quantum hypothesis testing.
In 2023, a logical gap was found in the original proof of this lemma, casting doubt on the possibility of such a formulation of the second law.
arXiv Detail & Related papers (2024-08-05T18:00:00Z)
- Which entropy for general physical theories? [44.99833362998488]
We address the problem of quantifying the information content of a source for an arbitrary information theory.
The functions that solve this problem in classical and quantum theory are Shannon's and von Neumann's entropy, respectively.
In a general information theory there are three different functions that extend the notion of entropy, and this opens the question as to whether any of them can universally play the role of the quantifier for the information content.
arXiv Detail & Related papers (2023-02-03T10:55:13Z)
- Inevitability of knowing less than nothing [5.767156832161818]
In the classical world, entropy and conditional entropy take only non-negative values.
We introduce a physically motivated framework for defining quantum conditional entropy.
arXiv Detail & Related papers (2022-08-30T17:44:17Z)
- Logical Entropy and Negative Probabilities in Quantum Mechanics [0.0]
The concept of Logical Entropy, $S_L = 1 - \sum_{i=1}^{n} p_i^2$, was introduced by David Ellerman in a series of recent papers.
We show that the logical entropy plays a profound role in establishing the peculiar rules of quantum physics.
arXiv Detail & Related papers (2022-01-12T10:49:43Z)
- Shannon theory beyond quantum: information content of a source [68.8204255655161]
We extend the definition of information content to operational probabilistic theories.
We prove relevant properties such as subadditivity, and the relation between the purity and the information content of a state.
arXiv Detail & Related papers (2021-12-23T16:36:06Z)
- The Logic of Quantum Programs [77.34726150561087]
We present a logical calculus for reasoning about information flow in quantum programs.
In particular we introduce a dynamic logic that is capable of dealing with quantum measurements, unitary evolutions and entanglements in compound quantum systems.
arXiv Detail & Related papers (2021-09-14T16:08:37Z)
- Quantum logical entropy: fundamentals and general properties [0.0]
We introduce the quantum logical entropy to study quantum systems.
We prove several properties of this entropy for generic density matrices.
We extend the notion of quantum logical entropy to post-selected systems.
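For a pure state measured in an observable's eigenbasis, the two-measurement definition of quantum logical entropy reduces to $1 - \sum_i p_i^2$ over the Born-rule outcome probabilities. A minimal sketch, with an illustrative two-level state chosen here (not taken from the paper):

```python
import math

def quantum_logical_entropy(amplitudes):
    """Probability that two independent projective measurements of the
    same observable on the same pure state yield different eigenvalues:
    1 - sum_i p_i^2, with p_i = |amplitude_i|^2 by the Born rule."""
    probs = [abs(a) ** 2 for a in amplitudes]
    assert abs(sum(probs) - 1) < 1e-9, "state must be normalized"
    return 1 - sum(p * p for p in probs)

# Equal superposition (|0> + |1>)/sqrt(2): outcome probabilities (1/2, 1/2).
h_q = quantum_logical_entropy([1 / math.sqrt(2), 1 / math.sqrt(2)])  # ~0.5

# An eigenstate of the observable always repeats its eigenvalue, so h = 0.
assert quantum_logical_entropy([1, 0]) == 0
```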
arXiv Detail & Related papers (2021-08-05T16:47:22Z)
- Catalytic Transformations of Pure Entangled States [62.997667081978825]
Entanglement entropy is the von Neumann entropy of quantum entanglement of pure states.
The relation between entanglement entropy and entanglement distillation has been known only for the asymptotic setting, and the meaning of entanglement entropy in the single-copy regime has so far remained open.
Our results imply that entanglement entropy quantifies the amount of entanglement available in a bipartite pure state for quantum information processing, giving it an operational meaning also in the single-copy setup.
arXiv Detail & Related papers (2021-02-22T16:05:01Z)
- Entropy and reversible catalysis [0.0]
I show that non-decreasing entropy provides a necessary and sufficient condition to convert the state of a physical system into a different state.
I show how these results can be used to obtain a quantitative single-shot characterization of Gibbs states in quantum statistical mechanics.
arXiv Detail & Related papers (2020-12-10T10:42:44Z)
- Pseudo Entropy in Free Quantum Field Theories [0.0]
We find two novel properties of pseudo entropy which we conjecture to be universal in field theories.
Our numerical results imply that pseudo entropy can play a role as a new quantum order parameter.
arXiv Detail & Related papers (2020-11-19T04:25:18Z)
- Shannon Entropy Rate of Hidden Markov Processes [77.34726150561087]
We show how to calculate entropy rates for hidden Markov chains.
We also show how this method gives the minimal set of infinite predictive features.
A sequel addresses the challenge's second part on structure.
arXiv Detail & Related papers (2020-08-29T00:48:17Z)
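For context on the last entry: while entropy rates of *hidden* Markov processes require the methods of that paper, a fully observed Markov chain has the closed form $H = -\sum_i \pi_i \sum_j P_{ij} \log_2 P_{ij}$, where $\pi$ is the stationary distribution. A minimal sketch with an illustrative two-state chain (the matrix and distribution below are made up for the example):

```python
import math

def markov_entropy_rate(P, pi):
    """Entropy rate (bits per step) of a fully observed Markov chain:
    H = -sum_i pi[i] * sum_j P[i][j] * log2(P[i][j])."""
    return -sum(
        pi[i] * sum(p * math.log2(p) for p in row if p > 0)
        for i, row in enumerate(P)
    )

# Two-state chain; its stationary distribution solves pi = pi P,
# which for this matrix is (0.6, 0.4).
P = [[0.8, 0.2],
     [0.3, 0.7]]
pi = [0.6, 0.4]
rate = markov_entropy_rate(P, pi)  # ~0.786 bits per step
```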
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.