General probabilistic theories: An introduction
- URL: http://arxiv.org/abs/2103.07469v2
- Date: Mon, 23 Aug 2021 10:58:09 GMT
- Title: General probabilistic theories: An introduction
- Authors: Martin Plávala
- Abstract summary: We provide in-depth explanations of the basic concepts and elements of the framework of GPTs.
The review is self-contained and is meant to provide the reader with a consistent introduction to GPTs.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce the framework of general probabilistic theories (GPTs for
short). GPTs are a class of operational theories that generalize both
finite-dimensional classical and quantum theory, but they also include other,
more exotic theories, such as the boxworld theory containing Popescu-Rohrlich
boxes. We provide in-depth explanations of the basic concepts and elements of
the framework of GPTs, and we also prove several well-known results. The review
is self-contained and is meant to provide the reader with a consistent
introduction to GPTs. Our tools mainly come from convex geometry, but we also
introduce diagrammatic notation and we often express equations via diagrams.
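As a concrete illustration of why boxworld is "more exotic" than quantum theory, the following is a minimal sketch (Python; the function name is hypothetical) computing the CHSH value of an ideal Popescu-Rohrlich box, whose outputs satisfy a XOR b = x AND y, and comparing it with the quantum (Tsirelson) and classical bounds:

```python
import math

def pr_box_correlator(x, y):
    """Correlator E(x, y) of an ideal PR box: +1 unless x = y = 1."""
    return -1 if (x and y) else 1

# CHSH combination: S = E(0,0) + E(0,1) + E(1,0) - E(1,1)
S_pr = (pr_box_correlator(0, 0) + pr_box_correlator(0, 1)
        + pr_box_correlator(1, 0) - pr_box_correlator(1, 1))

print(S_pr)              # PR box reaches the algebraic maximum: 4
print(2 * math.sqrt(2))  # quantum (Tsirelson) bound, about 2.828
print(2)                 # classical (local hidden variable) bound
```

The PR box thus violates the CHSH inequality beyond what any quantum state allows, which is why boxworld is a genuinely post-quantum GPT.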
Related papers
- Machine Learning of the Prime Distribution
We provide a theoretical argument explaining the experimental observations of Yang-Hui He about the learnability of primes.
We also posit that the Erdős-Kac law would be very unlikely to be discovered by current machine learning techniques.
arXiv Detail & Related papers (2024-03-19T09:47:54Z) - Optimal CHSH values for regular polygon theories in generalized probabilistic theories
In the usual CHSH setting for quantum theory, the CHSH value is known to be optimized by maximally entangled states.
Our result gives a physical meaning to the concept of "maximal entanglement" in regular polygon theories.
arXiv Detail & Related papers (2024-01-09T14:59:26Z) - Derivation of Standard Quantum Theory via State Discrimination
General probabilistic theories (GPTs) offer a new information-theoretic approach to singling out standard quantum theory.
We focus on the bound of the performance for an information task called state discrimination in general models.
We characterize standard quantum theory out of general models in GPTs by the bound of the performance for state discrimination.
arXiv Detail & Related papers (2023-07-21T00:02:11Z) - Connecting classical finite exchangeability to quantum theory
Exchangeability is a fundamental concept in probability theory and statistics.
We show how a de Finetti-like representation theorem for finitely exchangeable sequences requires a mathematical representation which is formally equivalent to quantum theory.
arXiv Detail & Related papers (2023-06-06T17:15:19Z) - Accessible fragments of generalized probabilistic theories, cone equivalence, and applications to witnessing nonclassicality
We consider the question of how to provide a GPT-like characterization of a particular experimental setup within a given physical theory.
We show that the resulting characterization is not generally a GPT in and of itself; rather, it is described by a more general mathematical object.
We prove that neither incompatibility among measurements nor the assumption of freedom of choice is necessary for witnessing failures of generalized noncontextuality.
arXiv Detail & Related papers (2021-12-08T19:00:23Z) - Entanglement and superposition are equivalent concepts in any physical theory
We prove that any two general probabilistic theories (GPTs) are entangleable.
We show that all non-classical GPTs exhibit a strong form of incompatibility of states and measurements.
arXiv Detail & Related papers (2021-09-09T17:44:11Z) - A consolidating review of Spekkens' toy theory
Spekkens' toy theory is based on a simple premise: "What if we took a common classical theory and added the uncertainty principle as a postulate?"
We consolidate different approaches to Spekkens' toy theory, including the stabilizer formalism and the generalization to arbitrary dimensions.
arXiv Detail & Related papers (2021-05-07T14:10:01Z) - Forecasting: theory and practice
This article provides a non-systematic review of the theory and the practice of forecasting.
We provide an overview of a wide range of theoretical, state-of-the-art models, methods, principles, and approaches.
We then demonstrate how such theoretical concepts are applied in a variety of real-life contexts.
arXiv Detail & Related papers (2020-12-04T16:56:44Z) - Probabilistic Theories and Reconstructions of Quantum Theory (Les Houches 2019 lecture notes)
These lecture notes provide a basic introduction to the framework of generalized probabilistic theories (GPTs).
I present two conceivable phenomena beyond quantum: superstrong nonlocality and higher-order interference.
I summarize a reconstruction of quantum theory from the principles of Tomographic Locality, Continuous Reversibility, and the Subspace Axiom.
arXiv Detail & Related papers (2020-11-02T20:03:13Z) - Topological Quantum Gravity of the Ricci Flow
We present a family of topological quantum gravity theories associated with the geometric theory of the Ricci flow.
First, we use BRST quantization to construct a "primitive" topological Lifshitz-type theory for only the spatial metric.
We extend the primitive theory by gauging foliation-preserving spacetime symmetries.
arXiv Detail & Related papers (2020-10-29T06:15:30Z) - From a quantum theory to a classical one
We present and discuss a formal approach for describing the quantum to classical crossover.
The method was originally introduced by L. Yaffe in 1982 for tackling large-$N$ quantum field theories.
arXiv Detail & Related papers (2020-04-01T09:16:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.