Accessible fragments of generalized probabilistic theories, cone equivalence, and applications to witnessing nonclassicality
- URL: http://arxiv.org/abs/2112.04521v3
- Date: Thu, 4 Apr 2024 09:57:20 GMT
- Title: Accessible fragments of generalized probabilistic theories, cone equivalence, and applications to witnessing nonclassicality
- Authors: John H. Selby, David Schmid, Elie Wolfe, Ana Belén Sainz, Ravi Kunjwal, Robert W. Spekkens
- Abstract summary: We consider the question of how to provide a GPT-like characterization of a particular experimental setup within a given physical theory.
We show that the resulting characterization is not generally a GPT in and of itself; rather, it is described by a more general mathematical object.
We prove that neither incompatibility among measurements nor the assumption of freedom of choice is necessary for witnessing failures of generalized noncontextuality.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The formalism of generalized probabilistic theories (GPTs) was originally developed as a way to characterize the landscape of conceivable physical theories. Thus, the GPT describing a given physical theory necessarily includes all physically possible processes. We here consider the question of how to provide a GPT-like characterization of a particular experimental setup within a given physical theory. We show that the resulting characterization is not generally a GPT in and of itself; rather, it is described by a more general mathematical object that we introduce and term an accessible GPT fragment. We then introduce an equivalence relation, termed cone equivalence, between accessible GPT fragments (and, as a special case, between standard GPTs). We give a number of examples of experimental scenarios that are best described using accessible GPT fragments, and where moreover cone-equivalence arises naturally. We then prove that an accessible GPT fragment admits of a classical explanation if and only if every other fragment that is cone-equivalent to it also admits of a classical explanation. Finally, we leverage this result to prove several fundamental results regarding the experimental requirements for witnessing the failure of generalized noncontextuality. In particular, we prove that neither incompatibility among measurements nor the assumption of freedom of choice is necessary for witnessing failures of generalized noncontextuality, and, moreover, that such failures can be witnessed even using arbitrarily inefficient detectors.
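As a rough intuition pump (a toy sketch, not the paper's formal construction), a fragment's operational content can be modeled as finite lists of state vectors and effect covectors, with the only observable data being the pairing probabilities. The sketch below shows that two such descriptions related by an invertible linear map assign identical probabilities to every state/effect pair, a simple linear-algebra analogue of the kind of equivalence between fragment descriptions the abstract discusses; all dimensions and vectors here are hypothetical.

```python
import numpy as np

# Toy model of an accessible fragment: finite sets of state vectors and
# effect covectors in R^d. The only operational data are p(e|s) = <e, s>.
rng = np.random.default_rng(0)

d = 3
states = rng.random((4, d))            # 4 accessible states (rows, hypothetical)
effects = rng.random((5, d)) * 0.2     # 5 accessible effects (rows, hypothetical)

probs = effects @ states.T             # probs[e, s] = <e, s>

# A second description related by an invertible linear map L:
# states  s -> L s,  effects  e -> L^{-T} e.
# The pairing <L^{-T} e, L s> = <e, s> is unchanged, so the two
# descriptions are operationally indistinguishable.
L = rng.random((d, d)) + np.eye(d)     # generic invertible map
states2 = states @ L.T                 # each row s becomes (L s)^T
effects2 = effects @ np.linalg.inv(L)  # each row e becomes (L^{-T} e)^T

probs2 = effects2 @ states2.T
assert np.allclose(probs, probs2)      # identical probability tables
```

Note that cone equivalence as defined in the paper is a broader relation than this invertible-linear-map example; the sketch only illustrates why distinct vector-space representations can carry the same operational content.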
Related papers
- Shadows and subsystems of generalized probabilistic theories: when tomographic incompleteness is not a loophole for contextuality proofs [0.0]
We show that proofs of the failure of noncontextuality are robust to a very broad class of failures of tomographic completeness.
We also introduce the notion of a shadow of a GPT fragment, which captures the information lost when one's states and effects are unknowingly not tomographic for one another.
arXiv Detail & Related papers (2024-09-19T18:00:42Z) - Towards Demystifying the Generalization Behaviors When Neural Collapse Emerges [132.62934175555145]
Neural Collapse (NC) is a well-known phenomenon of deep neural networks in the terminal phase of training (TPT)
We propose a theoretical explanation for why continuing training can still lead to accuracy improvement on test set, even after the train accuracy has reached 100%.
We refer to this newly discovered property as "non-conservative generalization".
arXiv Detail & Related papers (2023-10-12T14:29:02Z) - Derivation of Standard Quantum Theory via State Discrimination [53.64687146666141]
General probabilistic theories (GPTs) offer an information-theoretic approach to singling out standard quantum theory.
We focus on the bound of the performance for an information task called state discrimination in general models.
We characterize standard quantum theory out of general models in GPTs by the bound of the performance for state discrimination.
arXiv Detail & Related papers (2023-07-21T00:02:11Z) - Incompatibility of observables, channels and instruments in information theories [68.8204255655161]
We study the notion of compatibility for tests of an operational probabilistic theory.
We show that a theory admits of incompatible tests if and only if some information cannot be extracted without disturbance.
arXiv Detail & Related papers (2022-04-17T08:44:29Z) - Entanglement and superposition are equivalent concepts in any physical theory [6.76734184727575]
We prove that any two general probabilistic theories (GPTs) are entangleable.
We show that all non-classical GPTs exhibit a strong form of incompatibility of states and measurements.
arXiv Detail & Related papers (2021-09-09T17:44:11Z) - Experimentally bounding deviations from quantum theory for a photonic three-level system using theory-agnostic tomography [0.0]
We implement GPT tomography on a three-level system corresponding to a single photon shared among three modes.
This scheme achieves a GPT characterization of each of the preparations and measurements implemented in the experiment without requiring any prior characterization of either.
arXiv Detail & Related papers (2021-08-12T17:21:56Z) - Discovering Latent Causal Variables via Mechanism Sparsity: A New Principle for Nonlinear ICA [81.4991350761909]
Independent component analysis (ICA) refers to an ensemble of methods that formalize the recovery of latent variables and provide estimation procedures for practical applications.
We show that the latent variables can be recovered up to a permutation if one regularizes the latent mechanisms to be sparse.
arXiv Detail & Related papers (2021-07-21T14:22:14Z) - General probabilistic theories: An introduction [0.0]
We provide in-depth explanations of the basic concepts and elements of the framework of GPTs.
The review is self-contained and is meant to provide the reader with a consistent introduction to GPTs.
arXiv Detail & Related papers (2021-03-12T12:42:02Z) - General Probabilistic Theories with a Gleason-type Theorem [0.0]
Gleason-type theorems for quantum theory allow one to recover the quantum state space.
We identify the class of general probabilistic theories which also admit Gleason-type theorems.
arXiv Detail & Related papers (2020-05-28T17:29:29Z) - Generalized Sliced Distances for Probability Distributions [47.543990188697734]
We introduce a broad family of probability metrics, coined as Generalized Sliced Probability Metrics (GSPMs)
GSPMs are rooted in the generalized Radon transform and come with a unique geometric interpretation.
We consider GSPM-based gradient flows for generative modeling applications and show that under mild assumptions, the gradient flow converges to the global optimum.
arXiv Detail & Related papers (2020-02-28T04:18:00Z) - Generalised Lipschitz Regularisation Equals Distributional Robustness [47.44261811369141]
We give a very general equality result regarding the relationship between distributional robustness and regularisation.
We show a new result explicating the connection between adversarial learning and distributional robustness.
arXiv Detail & Related papers (2020-02-11T04:19:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.