Decoherence and Probability
- URL: http://arxiv.org/abs/2410.01317v1
- Date: Wed, 2 Oct 2024 08:16:09 GMT
- Title: Decoherence and Probability
- Authors: Richard Dawid, Karim P. Y. Thébault
- Abstract summary: Non-probabilistic accounts of the emergence of probability via decoherence are unconvincing.
An alternative account of the emergence of probability involves the combination of quasi-probabilistic emergence, via a partially interpreted decoherence model, with semi-classical emergence.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: One cannot justifiably presuppose the physical salience of structures derived via decoherence theory based upon an entirely uninterpreted use of the quantum formalism. Non-probabilistic accounts of the emergence of probability via decoherence are thus unconvincing. An alternative account of the emergence of probability involves the combination of \textit{quasi-probabilistic emergence}, via a partially interpreted decoherence model, with \textit{semi-classical emergence}, via averaging of observables with respect to a positive-definite \textit{quasi-probability} function and neglect of terms $O(\hbar)$. This approach avoids well-known issues with constructing classical probability measures in the context of the full set of states of a quantum theory. Rather, it considers a generalised \textit{quasi-measure} structure, \textit{partially interpreted} as weighting of possibilities, over a more general algebra, and delimits the context in which the combination of decoherence and a semi-classical averaging allows us to recover a classical probability model as a coarse-grained description which neglects terms $O(\hbar)$.
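The semi-classical averaging step can be illustrated numerically. The following is a minimal sketch, assuming purely for illustration (not as the authors' specific model) the harmonic-oscillator ground state and the Husimi Q function as the positive-definite quasi-probability: averaging $x^2$ against Q reproduces the exact quantum expectation value only up to a term of order $\hbar$, which the coarse-grained classical description neglects.

```python
import numpy as np

hbar, m, omega = 1.0, 1.0, 1.0

# Husimi Q function of the oscillator ground state: a positive-definite
# Gaussian quasi-probability, broader than the Wigner function by a
# coherent-state smoothing of width O(hbar).
sx2 = hbar / (m * omega)   # Q-function variance in x
sp2 = hbar * m * omega     # Q-function variance in p

x = np.linspace(-6.0, 6.0, 801)
p = np.linspace(-6.0, 6.0, 801)
X, P = np.meshgrid(x, p)
Q = np.exp(-X**2 / (2 * sx2) - P**2 / (2 * sp2)) / (2 * np.pi * np.sqrt(sx2 * sp2))

dx, dp = x[1] - x[0], p[1] - p[0]
norm = Q.sum() * dx * dp           # ~ 1: Q behaves as a probability measure
x2_q = (X**2 * Q).sum() * dx * dp  # semi-classical average of x^2 w.r.t. Q

x2_true = hbar / (2 * m * omega)   # exact quantum <x^2> in the ground state
# The discrepancy x2_q - x2_true is hbar/2: exactly the kind of O(hbar)
# term that is neglected in the coarse-grained classical description.
```

The averages of low-order observables against Q thus agree with the quantum expectation values up to corrections of order $\hbar$, which is the sense of "neglect of terms $O(\hbar)$" in the abstract.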
Related papers
- (Quantum) Indifferentiability and Pre-Computation [50.06591179629447]
Indifferentiability is a cryptographic paradigm for analyzing the security of ideal objects.
Despite its strength, indifferentiability is not known to offer security against pre-processing attacks.
We propose a strengthening of indifferentiability which is not only composable but also takes arbitrary pre-computation into account.
arXiv Detail & Related papers (2024-10-22T00:41:47Z)
- Form of Contextuality Predicting Probabilistic Equivalence between Two Sets of Three Mutually Noncommuting Observables [0.0]
We introduce a contextual quantum system comprising mutually complementary observables organized into two or more collections of pseudocontexts with the same probability sums of outcomes.
These pseudocontexts constitute non-orthogonal bases within the Hilbert space, featuring a state-independent sum of probabilities.
The measurement contextuality in this setup arises from the quantum realizations of the hypergraph, which adhere to a specific bound on the linear combination of probabilities.
arXiv Detail & Related papers (2023-09-22T08:51:34Z)
- Connecting classical finite exchangeability to quantum theory [69.62715388742298]
Exchangeability is a fundamental concept in probability theory and statistics.
We show how a de Finetti-like representation theorem for finitely exchangeable sequences requires a mathematical representation which is formally equivalent to quantum theory.
arXiv Detail & Related papers (2023-06-06T17:15:19Z)
- Foundations of non-commutative probability theory (Extended abstract) [1.8782750537161614]
Kolmogorov's setting for probability theory is given an original generalization to account for probabilities arising from quantum mechanics.
The sample space has a central role in this presentation, and random variables, i.e., observables, are defined in a natural way.
arXiv Detail & Related papers (2023-06-01T20:34:01Z)
- A Measure-Theoretic Axiomatisation of Causality [55.6970314129444]
We argue in favour of taking Kolmogorov's measure-theoretic axiomatisation of probability as the starting point towards an axiomatisation of causality.
Our proposed framework is rigorously grounded in measure theory, but it also sheds light on long-standing limitations of existing frameworks.
arXiv Detail & Related papers (2023-05-19T13:15:48Z)
- A Robustness Analysis of Blind Source Separation [91.3755431537592]
Blind source separation (BSS) aims to recover an unobserved source signal $S$ from its mixture $X=f(S)$ under the condition that the transformation $f$ is invertible but unknown.
We present a general framework for analysing such violations and quantifying their impact on the blind recovery of $S$ from $X$.
We show that the behaviour of a generic BSS solution under general deviations from its defining structural assumptions can be profitably analysed in the form of explicit continuity guarantees.
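The linear special case of the BSS setup can be sketched in a few lines. This is a minimal illustration, assuming a linear invertible mixing $X = AS$ and using the standard FastICA fixed-point scheme (a common BSS method, not the paper's robustness framework): two independent non-Gaussian sources are mixed and then blindly recovered up to permutation and sign.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent, non-Gaussian sources S (time along axis 1).
t = np.linspace(0.0, 8.0, 2000)
S = np.vstack([np.sin(2 * np.pi * t),            # smooth sub-Gaussian source
               np.sign(np.sin(3 * np.pi * t))])  # binary square-wave source
S = S - S.mean(axis=1, keepdims=True)

# Unknown invertible mixing f: here a linear map X = A S.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Whiten the mixtures (zero mean, identity covariance).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / X.shape[1])
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# FastICA fixed-point iterations with tanh contrast, deflation scheme:
# w <- E[x g(w.x)] - E[g'(w.x)] w, then orthogonalize and normalize.
W = np.zeros((2, 2))
for i in range(2):
    w = rng.standard_normal(2)
    for _ in range(200):
        wx = w @ Xw
        w_new = (Xw * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
        w_new -= W[:i].T @ (W[:i] @ w_new)  # deflate against earlier components
        w = w_new / np.linalg.norm(w_new)
    W[i] = w

S_hat = W @ Xw  # recovered sources, up to permutation and sign
```

A deviation from the structural assumptions (e.g. a slightly non-linear $f$, or weakly dependent sources) degrades the recovery gradually, which is the kind of behaviour the paper's continuity guarantees quantify.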
arXiv Detail & Related papers (2023-03-17T16:30:51Z)
- Non-standard entanglement structure of local unitary self-dual models as a saturated situation of repeatability in general probabilistic theories [61.12008553173672]
We show the existence of infinitely many structures of quantum composite systems that are self-dual with local unitary symmetry.
We also show the existence of a structure of quantum composite systems in which non-orthogonal states are perfectly distinguishable.
arXiv Detail & Related papers (2021-11-29T23:37:58Z)
- Varieties of contextuality based on probability and structural nonembeddability [0.0]
Kochen and Specker's theorem serves as a demarcation criterion for differentiating between these varieties.
Probability contextuality still allows classical models, albeit with nonclassical probabilities.
The logico-algebraic "strong" form of contextuality characterizes collections of quantum observables that have no faithful embedding into (extended) Boolean algebras.
arXiv Detail & Related papers (2021-03-10T15:04:34Z)
- Implicit Regularization in ReLU Networks with the Square Loss [56.70360094597169]
We show that it is impossible to characterize the implicit regularization with the square loss by any explicit function of the model parameters.
Our results suggest that a more general framework may be needed to understand implicit regularization for nonlinear predictors.
arXiv Detail & Related papers (2020-12-09T16:48:03Z)
- Solvable Criterion for the Contextuality of any Prepare-and-Measure Scenario [0.0]
An operationally noncontextual ontological model of the quantum statistics associated with the prepare-and-measure scenario is constructed.
A mathematical criterion, called unit separability, is formulated as the relevant classicality criterion.
We reformulate our results in the framework of generalized probabilistic theories.
arXiv Detail & Related papers (2020-03-13T18:00:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.