Decoherence and Probability
- URL: http://arxiv.org/abs/2410.01317v2
- Date: Sun, 26 Jan 2025 12:01:27 GMT
- Title: Decoherence and Probability
- Authors: Richard Dawid, Karim P. Y. Thébault
- Abstract summary: Non-probabilistic accounts of the emergence of probability via decoherence are unconvincing.
Our analysis delimits the context in which the combination of decoherence and a semi-classical averaging allows us to recover a classical probability model.
- Abstract: One cannot justifiably presuppose the physical salience of structures derived via decoherence theory based upon an entirely uninterpreted use of the quantum formalism. Non-probabilistic accounts of the emergence of probability via decoherence are unconvincing. An alternative account of the emergence of probability involves the combination of a partially interpreted decoherence model and an averaging of observables with respect to a positive-definite quasi-probability function and neglect of terms $O(\hbar)$. Our analysis delimits the context in which the combination of decoherence and a semi-classical averaging allows us to recover a classical probability model within an emergent coarse-grained description.
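To make the averaging step concrete: taking the positive-definite quasi-probability function to be the Husimi Q function (a standard choice, assumed here for illustration; the abstract does not name one), expectation values reduce, up to $O(\hbar)$ corrections, to classical averages over a genuine probability density:
```latex
% Sketch of the semi-classical averaging step, with the Husimi Q
% function standing in for the positive-definite quasi-probability
% (an assumption of this illustration).
\[
  Q_\rho(q,p) = \frac{1}{2\pi\hbar}\,\langle q,p\,|\,\rho\,|\,q,p\rangle \;\ge\; 0,
  \qquad
  \int Q_\rho(q,p)\,\mathrm{d}q\,\mathrm{d}p = 1,
\]
\[
  \langle \hat{A} \rangle_\rho = \operatorname{Tr}\!\bigl(\rho\,\hat{A}\bigr)
  = \int Q_\rho(q,p)\,A_{\mathrm{cl}}(q,p)\,\mathrm{d}q\,\mathrm{d}p + O(\hbar),
\]
% where |q,p> is a coherent state and A_cl is the classical phase-space
% symbol of the observable A-hat.
```
Since $Q_\rho$ is non-negative and normalized, neglecting the $O(\hbar)$ remainder leaves a bona fide classical probability model, which is the coarse-grained sense of emergence the abstract delimits.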
Related papers
- (Quantum) Indifferentiability and Pre-Computation [50.06591179629447]
Indifferentiability is a cryptographic paradigm for analyzing the security of ideal objects.
Despite its strength, indifferentiability is not known to offer security against pre-processing attacks.
We propose a strengthening of indifferentiability which is not only composable but also takes arbitrary pre-computation into account.
arXiv Detail & Related papers (2024-10-22T00:41:47Z) - Probability vector representation of the Schrödinger equation and Leggett-Garg-type experiments [0.0]
Leggett-Garg inequalities place bounds on the temporal correlations of a system based on the principles of macroscopic realism.
We propose a scheme to describe the dynamics of generic $N$-level quantum systems via a probability vector representation of the Schrödinger equation.
We also define a precise notion of no-signaling in time (NSIT) for the probability distributions of noncommuting observables.
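For orientation, a minimal sketch of the temporal-correlation bound at issue: a textbook single-qubit Leggett-Garg test (our illustration, not the paper's $N$-level probability-vector scheme).
```python
import numpy as np

# Leggett-Garg parameter K3 = C12 + C23 - C13 for a qubit precessing
# under H = (omega/2) * sigma_x, with the dichotomic observable
# Q = sigma_z measured at equally spaced times (spacing tau). In this
# standard setup the two-time correlator is C(tau) = cos(omega * tau).

def k3(omega_tau):
    """Leggett-Garg parameter for equal spacing omega*tau between measurements."""
    c12 = np.cos(omega_tau)      # C(t1, t2)
    c23 = np.cos(omega_tau)      # C(t2, t3)
    c13 = np.cos(2 * omega_tau)  # C(t1, t3)
    return c12 + c23 - c13

omega_tau = np.pi / 3  # spacing that maximizes the quantum violation
print(f"K3 = {k3(omega_tau):.3f} (macrorealist bound: K3 <= 1)")
# -> K3 = 1.500, violating the Leggett-Garg inequality K3 <= 1
```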
arXiv Detail & Related papers (2023-12-26T19:00:00Z) - Form of Contextuality Predicting Probabilistic Equivalence between Two Sets of Three Mutually Noncommuting Observables [0.0]
We introduce a contextual quantum system comprising mutually complementary observables organized into two or more collections of pseudocontexts with the same probability sums of outcomes.
These pseudocontexts constitute non-orthogonal bases within the Hilbert space, featuring a state-independent sum of probabilities.
The measurement contextuality in this setup arises from the quantum realizations of the hypergraph, which adhere to a specific bound on the linear combination of probabilities.
arXiv Detail & Related papers (2023-09-22T08:51:34Z) - Foundations of non-commutative probability theory (Extended abstract) [1.8782750537161614]
Kolmogorov's setting for probability theory is given an original generalization to account for probabilities arising from Quantum Mechanics.
The sample space has a central role in this presentation and random variables, i.e., observables, are defined in a natural way.
arXiv Detail & Related papers (2023-06-01T20:34:01Z) - Probabilistic computation and uncertainty quantification with emerging covariance [11.79594512851008]
Building robust, interpretable, and secure AI system requires quantifying and representing uncertainty under a probabilistic perspective.
Probabilistic computation presents significant challenges for most conventional artificial neural networks.
arXiv Detail & Related papers (2023-05-30T17:55:29Z) - A Measure-Theoretic Axiomatisation of Causality [55.6970314129444]
We argue in favour of taking Kolmogorov's measure-theoretic axiomatisation of probability as the starting point towards an axiomatisation of causality.
Our proposed framework is rigorously grounded in measure theory, but it also sheds light on long-standing limitations of existing frameworks.
arXiv Detail & Related papers (2023-05-19T13:15:48Z) - A Robustness Analysis of Blind Source Separation [91.3755431537592]
Blind source separation (BSS) aims to recover an unobserved signal from its mixture $X=f(S)$ under the condition that the transformation $f$ is invertible but unknown.
We present a general framework for analysing such violations and quantifying their impact on the blind recovery of $S$ from $X$.
We show that the response of a generic BSS solution to general deviations from its defining structural assumptions can be profitably analysed in the form of explicit continuity guarantees.
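As a concrete anchor for the setup, a minimal linear special case $f(S) = AS$, with scikit-learn's FastICA standing in for a generic BSS solver (sources, mixing matrix, and noise level are illustrative choices; the paper analyses much more general deviations from such assumptions).
```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
S = np.c_[np.sin(2 * t), np.sign(np.cos(3 * t))]  # two unobserved sources
S += 0.02 * rng.standard_normal(S.shape)          # mild sensor noise
A = np.array([[1.0, 0.5], [0.4, 1.2]])            # unknown invertible mixing
X = S @ A.T                                       # observed mixtures X = A S

S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)

# BSS identifiability holds only up to permutation and scaling of the
# sources, so compare via absolute cross-correlations:
corr = np.corrcoef(S_hat.T, S.T)[:2, 2:]
print(np.round(np.abs(corr), 2))  # close to a permutation matrix
```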
arXiv Detail & Related papers (2023-03-17T16:30:51Z) - Discovering Latent Causal Variables via Mechanism Sparsity: A New Principle for Nonlinear ICA [81.4991350761909]
Independent component analysis (ICA) refers to an ensemble of methods which formalize the goal of recovering independent latent variables and provide estimation procedures for practical application.
We show that the latent variables can be recovered up to a permutation if one regularizes the latent mechanisms to be sparse.
arXiv Detail & Related papers (2021-07-21T14:22:14Z) - Varieties of contextuality based on probability and structural nonembeddability [0.0]
Kochen and Specker's theorem serves as a demarcation criterion for differentiating between these varieties.
Probability contextuality still allows classical models, albeit with nonclassical probabilities.
The logico-algebraic "strong" form of contextuality characterizes collections of quantum observables that have no faithful embedding into (extended) Boolean algebras.
arXiv Detail & Related papers (2021-03-10T15:04:34Z) - Contextuality scenarios arising from networks of stochastic processes [68.8204255655161]
An empirical model is said to be contextual if its distributions cannot be obtained by marginalizing a joint distribution over X.
We present a different and classical source of contextual empirical models: the interaction among many processes.
The statistical behavior of the network in the long run makes the empirical model generically contextual and even strongly contextual.
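A toy witness of this notion (our example, not the paper's network-of-processes construction): pairwise statistics demanding perfect anti-correlation among three $\pm 1$ variables admit no joint distribution over X.
```python
from itertools import product

# Target empirical model: three +/-1 variables, each pair perfectly
# ANTI-correlated. Perfect (probability-1) anti-correlation forces every
# outcome in the support of a putative joint to anti-correlate all three
# pairs, so scanning the 8 deterministic assignments suffices.
pairs = [(0, 1), (1, 2), (0, 2)]

witnesses = [a for a in product((+1, -1), repeat=3)
             if all(a[i] == -a[j] for i, j in pairs)]
print(witnesses)  # -> []: among three +/-1 values two must agree, so no
# joint distribution reproduces these marginals; the model is contextual.
```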
arXiv Detail & Related papers (2020-06-22T16:57:52Z) - Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
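A minimal sketch of the underlying task, using a plain Gaussian-KDE plug-in estimator rather than the paper's regularized nonparametric regression framework (bandwidth and test point are illustrative choices).
```python
import numpy as np

def kde_score(x, samples, h):
    """Estimate the score grad_x log p(x) from samples via a Gaussian KDE
    of bandwidth h: grad log p_hat(x) = sum_i w_i (x_i - x) / h^2, with
    w_i the normalized kernel weights."""
    diffs = x - samples                            # (n, d)
    logk = -np.sum(diffs**2, axis=1) / (2 * h**2)  # log kernel values
    w = np.exp(logk - logk.max())
    w /= w.sum()                                   # kernel responsibilities
    return (w[:, None] * (-diffs)).sum(axis=0) / h**2

# Sanity check on a standard normal, whose true score is -x (KDE
# smoothing biases the estimate slightly toward -x / (1 + h^2)):
rng = np.random.default_rng(0)
samples = rng.standard_normal((5000, 1))
x = np.array([0.7])
print(kde_score(x, samples, h=0.3), "vs true score", -x)
```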
arXiv Detail & Related papers (2020-05-20T15:01:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.