Reexamination of the Kochen-Specker theorem: Relaxation of the completeness assumption
- URL: http://arxiv.org/abs/2210.06822v2
- Date: Tue, 11 Jul 2023 05:01:20 GMT
- Title: Reexamination of the Kochen-Specker theorem: Relaxation of the completeness assumption
- Authors: Kelvin Onggadinata, Dagomir Kaszlikowski, Pawel Kurzynski
- Abstract summary: The Kochen-Specker theorem states that exclusive and complete deterministic outcome assignments are impossible for certain sets of measurements.
We show it is possible to construct a joint quasiprobability distribution over any KS set by relaxing the completeness assumption.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Kochen-Specker theorem states that exclusive and complete deterministic
outcome assignments are impossible for certain sets of measurements, called
Kochen-Specker (KS) sets. A straightforward consequence is that KS sets do not
admit a joint probability distribution, because no consistent set of joint
outcomes over which such a distribution could be defined can be constructed.
However, we show that it is possible to construct a joint quasiprobability
distribution over any KS set by relaxing the completeness assumption.
Interestingly, completeness is still observable at the level of the measurable
marginal probability distributions. This suggests that observable completeness
might not be a fundamental feature, but rather a secondary property.
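The structural point, a signed joint distribution whose measurable marginals remain proper probability distributions, can be illustrated with a minimal toy model, sketched below in Python. This is not the paper's KS construction: it uses three ±1 variables with pairwise perfect anticorrelation, which no non-negative joint distribution can reproduce (A = -B and B = -C force A = C), but which a quasiprobability with two negative weights reproduces exactly.

```python
import itertools

import numpy as np

# Toy quasiprobability (illustrative only, not the paper's construction):
# a signed joint distribution over three +/-1 variables A, B, C whose
# pairwise marginals are all perfectly anticorrelated. A parity argument
# rules this out for any non-negative joint, yet a quasiprobability with
# two negative weights achieves it.

outcomes = list(itertools.product([+1, -1], repeat=3))

def q(a, b, c):
    # weight -1/4 on the two "all equal" assignments, +1/4 on the rest
    return -0.25 if a == b == c else 0.25

joint = np.array([q(*o) for o in outcomes])
assert np.isclose(joint.sum(), 1.0)   # normalised like a probability distribution
assert (joint < 0).any()              # but genuinely a quasiprobability

def marginal(pair):
    # sum out the variable not listed in `pair`
    m = {}
    for outcome, weight in zip(outcomes, joint):
        key = tuple(outcome[i] for i in pair)
        m[key] = m.get(key, 0.0) + weight
    return m

for pair in [(0, 1), (1, 2), (0, 2)]:
    m = marginal(pair)
    assert all(w >= 0 for w in m.values())             # marginals are proper probabilities
    assert np.isclose(m[(+1, -1)] + m[(-1, +1)], 1.0)  # perfect anticorrelation
    print(pair, m)
```

Negativity lives only in the unmeasurable joint; every measurable marginal is an ordinary probability distribution, mirroring the behaviour the abstract describes for KS sets.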
Related papers
- The boundary of Kirkwood-Dirac quasiprobability [0.759660604072964]
The Kirkwood-Dirac quasiprobability describes measurement statistics of joint quantum observables.
We introduce the postquantum quasiprobability under mild assumptions to provide an outer boundary for KD quasiprobability.
Surprisingly, we are able to derive some nontrivial bounds valid for both classical probability and KD quasiprobability.
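For context, the sketch below implements the standard Kirkwood-Dirac quasiprobability (the textbook definition, not the postquantum extension the paper introduces): for a state rho and two orthonormal bases {|a_i>}, {|b_j>}, Q(i, j) = <b_j|a_i><a_i|rho|b_j>. Its marginals are the Born probabilities in either basis, while individual entries can be negative or complex.

```python
import numpy as np

# Standard Kirkwood-Dirac quasiprobability Q(i, j) = <b_j|a_i><a_i|rho|b_j>.
def kd_quasiprobability(rho, basis_a, basis_b):
    d = rho.shape[0]
    Q = np.empty((d, d), dtype=complex)
    for i, a in enumerate(basis_a):
        for j, b in enumerate(basis_b):
            Q[i, j] = (b.conj() @ a) * (a.conj() @ rho @ b)
    return Q

# Qubit example: Z eigenbasis vs X eigenbasis, state cos(t)|0> + sin(t)|1>
t = np.pi / 8
psi = np.array([np.cos(t), np.sin(t)])
rho = np.outer(psi, psi.conj())
z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
x_basis = [np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, -1.0]) / np.sqrt(2)]

Q = kd_quasiprobability(rho, z_basis, x_basis)
print(Q.real.round(4))
assert Q.real.min() < 0   # one entry is negative, so Q is not a probability

# Marginals recover the ordinary measurement statistics in each basis
assert np.allclose(Q.sum(axis=1), [abs(psi[0]) ** 2, abs(psi[1]) ** 2])
assert np.allclose(Q.sum(axis=0), [abs(psi @ x_basis[0]) ** 2, abs(psi @ x_basis[1]) ** 2])
```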
arXiv Detail & Related papers (2025-04-12T14:23:36Z)
- Decoherence and Probability [0.0]
Non-probabilistic accounts of the emergence of probability via decoherence are unconvincing.
Our analysis delimits the context in which the combination of decoherence and a semi-classical averaging allows us to recover a classical probability model.
arXiv Detail & Related papers (2024-10-02T08:16:09Z)
- Particle approximations of Wigner distributions for n arbitrary observables [0.0]
A class of signed joint probability measures for n arbitrary quantum observables is derived and studied.
It is shown that the Wigner distribution associated with these observables can be rigorously approximated by such measures.
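For intuition about why signed measures arise here, the short sketch below (a generic illustration, not the particle approximation studied in the paper) evaluates the Wigner function of the single-photon Fock state, which normalizes like a probability density yet takes negative values near the origin.

```python
import numpy as np

# Wigner function of the n = 1 Fock state (hbar = 1 convention):
# W(x, p) = (2(x^2 + p^2) - 1) * exp(-(x^2 + p^2)) / pi
x = np.linspace(-5, 5, 401)
p = np.linspace(-5, 5, 401)
X, P = np.meshgrid(x, p)
R2 = X ** 2 + P ** 2
W = (2 * R2 - 1) * np.exp(-R2) / np.pi

dx = x[1] - x[0]
print(W.min())             # about -1/pi, attained at the origin
print(W.sum() * dx * dx)   # numerically close to 1, like a probability density
```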
arXiv Detail & Related papers (2024-09-28T01:42:57Z)
- Maximal Non-Kochen-Specker Sets and a Lower Bound on the Size of Kochen-Specker Sets [1.5163329671980246]
A Kochen-Specker (KS) set is a finite collection of vectors on the two-sphere containing no antipodal pairs.
The existence of KS sets lies at the heart of Kochen and Specker's argument against non-contextual hidden variable theories.
arXiv Detail & Related papers (2024-03-08T11:38:16Z)
- Double or nothing: a Kolmogorov extension theorem for multitime (bi)probabilities in quantum mechanics [0.0]
We prove a generalization of the Kolmogorov extension theorem that applies to families of complex-valued bi-probability distributions.
We discuss the relation of our results with the quantum comb formalism.
arXiv Detail & Related papers (2024-02-02T08:40:03Z)
- Connecting classical finite exchangeability to quantum theory [69.62715388742298]
Exchangeability is a fundamental concept in probability theory and statistics.
We show how a de Finetti-like representation theorem for finitely exchangeable sequences requires a mathematical representation which is formally equivalent to quantum theory.
arXiv Detail & Related papers (2023-06-06T17:15:19Z)
- Foundations of non-commutative probability theory (Extended abstract) [1.8782750537161614]
Kolmogorov's setting for probability theory is given an original generalization to account for probabilities arising from Quantum Mechanics.
The sample space has a central role in this presentation, and random variables, i.e., observables, are defined in a natural way.
arXiv Detail & Related papers (2023-06-01T20:34:01Z)
- A Measure-Theoretic Axiomatisation of Causality [55.6970314129444]
We argue in favour of taking Kolmogorov's measure-theoretic axiomatisation of probability as the starting point towards an axiomatisation of causality.
Our proposed framework is rigorously grounded in measure theory, but it also sheds light on long-standing limitations of existing frameworks.
arXiv Detail & Related papers (2023-05-19T13:15:48Z)
- Quantum de Finetti Theorems as Categorical Limits, and Limits of State Spaces of C*-algebras [0.0]
We show that the quantum de Finetti construction has a universal property as a categorical limit.
This allows us to pass canonically between categorical treatments of finite-dimensional quantum theory and the infinite-dimensional one.
We also show that the same categorical analysis justifies a continuous de Finetti theorem for classical probability.
arXiv Detail & Related papers (2022-07-12T20:51:23Z)
- Distribution-free binary classification: prediction sets, confidence intervals and calibration [106.50279469344937]
We study three notions of uncertainty quantification -- calibration, confidence intervals and prediction sets -- for binary classification in the distribution-free setting.
We derive confidence intervals for binned probabilities for both fixed-width and uniform-mass binning.
As a consequence of our 'tripod' theorems, these confidence intervals for binned probabilities lead to distribution-free calibration.
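As a rough illustration of the binned setting (a minimal sketch under simple assumptions, not the paper's tripod bounds), the code below groups predicted scores into fixed-width bins and attaches a Hoeffding-style confidence interval to each bin's positive-label frequency.

```python
import numpy as np

# Fixed-width binning of predicted scores with a Hoeffding-style
# confidence interval for the positive-label probability in each bin.
def binned_intervals(scores, labels, n_bins=10, alpha=0.05):
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    bin_ids = np.digitize(scores, edges[1:-1])   # values in 0 .. n_bins - 1
    out = []
    for b in range(n_bins):
        mask = bin_ids == b
        n = mask.sum()
        if n == 0:
            out.append((b, None))
            continue
        p_hat = labels[mask].mean()                  # empirical positive rate in bin b
        eps = np.sqrt(np.log(2 / alpha) / (2 * n))   # Hoeffding half-width
        out.append((b, (max(0.0, p_hat - eps), min(1.0, p_hat + eps))))
    return out

# Synthetic usage: scores that are calibrated by construction
rng = np.random.default_rng(0)
scores = rng.uniform(size=5000)
labels = (rng.uniform(size=5000) < scores).astype(float)
for b, ci in binned_intervals(scores, labels):
    print(b, ci)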
arXiv Detail & Related papers (2020-06-18T14:17:29Z)
- Metrizing Weak Convergence with Maximum Mean Discrepancies [88.54422104669078]
This paper characterizes the maximum mean discrepancies (MMD) that metrize the weak convergence of probability measures for a wide class of kernels.
We prove that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k metrizes the weak convergence of probability measures if and only if k is continuous.
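A minimal numerical sketch of the MMD itself is given below: a plug-in (biased, V-statistic) estimate with a Gaussian kernel, which is a standard bounded continuous kernel on R^d. This is a generic illustration, not code from the paper.

```python
import numpy as np

# MMD^2(P, Q) = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)], estimated here
# by plain averages over two samples (the biased V-statistic).

def gaussian_kernel(x, y, bandwidth=1.0):
    # pairwise squared distances between rows of x and rows of y
    sq = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

rng = np.random.default_rng(0)
x1 = rng.normal(0.0, 1.0, size=(1000, 1))
x2 = rng.normal(0.0, 1.0, size=(1000, 1))   # same distribution as x1
y = rng.normal(0.5, 1.0, size=(1000, 1))    # shifted distribution
print(mmd2(x1, x2))  # close to zero
print(mmd2(x1, y))   # clearly positive
```

Metrization of weak convergence means MMD(P_n, P) tends to 0 exactly when P_n converges weakly to P, so the kernel choice determines whether a small MMD certifies distributional closeness in the weak sense.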
arXiv Detail & Related papers (2020-06-16T15:49:33Z)
- Uncertainty quantification for nonconvex tensor completion: Confidence intervals, heteroscedasticity and optimality [92.35257908210316]
We study the problem of estimating a low-rank tensor given incomplete and corrupted observations.
We find that it attains unimprovable rates of $\ell_2$ accuracy.
arXiv Detail & Related papers (2020-06-15T17:47:13Z)
- Finite Block Length Analysis on Quantum Coherence Distillation and Incoherent Randomness Extraction [64.04327674866464]
We introduce a variant of randomness extraction framework where free incoherent operations are allowed before the incoherent measurement.
We show that the maximum number of random bits extractable from a given quantum state is precisely equal to the maximum number of coherent bits that can be distilled from the same state.
Remarkably, the incoherent operation classes all admit the same second order expansions.
arXiv Detail & Related papers (2020-02-27T09:48:52Z)
- Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior-sampling-based algorithm, namely distributionally robust BQO (DRBQO), for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.