Contextuality Can be Verified with Noncontextual Experiments
- URL: http://arxiv.org/abs/2412.00199v1
- Date: Fri, 29 Nov 2024 19:01:03 GMT
- Title: Contextuality Can be Verified with Noncontextual Experiments
- Authors: Jonathan J. Thio, Wilfred Salmon, Crispin H. W. Barnes, Stephan De Bièvre, David R. M. Arvidsson-Shukur
- Abstract summary: Quantum states can be represented by KD distributions, which take values in the complex unit disc. A KD distribution can be measured by a series of weak and projective measurements. We analyze this connection with respect to mixed KD-positive states that cannot be decomposed as convex combinations of pure KD-positive states.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We uncover new features of generalized contextuality by connecting it to the Kirkwood-Dirac (KD) quasiprobability distribution. Quantum states can be represented by KD distributions, which take values in the complex unit disc. Only for "KD-positive" states are the KD distributions joint probability distributions. A KD distribution can be measured by a series of weak and projective measurements. We design such an experiment and show that it is contextual iff the underlying state is not KD-positive. We analyze this connection with respect to mixed KD-positive states that cannot be decomposed as convex combinations of pure KD-positive states. Our result is the construction of a noncontextual experiment that enables an experimenter to verify contextuality.
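The KD distribution mentioned in the abstract admits a compact numerical illustration. The sketch below uses the standard Kirkwood-Dirac form Q[i, j] = ⟨a_i|ρ|b_j⟩⟨b_j|a_i⟩ for a qubit; the choice of computational and Hadamard bases and the example states are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Basis A: computational basis; basis B: Hadamard basis (columns are basis vectors).
# These specific bases and states are illustrative choices, not from the paper.
A = np.eye(2)                                 # |a_0>, |a_1>
B = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # |b_0>, |b_1>

def kd_distribution(rho, A, B):
    """Kirkwood-Dirac quasiprobability Q[i, j] = <a_i|rho|b_j><b_j|a_i>."""
    Q = np.empty((2, 2), dtype=complex)
    for i in range(2):
        for j in range(2):
            a, b = A[:, i], B[:, j]
            Q[i, j] = (a.conj() @ rho @ b) * (b.conj() @ a)
    return Q

# KD-positive example: rho = |0><0| yields a genuine joint probability distribution.
rho = np.outer([1, 0], [1, 0])
Q = kd_distribution(rho, A, B)
print(np.round(Q.real, 3))  # entries {0.5, 0.5, 0, 0}, all nonnegative

# KD-nonpositive example: a relative phase produces complex quasiprobabilities.
psi = np.array([1, 1j]) / np.sqrt(2)
Q2 = kd_distribution(np.outer(psi, psi.conj()), A, B)
print(np.round(Q2, 3))      # complex entries: not a joint probability distribution
```

In both cases the entries sum to Tr(ρ) = 1, but only in the first do they form a joint probability distribution over the outcomes of A and B, matching the abstract's notion of a "KD-positive" state.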
Related papers
- A Dual-Space Framework for General Knowledge Distillation of Large Language Models [98.73585104789217]
Knowledge distillation (KD) is a promising solution to compress large language models (LLMs) by transferring their knowledge to smaller models.
The current white-box KD framework exhibits two limitations.
We propose a dual-space knowledge distillation (DSKD) framework that unifies the prediction heads of the teacher and the student models for KD.
arXiv Detail & Related papers (2025-04-15T17:38:47Z) - The boundary of Kirkwood-Dirac quasiprobability [0.759660604072964]
The Kirkwood-Dirac quasiprobability describes measurement statistics of joint quantum observables.
We introduce the postquantum quasiprobability under mild assumptions to provide an outer boundary for KD quasiprobability.
Surprisingly, we are able to derive some nontrivial bounds valid for both classical probability and KD quasiprobability.
arXiv Detail & Related papers (2025-04-12T14:23:36Z) - Counterfactual Realizability [52.85109506684737]
We introduce a formal definition of realizability, the ability to draw samples from a distribution, and then develop a complete algorithm to determine whether an arbitrary counterfactual distribution is realizable.
We illustrate the implications of this new framework for counterfactual data collection using motivating examples from causal fairness and causal reinforcement learning.
arXiv Detail & Related papers (2025-03-14T20:54:27Z) - Structure, Positivity and Classical Simulability of Kirkwood-Dirac Distributions [0.0]
We study the evolution of the Kirkwood-Dirac quasiprobability distribution.
We identify bounds for pure KD positive states in distributions defined on mutually unbiased bases.
We show that the discrete Fourier transform of KD distributions on qudits in the Fourier basis follows a self-similarity constraint.
arXiv Detail & Related papers (2025-02-17T13:20:10Z) - Convex roofs witnessing Kirkwood-Dirac nonpositivity [0.0]
We construct two witnesses for KD nonpositivity for general mixed states.
Our first witness is the convex roof of the support uncertainty.
Our other witness is the convex roof of the total KD nonpositivity.
arXiv Detail & Related papers (2024-07-05T14:47:32Z) - Properties and Applications of the Kirkwood-Dirac Distribution [0.0]
The KD distribution is a powerful quasi-probability distribution for analysing quantum mechanics.
It can represent a quantum state in terms of arbitrary observables.
This paper reviews the KD distribution, in three parts.
arXiv Detail & Related papers (2024-03-27T18:00:02Z) - Characterizing the geometry of the Kirkwood-Dirac positive states [0.0]
The Kirkwood-Dirac (KD) quasiprobability distribution can describe any quantum state with respect to the eigenbases of two observables $A$ and $B$.
We show how the full convex set of states with positive KD distributions depends on the eigenbases of $A$ and $B$.
We also investigate if there can exist mixed KD-positive states that cannot be written as convex combinations of pure KD-positive states.
arXiv Detail & Related papers (2023-05-31T18:05:02Z) - On student-teacher deviations in distillation: does it pay to disobey? [54.908344098305804]
Knowledge distillation has been widely used to improve the test accuracy of a "student" network.
Despite being trained to fit the teacher's probabilities, the student may not only significantly deviate from the teacher probabilities, but may also outdo the teacher in performance.
arXiv Detail & Related papers (2023-01-30T14:25:02Z) - Controlling Moments with Kernel Stein Discrepancies [74.82363458321939]
Kernel Stein discrepancies (KSDs) measure the quality of a distributional approximation.
We first show that standard KSDs used for weak convergence control fail to control moment convergence.
We then provide sufficient conditions under which alternative diffusion KSDs control both moment and weak convergence.
arXiv Detail & Related papers (2022-11-10T08:24:52Z) - Kirkwood-Dirac classical pure states [0.32634122554914]
A quantum state is called KD classical if its KD distribution is a probability distribution.
We provide some characterizations for the general structure of KD classical pure states.
arXiv Detail & Related papers (2022-10-06T12:58:33Z) - KDExplainer: A Task-oriented Attention Model for Explaining Knowledge
Distillation [59.061835562314066]
We introduce a novel task-oriented attention model, termed as KDExplainer, to shed light on the working mechanism underlying the vanilla KD.
We also introduce a portable tool, dubbed as virtual attention module (VAM), that can be seamlessly integrated with various deep neural networks (DNNs) to enhance their performance under KD.
arXiv Detail & Related papers (2021-05-10T08:15:26Z) - The Hidden Uncertainty in a Neural Networks Activations [105.4223982696279]
The distribution of a neural network's latent representations has been successfully used to detect out-of-distribution (OOD) data.
This work investigates whether this distribution correlates with a model's epistemic uncertainty, thus indicating its ability to generalise to novel inputs.
arXiv Detail & Related papers (2020-12-05T17:30:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.