The role of (non)contextuality in Bell's theorems from the perspective
of an operational modeling framework
- URL: http://arxiv.org/abs/2001.09756v3
- Date: Sun, 3 Apr 2022 18:43:21 GMT
- Authors: Michael L. Ulrey
- Abstract summary: It is shown that noncontextuality is the most general property of an operational model that blocks replication of QM predictions.
It is shown that the construction of convex hulls of finite ensembles of OD model instances is (mathematically) equivalent to the traditional hidden variables approach.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A novel approach for analyzing "classical" alternatives to quantum mechanics
for explaining the statistical results of an EPRB-like experiment is proposed.
This perspective is top-down instead of bottom-up. Rather than beginning with
an inequality derivation, a hierarchy of model types is constructed, each
distinguished by appropriately parameterized conditional probabilities. This
hierarchy ranks the "classical" model types in terms of their ability to
reproduce QM statistics or not. The analysis goes beyond the usual
consideration of model types that "fall short" (i.e., satisfy all of the CHSH
inequalities) to ones that are "excessive" (i.e., not only violate CHSH but
even exceed a Tsirelson bound). This approach clearly shows that
noncontextuality is the most general property of an operational model that
blocks replication of at least some QM statistical predictions. Factorizability
is naturally revealed to be a special case of noncontextuality. The same is
true for the combination of remote context independence and outcome determinism
(RCI+OD). It is noncontextuality that determines the dividing line between
"classical" model instances that satisfy the CHSH inequalities and those that
don't. Outcome deterministic operational models are revealed to be the
"building blocks" of all the rest, including quantum mechanical, noncontextual,
and contextual ones. The set of noncontextual model instances is exactly the
convex hull of all 16 RCI+OD model instances, and furthermore, the set of all
model instances, including all QM ones, is equal to the convex hull of the 256
OD model instances. It is shown that, under a mild assumption, the construction
of convex hulls of finite ensembles of OD model instances is (mathematically)
equivalent to the traditional hidden variables approach. Plots and figures
provide visual affirmation of many of the results.
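
The abstract's claim that the 16 RCI+OD (deterministic, remote-context-independent) model instances are the "building blocks" can be illustrated with a short sketch. This is not code from the paper; the encoding of settings and outcomes below is an assumption. It enumerates all 16 deterministic local strategies and confirms that each yields a CHSH value of exactly ±2, so every convex mixture of them (the noncontextual set) satisfies |S| ≤ 2, while quantum models can reach the Tsirelson bound 2√2.

```python
from itertools import product

# Each deterministic local (RCI+OD) strategy fixes an outcome in {-1, +1}
# for each local setting: a(x) for Alice, b(y) for Bob, with x, y in {0, 1}.
# That gives 2^2 * 2^2 = 16 strategies in total.
strategies = [
    ((a0, a1), (b0, b1))
    for a0, a1, b0, b1 in product([-1, 1], repeat=4)
]

def chsh(strategy):
    """CHSH value S = E(0,0) + E(0,1) + E(1,0) - E(1,1) for a deterministic strategy."""
    (a0, a1), (b0, b1) = strategy
    E = lambda x, y: (a0 if x == 0 else a1) * (b0 if y == 0 else b1)
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

values = [chsh(s) for s in strategies]
print(len(strategies), max(abs(v) for v in values))  # prints: 16 2

# S is linear in the correlators E(x, y), so any convex mixture of these 16
# strategies also satisfies |S| <= 2; quantum models can reach 2*sqrt(2).
```

Because the CHSH expression is linear, its extrema over the convex hull are attained at the 16 vertices, which is why checking the deterministic strategies alone suffices.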
Related papers
- Graphical Modelling without Independence Assumptions for Uncentered Data [0.30723404270319693]
We show how the zero-mean assumption can cause egregious errors in modelling.
Specifically, we propose a relaxation of the zero-mean assumption that avoids such errors.
arXiv Detail & Related papers (2024-08-05T11:40:23Z)
- Structure Learning and Parameter Estimation for Graphical Models via Penalized Maximum Likelihood Methods [0.0]
In the thesis, we consider two different types of PGMs: Bayesian networks (BNs), which are static, and continuous-time Bayesian networks, which, as the name suggests, have a temporal component.
We are interested in recovering their true structure, which is the first step in learning any PGM.
arXiv Detail & Related papers (2023-01-30T20:26:13Z)
- Bayesian Nonlocal Operator Regression (BNOR): A Data-Driven Learning Framework of Nonlocal Models with Uncertainty Quantification [4.705624984585247]
We consider the problem of modeling heterogeneous materials where micro-scale dynamics and interactions affect global behavior.
We develop a Bayesian framework for uncertainty quantification (UQ) in material response prediction when using nonlocal models.
This work is a first step towards statistical characterization of nonlocal model discrepancy in the context of homogenization.
arXiv Detail & Related papers (2022-10-06T22:37:59Z)
- Super-model ecosystem: A domain-adaptation perspective [101.76769818069072]
This paper attempts to establish the theoretical foundation for the emerging super-model paradigm via domain adaptation.
Super-model paradigms help reduce computational and data costs and carbon emissions, which is critical to the AI industry.
arXiv Detail & Related papers (2022-08-30T09:09:43Z)
- On the Strong Correlation Between Model Invariance and Generalization [54.812786542023325]
Generalization captures a model's ability to classify unseen data.
Invariance measures consistency of model predictions on transformations of the data.
From a dataset-centric view, we find that a given model's accuracy and invariance are linearly correlated across different test sets.
arXiv Detail & Related papers (2022-07-14T17:08:25Z)
- Estimation of Bivariate Structural Causal Models by Variational Gaussian Process Regression Under Likelihoods Parametrised by Normalising Flows [74.85071867225533]
Causal mechanisms can be described by structural causal models.
One major drawback of state-of-the-art artificial intelligence is its lack of explainability.
arXiv Detail & Related papers (2021-09-06T14:52:58Z)
- Experimentally adjudicating between different causal accounts of Bell inequality violations via statistical model selection [0.0]
Bell inequalities follow from a set of seemingly natural assumptions about how to provide a causal model of a Bell experiment.
Two types of causal models that modify some of these assumptions have been proposed.
We seek to adjudicate between these alternatives based on their predictive power.
arXiv Detail & Related papers (2021-07-30T19:33:02Z)
- A Twin Neural Model for Uplift [59.38563723706796]
Uplift is a particular case of conditional treatment effect modeling.
We propose a new loss function defined by leveraging a connection with the Bayesian interpretation of the relative risk.
We show that our proposed method is competitive with the state of the art in a simulation setting and on real data from large-scale randomized experiments.
arXiv Detail & Related papers (2021-05-11T16:02:39Z)
- Causal Expectation-Maximisation [70.45873402967297]
We show that causal inference is NP-hard even in models characterised by polytree-shaped graphs.
We introduce the causal EM algorithm to reconstruct the uncertainty about the latent variables from data about categorical manifest variables.
We argue that there appears to be an unnoticed limitation to the trending idea that counterfactual bounds can often be computed without knowledge of the structural equations.
arXiv Detail & Related papers (2020-11-04T10:25:13Z)
- Contextuality scenarios arising from networks of stochastic processes [68.8204255655161]
An empirical model is said to be contextual if its distributions cannot be obtained by marginalizing a joint distribution over X.
We present a different and classical source of contextual empirical models: the interaction among many processes.
The statistical behavior of the network in the long run makes the empirical model generically contextual and even strongly contextual.
arXiv Detail & Related papers (2020-06-22T16:57:52Z)
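
The notion of an empirical model whose distributions admit no underlying joint distribution, and of "excessive" models that even exceed the Tsirelson bound, can both be illustrated with the Popescu-Rohrlich (PR) box. This sketch is not taken from any of the papers above; the outcome/setting conventions are assumptions. It computes the PR box's CHSH value of 4, beyond both the classical bound (2) and the Tsirelson bound (2√2).

```python
# The PR box: outcomes a, b in {0, 1}; P(a, b | x, y) is uniform over the
# outcome pairs satisfying a XOR b == x AND y, and zero otherwise.
def pr_box(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

# Correlator E(x, y) with outcomes mapped from {0, 1} to {+1, -1}:
def E(x, y):
    return sum((-1) ** (a + b) * pr_box(a, b, x, y)
               for a in (0, 1) for b in (0, 1))

S = E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)
print(S)  # prints: 4.0
```

Since any joint distribution over all four observables would force |S| ≤ 2, the value S = 4 certifies that the PR box's distributions cannot arise by marginalizing a single joint distribution, making it strongly contextual in the sense described above.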
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.