Convexity of noncontextual wirings and how they order the set of correlations
- URL: http://arxiv.org/abs/2407.02120v1
- Date: Tue, 2 Jul 2024 10:10:08 GMT
- Title: Convexity of noncontextual wirings and how they order the set of correlations
- Authors: Tiago Santos, Rafael Wagner, Bárbara Amaral
- Abstract summary: We consider free operations to be noncontextual wirings (NCW).
We prove several elementary facts about how different resources can be converted via NCW.
Our results reveal the intricate ordering induced by NCW in scenarios beyond Bell scenarios.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The resource theory of contextuality considers resourceful objects to be probabilistic data-tables, known as correlations or behaviors, that fail to have an explanation in terms of Kochen-Specker noncontextual models. In this work, we advance this resource theory, considering free operations to be noncontextual wirings (NCW). We show that all such wirings form a convex set. When restricted to Bell scenarios, we show that such wirings are not equivalent to local operations assisted by a common source of classical shared randomness (LOSR). The set of all NCW operations contains LOSR, but is strictly larger. We also prove several elementary facts about how different resources can be converted via NCW. As a concrete example, we show that there are pairs of behaviors that cannot be converted one into the other using NCW. Since resource conversion mathematically induces a pre-order over the set of all behaviors, our results reveal the intricate ordering induced by NCW in scenarios beyond Bell scenarios.
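As a reading aid, the two structures the abstract refers to can be written compactly; the symbols below ($\mathcal{B}$ for the set of behaviors, $\mathsf{NCW}$ for the set of free operations) are our shorthand rather than notation fixed by the paper:

$$
\begin{aligned}
\text{Convexity:}\quad & W_1, W_2 \in \mathsf{NCW},\ \lambda \in [0,1] \ \Longrightarrow\ \lambda W_1 + (1-\lambda) W_2 \in \mathsf{NCW},\\
\text{Induced pre-order:}\quad & b' \preceq b \ \iff\ \exists\, W \in \mathsf{NCW}\ \text{with}\ W(b) = b', \qquad b, b' \in \mathcal{B}.
\end{aligned}
$$

In this notation, the inclusion result for Bell scenarios reads $\mathsf{LOSR} \subsetneq \mathsf{NCW}$, and a pair of behaviors that cannot be converted into one another corresponds to incomparable elements of $\preceq$: neither $b \preceq b'$ nor $b' \preceq b$ holds.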
Related papers
- Identifying General Mechanism Shifts in Linear Causal Representations [58.6238439611389]
We consider the linear causal representation learning setting where we observe a linear mixing of $d$ unknown latent factors.
Recent work has shown that it is possible to recover the latent factors as well as the underlying structural causal model over them.
We provide a surprising identifiability result that it is indeed possible, under some very mild standard assumptions, to identify the set of shifted nodes.
arXiv Detail & Related papers (2024-10-31T15:56:50Z) - Training Nonlinear Transformers for Chain-of-Thought Inference: A Theoretical Generalization Analysis [82.51626700527837]
Chain-of-thought (CoT) is an efficient method that enables the reasoning ability of large language models by augmenting the query using examples with multiple intermediate steps.
We show that in-context learning without intermediate steps can fail to provide an accurate prediction even when CoT succeeds.
arXiv Detail & Related papers (2024-10-03T03:12:51Z) - On the Representational Capacity of Neural Language Models with Chain-of-Thought Reasoning [87.73401758641089]
Chain-of-thought (CoT) reasoning has improved the performance of modern language models (LMs).
We show that LMs can represent the same family of distributions over strings as probabilistic Turing machines.
arXiv Detail & Related papers (2024-06-20T10:59:02Z) - Consistency and Causality of Interconnected Nonsignaling Resources [0.0]
This paper examines networks of $n$ measuring parties sharing $m$ independent nonsignaling resources that can be locally wired together.
A specific framework is provided for studying probability distributions arising in such networks.
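For orientation, a bipartite behavior $p(ab\mid xy)$ is nonsignaling when each party's marginal is independent of the other party's input (standard conditions, written in our notation rather than that paper's):

$$
\sum_{b} p(ab \mid x y) = \sum_{b} p(ab \mid x y'), \qquad
\sum_{a} p(ab \mid x y) = \sum_{a} p(ab \mid x' y), \qquad \forall\, a, b, x, x', y, y'.
$$

A local wiring then feeds the output of one such resource into the input of another held by the same party.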
arXiv Detail & Related papers (2024-05-28T17:50:16Z) - Offline RL with Observation Histories: Analyzing and Improving Sample
Complexity [70.7884839812069]
Offline reinforcement learning can synthesize better behavior from a dataset consisting only of suboptimal trials.
We show that standard offline RL algorithms conditioned on observation histories suffer from poor sample complexity.
We propose that offline RL can improve worst-case sample complexity by explicitly optimizing an appropriate auxiliary loss.
arXiv Detail & Related papers (2023-10-31T17:29:46Z) - Latent Feature Relation Consistency for Adversarial Robustness [80.24334635105829]
Misclassification occurs when deep neural networks are given adversarial examples, which add human-imperceptible adversarial noise to natural examples.
We propose Latent Feature Relation Consistency (LFRC).
LFRC constrains the relation of adversarial examples in latent space to be consistent with the natural examples.
arXiv Detail & Related papers (2023-03-29T13:50:01Z) - Quantifying EPR: the resource theory of nonclassicality of common-cause
assemblages [0.0]
An alternative perspective on steering is that Alice has no causal influence on the physical state of Bob's system.
We develop a resource-theoretic treatment of correlations in EPR scenarios.
We show that resource conversion under free operations in this paradigm can be evaluated with a single instance of a semidefinite program.
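To illustrate what a single-program conversion test can look like, here is a toy feasibility check written with cvxpy. It is only a sketch in the spirit of that statement, not the paper's actual semidefinite program: the 4-outcome vectors, the stochastic-matrix model of a free operation, and the solver choice are our illustrative assumptions, and the constraints here are merely linear rather than semidefinite.

```python
# Toy conversion test: is there a stochastic map W sending `source` to `target`?
# Hypothetical stand-in for the paper's program over free operations; not its real SDP.
import numpy as np
import cvxpy as cp

source = np.array([0.5, 0.5, 0.0, 0.0])   # made-up "resource" probability vector
target = np.array([0.4, 0.4, 0.1, 0.1])   # made-up conversion target

W = cp.Variable((4, 4), nonneg=True)       # entrywise nonnegative map
constraints = [
    cp.sum(W, axis=0) == 1,                # column-stochastic: preserves normalization
    W @ source == target,                  # W must map the source onto the target
]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()
print("convertible in this toy model:", problem.status == cp.OPTIMAL)
```

The paper's program instead works with quantum assemblages (positive semidefinite operators), which is what makes the feasibility check semidefinite rather than linear as in this toy version.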
arXiv Detail & Related papers (2021-11-19T14:25:28Z) - Correlations in entanglement-assisted prepare-and-measure scenarios [0.0]
We investigate the correlations that can arise between Alice and Bob in prepare-and-measure communication scenarios.
We provide examples of correlations that actually require more general protocols based on higher-dimensional states.
arXiv Detail & Related papers (2021-03-19T11:30:39Z) - Context-Specific Likelihood Weighting [0.0]
We introduce context-specific likelihood weighting (CS-LW) for approximate inference.
Unlike the standard likelihood weighting, CS-LW is based on partial assignments of random variables.
We empirically show that CS-LW is competitive with state-of-the-art algorithms for approximate inference.
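For contrast with the paper's context-specific variant, here is a minimal sketch of standard likelihood weighting on a made-up two-variable network; the network, its probabilities, and the function name are ours, and CS-LW itself is not reproduced.

```python
# Standard likelihood weighting on a tiny Rain -> Wet network (hypothetical numbers).
# Non-evidence variables are sampled from their conditionals; each sample is weighted
# by the likelihood of the observed evidence given the sampled parents.
import random

P_RAIN = 0.2                                   # P(Rain = True)
P_WET_GIVEN_RAIN = {True: 0.9, False: 0.1}     # P(Wet = True | Rain)

def likelihood_weighting(n_samples=100_000, wet_observed=True):
    weighted_true = total_weight = 0.0
    for _ in range(n_samples):
        rain = random.random() < P_RAIN        # sample Rain from its prior
        p_wet = P_WET_GIVEN_RAIN[rain]
        weight = p_wet if wet_observed else 1.0 - p_wet   # likelihood of the evidence
        weighted_true += weight * rain
        total_weight += weight
    return weighted_true / total_weight        # estimate of P(Rain = True | Wet = wet_observed)

print(likelihood_weighting())                  # exact value: 0.18 / 0.26 ≈ 0.692
```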
arXiv Detail & Related papers (2021-01-24T20:23:14Z) - When Is Generalizable Reinforcement Learning Tractable? [74.87383727210705]
We study the query complexity required to train RL agents that can generalize to multiple environments.
We introduce Strong Proximity, a structural condition which precisely characterizes the relative closeness of different environments.
We show that under a natural weakening of this condition, RL can require query complexity that is exponential in the horizon to generalize.
arXiv Detail & Related papers (2021-01-01T19:08:24Z) - Postquantum common-cause channels: the resource theory of local
operations and shared entanglement [0.0]
We define the type-independent resource theory of local operations and shared entanglement (LOSE).
This allows us to formally quantify postquantumness in common-cause scenarios such as the Bell scenario.
We prove several fundamental results regarding how the type of a resource determines what conversions into other resources are possible.
arXiv Detail & Related papers (2020-04-13T18:03:45Z)