Complexity of Contextuality
- URL: http://arxiv.org/abs/2506.09133v1
- Date: Tue, 10 Jun 2025 18:00:09 GMT
- Title: Complexity of Contextuality
- Authors: Theodoros Yianni, Farid Shahandeh
- Abstract summary: Generalized contextuality is a hallmark of nonclassical theories like quantum mechanics. We find that the complexity of deciding the existence of a noncontextual ontological model of dimension $k$ is at least exponential in the dimension of the theory. We also demonstrate the fundamental difference between finding the smallest noncontextual ontological model and the smallest ontological model.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generalized contextuality is a hallmark of nonclassical theories like quantum mechanics. Yet, three fundamental computational problems concerning its decidability and complexity remain open: first, determining the complexity of deciding if a theory admits a noncontextual ontological model; second, determining the complexity of deciding if such a model is possible for a specific dimension $k$; and third, efficiently computing the smallest such model when it exists, given that finding the smallest ontological model is NP-hard. We address the second problem by presenting an algorithm derived from a geometric formulation and its reduction to the intermediate simplex problem in computational geometry. We find that the complexity of deciding the existence of a noncontextual ontological model of dimension $k$ is at least exponential in the dimension of the theory and at most exponential in $k$. This, in turn, implies that computing the smallest noncontextual ontological model is inefficient in general. Finally, we demonstrate the fundamental difference between finding the smallest noncontextual ontological model and the smallest ontological model using an explicit example wherein the respective minimum ontic sizes are five and four.
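The intermediate simplex problem referenced in the abstract asks: given nested convex polytopes $P \subseteq Q \subseteq \mathbb{R}^k$, does there exist a simplex $S$ (with $k+1$ vertices) such that $P \subseteq S \subseteq Q$? A minimal brute-force sketch in two dimensions, where the simplex is a triangle, is shown below. The specific polytopes, the integer grid of candidate vertices, and the exhaustive search are illustrative assumptions for this toy instance, not the paper's algorithm:

```python
from itertools import combinations

def cross(o, a, b):
    # 2D cross product of vectors (a - o) and (b - o)
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def triangle_contains(tri, p):
    # Point-in-triangle test via edge cross-product signs (boundary counts as inside)
    a, b, c = tri
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

# Outer polytope Q: the square [0, 4]^2. Candidate triangle vertices are drawn
# from its integer grid, so every candidate triangle lies inside Q by convexity.
grid = [(x, y) for x in range(5) for y in range(5)]

# Inner polytope P: the square [1.5, 2.5]^2, represented by its vertices.
# P is contained in a triangle iff all of P's vertices are.
P = [(1.5, 1.5), (2.5, 1.5), (2.5, 2.5), (1.5, 2.5)]

def find_intermediate_triangle():
    # Exhaustive search over candidate simplices; the number of candidates
    # grows combinatorially, mirroring the exponential scaling in the abstract.
    for tri in combinations(grid, 3):
        if cross(*tri) == 0:  # skip degenerate (collinear) triangles
            continue
        if all(triangle_contains(tri, p) for p in P):
            return tri
    return None

tri = find_intermediate_triangle()
print("intermediate triangle:", tri)
```

In the paper's setting, the vertices of such a simplex play the role of the ontic states of a noncontextual ontological model, which is why deciding their existence for a given dimension $k$ is the computational core of the problem.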
Related papers
- A Fine-Grained Complexity View on Propositional Abduction -- Algorithms and Lower Bounds
We analyze the complexity of intractable abduction problems under the seemingly overlooked but natural parameter $n$. We obtain several positive results for $\Sigma^P$ as well as NP- and coNP-complete fragments. We complement this with lower bounds, and for many fragments rule out improvements under the (strong) exponential-time hypothesis.
arXiv Detail & Related papers (2025-05-15T11:56:19Z) - The Limits of AI Explainability: An Algorithmic Information Theory Approach
This paper establishes a theoretical foundation for understanding the fundamental limits of AI explainability through algorithmic information theory. We formalize explainability as the approximation of complex models by simpler ones, quantifying both approximation error and explanation complexity using Kolmogorov complexity. Results highlight considerations likely to be relevant to the design, evaluation, and oversight of explainable AI systems.
arXiv Detail & Related papers (2025-04-29T11:58:37Z) - Cover Learning for Large-Scale Topology Representation
We describe a method for learning topologically-faithful covers of geometric datasets. We show that the simplicial complexes thus obtained can outperform standard topological inference approaches in terms of size.
arXiv Detail & Related papers (2025-03-12T19:10:20Z) - Topological Representational Similarity Analysis in Brains and Beyond
This thesis introduces Topological RSA (tRSA), a novel framework combining geometric and topological properties of neural representations.
tRSA applies nonlinear monotonic transforms to representational dissimilarities, emphasizing local topology while retaining intermediate-scale geometry.
The resulting geo-topological matrices enable model comparisons robust to noise and individual idiosyncrasies.
arXiv Detail & Related papers (2024-08-21T19:02:00Z) - Computational-Statistical Gaps in Gaussian Single-Index Models
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) framework, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Optimal Multi-Distribution Learning
Multi-distribution learning seeks to learn a shared model that minimizes the worst-case risk across $k$ distinct data distributions.
We propose a novel algorithm that yields an $\varepsilon$-optimal randomized hypothesis with a sample complexity on the order of $(d+k)/\varepsilon^2$.
arXiv Detail & Related papers (2023-12-08T16:06:29Z) - Finding Alignments Between Interpretable Causal Variables and Distributed Neural Representations
Causal abstraction is a promising theoretical framework for explainable artificial intelligence.
Existing causal abstraction methods require a brute-force search over alignments between the high-level model and the low-level one.
We present distributed alignment search (DAS), which overcomes these limitations.
arXiv Detail & Related papers (2023-03-05T00:57:49Z) - Structure Learning and Parameter Estimation for Graphical Models via Penalized Maximum Likelihood Methods
In the thesis, we consider two different types of probabilistic graphical models (PGMs): Bayesian networks (BNs), which are static, and continuous-time Bayesian networks, which, as the name suggests, have a temporal component.
We are interested in recovering their true structure, which is the first step in learning any PGM.
arXiv Detail & Related papers (2023-01-30T20:26:13Z) - MetaLogic: Logical Reasoning Explanations with Fine-Grained Structure
We propose a benchmark to investigate models' logical reasoning capabilities in complex real-life scenarios.
Based on the multi-hop chain of reasoning, the explanation form includes three main components.
We evaluate the current best models' performance on this new explanation form.
arXiv Detail & Related papers (2022-10-22T16:01:13Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - The Separation Capacity of Random Neural Networks
We show that a sufficiently large two-layer ReLU network with standard Gaussian weights and uniformly distributed biases can solve this separation problem with high probability.
We quantify the relevant structure of the data in terms of a novel notion of mutual complexity.
arXiv Detail & Related papers (2021-07-31T10:25:26Z)
This list is automatically generated from the titles and abstracts of the papers on this site.