Probabilistic and Causal Satisfiability: Constraining the Model
- URL: http://arxiv.org/abs/2504.19944v1
- Date: Mon, 28 Apr 2025 16:14:06 GMT
- Title: Probabilistic and Causal Satisfiability: Constraining the Model
- Authors: Markus Bläser, Julian Dörfler, Maciej Liśkiewicz, Benito van der Zander
- Abstract summary: We study the complexity of satisfiability problems in probabilistic and causal reasoning. The basic terms are probabilities of propositional formulas over atomic events. We extend this line of work by adding two new dimensions to the problem, both of which constrain the models.
- Score: 0.49399484784577985
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the complexity of satisfiability problems in probabilistic and causal reasoning. Given random variables $X_1, X_2,\ldots$ over finite domains, the basic terms are probabilities of propositional formulas over atomic events $X_i = x_i$, such as $P(X_1 = x_1)$ or $P(X_1 = x_1 \vee X_2 = x_2)$. The basic terms can be combined using addition (yielding linear terms) or multiplication (polynomial terms). The probabilistic satisfiability problem asks whether a joint probability distribution satisfies a Boolean combination of (in)equalities over such terms. Fagin et al. (1990) showed that for basic and linear terms, this problem is NP-complete, making it no harder than Boolean satisfiability, while Mossé et al. (2022) proved that for polynomial terms, it is complete for the existential theory of the reals. Pearl's Causal Hierarchy (PCH) extends the probabilistic setting with interventional and counterfactual reasoning, enriching the expressiveness of the languages. However, Mossé et al. (2022) found that the satisfiability complexity remains unchanged. Van der Zander et al. (2023) showed that introducing a marginalization operator to the languages induces a significant increase in complexity. We extend this line of work by adding two new dimensions to the problem, both of which constrain the models. First, we fix the graph structure of the underlying structural causal model, motivated by settings like Pearl's do-calculus, and give a nearly complete landscape across different arithmetics and PCH levels. Second, we study small models. While earlier work showed that satisfiable instances admit polynomial-size models, this is no longer guaranteed with compact marginalization. We characterize the complexities of satisfiability under small-model constraints across different settings.
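To make the linear-term fragment above concrete, here is a minimal feasibility sketch, not a construction from the paper: with just two binary variables, the four atomic-event probabilities become the unknowns, every basic term is a linear function of them, and a single conjunction of linear (in)equalities can be checked with an off-the-shelf LP solver. The helper `satisfiable` and the example constraints below are purely illustrative; the general problem remains NP-complete (Fagin et al., 1990) because the number of atomic events grows exponentially with the number of variables in the input formula and arbitrary Boolean combinations of (in)equalities are allowed.

```python
# Toy feasibility check for the linear-term fragment over two binary
# variables X1, X2 (illustration of the semantics only).
import numpy as np
from scipy.optimize import linprog

# Unknowns: p = (p00, p01, p10, p11), the probabilities of the atomic
# events (X1 = a, X2 = b) in lexicographic order.

# Indicator rows for the basic terms used below.
P_X1_IS_1 = np.array([0, 0, 1, 1])    # P(X1 = 1) = p10 + p11
P_X1_OR_X2 = np.array([0, 1, 1, 1])   # P(X1 = 1 v X2 = 1) = p01 + p10 + p11


def satisfiable(constraints):
    """Check whether a conjunction of linear constraints on the atomic-event
    probabilities admits a joint distribution. Each constraint is a pair
    (row, bound) meaning row . p <= bound."""
    A_ub = np.array([row for row, _ in constraints])
    b_ub = np.array([bound for _, bound in constraints])
    A_eq = np.ones((1, 4))            # the four probabilities sum to 1
    b_eq = np.array([1.0])
    res = linprog(c=np.zeros(4), A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4, method="highs")
    return res.success


# Satisfiable: P(X1 = 1) >= 0.7  and  P(X1 = 1 v X2 = 1) <= 0.8
print(satisfiable([(-P_X1_IS_1, -0.7), (P_X1_OR_X2, 0.8)]))   # True

# Unsatisfiable: P(X1 = 1) >= 0.7  and  P(X1 = 1 v X2 = 1) <= 0.6,
# since the disjunction's probability can never drop below P(X1 = 1).
print(satisfiable([(-P_X1_IS_1, -0.7), (P_X1_OR_X2, 0.6)]))   # False
```

Two features break this easy linearity and drive the rest of the abstract: multiplying terms makes the constraints polynomial, which is where completeness for the existential theory of the reals enters, and the marginalization operator writes sums such as $\sum_{x_1} P(X_1 = x_1 \wedge X_2 = x_2) = P(X_2 = x_2)$ compactly, so that summing over a tuple of variables packs exponentially many addends into a polynomial-size term.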
Related papers
- Identifying General Mechanism Shifts in Linear Causal Representations [58.6238439611389]
We consider the linear causal representation learning setting where we observe a linear mixing of $d$ unknown latent factors.
Recent work has shown that it is possible to recover the latent factors as well as the underlying structural causal model over them.
We provide a surprising identifiability result that it is indeed possible, under some very mild standard assumptions, to identify the set of shifted nodes.
arXiv Detail & Related papers (2024-10-31T15:56:50Z) - From Probability to Counterfactuals: the Increasing Complexity of Satisfiability in Pearl's Causal Hierarchy [3.44747819522562]
We show that languages allowing addition and marginalization yield NP$^{PP}$-, PSPACE-, and NEXP-complete satisfiability problems, depending on the level of the PCH. On the other hand, in the case of full languages, i.e. allowing addition, marginalization, and multiplication, we show that the satisfiability for the counterfactual level remains the same as for the probabilistic and causal levels.
arXiv Detail & Related papers (2024-05-12T20:25:36Z) - On Probabilistic and Causal Reasoning with Summation Operators [3.1133049660590615]
We show that reasoning in each causal language is as difficult, from a computational complexity perspective, as reasoning in its merely probabilistic or "correlational" counterpart.
We introduce a summation operator to capture common devices that appear in applications, such as the $do$-calculus of Pearl (2009) for causal inference.
Surprisingly, allowing free variables for random variable values results in a system that is undecidable, so long as the ranges of these random variables are unrestricted.
arXiv Detail & Related papers (2024-05-05T22:32:01Z) - Optimal Multi-Distribution Learning [88.3008613028333]
Multi-distribution learning seeks to learn a shared model that minimizes the worst-case risk across $k$ distinct data distributions.
We propose a novel algorithm that yields an $\varepsilon$-optimal randomized hypothesis with a sample complexity on the order of $(d+k)/\varepsilon^2$.
arXiv Detail & Related papers (2023-12-08T16:06:29Z) - The Hardness of Reasoning about Probabilities and Causality [5.190207094732672]
We study languages capable of expressing quantitative probabilistic reasoning and do-calculus reasoning for causal effects.
The main contribution of this work is establishing the exact computational complexity of satisfiability problems.
We introduce a new natural complexity class, named succ$\exists\mathbb{R}$, which can be viewed as a succinct variant of the well-studied class $\exists\mathbb{R}$.
arXiv Detail & Related papers (2023-05-16T15:01:22Z) - Structure Learning and Parameter Estimation for Graphical Models via Penalized Maximum Likelihood Methods [0.0]
In the thesis, we consider two different types of PGMs: Bayesian networks (BNs), which are static, and continuous-time Bayesian networks which, as the name suggests, have a temporal component.
We are interested in recovering their true structure, which is the first step in learning any PGM.
arXiv Detail & Related papers (2023-01-30T20:26:13Z) - Sample-efficient proper PAC learning with approximate differential privacy [51.09425023771246]
We prove that the sample complexity of properly learning a class of Littlestone dimension $d$ with approximate differential privacy is $\tilde{O}(d^6)$, ignoring privacy and accuracy parameters.
This result answers a question of Bun et al. (FOCS 2020) by improving upon their upper bound of $2^{O(d)}$ on the sample complexity.
arXiv Detail & Related papers (2020-12-07T18:17:46Z) - Causal Expectation-Maximisation [70.45873402967297]
We show that causal inference is NP-hard even in models characterised by polytree-shaped graphs.
We introduce the causal EM algorithm to reconstruct the uncertainty about the latent variables from data about categorical manifest variables.
We argue that there appears to be an unnoticed limitation to the trending idea that counterfactual bounds can often be computed without knowledge of the structural equations.
arXiv Detail & Related papers (2020-11-04T10:25:13Z) - Model Interpretability through the Lens of Computational Complexity [1.6631602844999724]
We study whether folklore interpretability claims have a correlate in terms of computational complexity theory.
We show that both linear and tree-based models are strictly more interpretable than neural networks.
arXiv Detail & Related papers (2020-10-23T09:50:40Z) - From Checking to Inference: Actual Causality Computations as Optimization Problems [79.87179017975235]
We present a novel approach to formulate different notions of causal reasoning, over binary acyclic models, as optimization problems.
We show that both notions (checking and inference) can be efficiently automated. Using models with more than $8000$ variables, checking is computed in a matter of seconds, with MaxSAT outperforming ILP in many cases.
arXiv Detail & Related papers (2020-06-05T10:56:52Z) - Hardness of Random Optimization Problems for Boolean Circuits, Low-Degree Polynomials, and Langevin Dynamics [78.46689176407936]
We show that families of algorithms fail to produce nearly optimal solutions with high probability.
For the case of Boolean circuits, our results improve the state-of-the-art bounds known in circuit complexity theory.
arXiv Detail & Related papers (2020-04-25T05:45:59Z)