The resource theory of causal influence and knowledge of causal influence
- URL: http://arxiv.org/abs/2512.11209v2
- Date: Fri, 09 Jan 2026 18:43:10 GMT
- Title: The resource theory of causal influence and knowledge of causal influence
- Authors: Marina Maciel Ansanelli, Beata Zjawin, David Schmid, Yìlè Yīng, John H. Selby, Ciarán M. Gilligan-Lee, Ana Belén Sainz, Robert W. Spekkens
- Abstract summary: We develop a resource-theoretic framework to understand causal relationships. We focus on the simplest nontrivial setting -- two variables that are causally ordered. We identify a triple of monotones that are complete in the sense that they capture the partial order over the set of all resources.
- Score: 1.0539847330971805
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Understanding and quantifying causal relationships between variables is essential for reasoning about the physical world. In this work, we develop a resource-theoretic framework to do so. Here, we focus on the simplest nontrivial setting -- two variables that are causally ordered, meaning that the first has the potential to influence the second, without hidden confounding. First, we introduce the resource theory that directly quantifies causal influence of a functional dependence in this setting and show that the problem of deciding convertibility of resources and identifying a complete set of monotones has a relatively straightforward solution. Following this, we introduce the resource theory that arises naturally when one has uncertainty about the functional dependence. We describe a linear program for deciding the question of whether one resource (i.e., state of knowledge about the functional dependence) can be converted to another. Then, we focus on the case where the variables are binary. In this case, we identify a triple of monotones that are complete in the sense that they capture the partial order over the set of all resources, and we provide an interpretation of each.
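To make the convertibility question concrete, here is a minimal sketch for the binary case, under the illustrative assumption that the free operations are local pre- and post-processings of the input and output variables (the paper's actual set of free operations may differ). It brute-forces whether one functional dependence can be converted to another, showing that relabelings are interconvertible and that causal influence can be destroyed but not created:

```python
from itertools import product

# All functions {0,1} -> {0,1}, represented as output tuples (f(0), f(1)).
FUNCS = [(a, b) for a in (0, 1) for b in (0, 1)]

def compose(g, f):
    """Return g o f, i.e. x -> g(f(x))."""
    return (g[f[0]], g[f[1]])

def convertible(f, g):
    """True if g = post o f o pre for some local pre/post-processing.

    This choice of free operations is a hypothetical stand-in used
    for illustration, not necessarily the one defined in the paper.
    """
    return any(compose(post, compose(f, pre)) == g
               for pre, post in product(FUNCS, FUNCS))

identity = (0, 1)
negation = (1, 0)
const0 = (0, 0)

assert convertible(identity, negation)      # relabelings are interconvertible
assert convertible(identity, const0)        # influence can be discarded
assert not convertible(const0, identity)    # ...but never created for free
```

The induced preorder has two equivalence classes here (non-constant above constant), which is why the two-valued "is there influence at all" question has a straightforward answer; the richer structure the abstract describes arises once one has uncertainty over which function is realized.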
Related papers
- Unsupervised Pairwise Causal Discovery on Heterogeneous Data using Mutual Information Measures [49.1574468325115]
Causal discovery tackles this challenge by analyzing the statistical properties of the constituent variables.
We question the current (possibly misleading) baseline results on the grounds that they were obtained through supervised learning.
Consequently, we approach the problem in an unsupervised way, using robust mutual-information measures.
arXiv Detail & Related papers (2024-08-01T09:11:08Z)
- Conceptual and formal groundwork for the study of resource dependence relations [0.0]
A resource theory imposes a preorder over states, with one state being above another if the first can be converted to the second by a free operation.
It follows that there can be nontrivial dependence relations between different notions of resourcefulness.
arXiv Detail & Related papers (2024-06-28T18:04:31Z)
- Nonparametric Partial Disentanglement via Mechanism Sparsity: Sparse Actions, Interventions and Sparse Temporal Dependencies [58.179981892921056]
This work introduces a novel principle for disentanglement we call mechanism sparsity regularization.
We propose a representation learning method that induces disentanglement by simultaneously learning the latent factors.
We show that the latent factors can be recovered by regularizing the learned causal graph to be sparse.
arXiv Detail & Related papers (2024-01-10T02:38:21Z)
- Nonparametric Identifiability of Causal Representations from Unknown Interventions [63.1354734978244]
We study causal representation learning, the task of inferring latent causal variables and their causal relations from mixtures of the variables.
Our goal is to identify both the ground truth latents and their causal graph up to a set of ambiguities which we show to be irresolvable from interventional data.
arXiv Detail & Related papers (2023-06-01T10:51:58Z)
- Causal Deep Learning [77.49632479298745]
Causality has the potential to transform the way we solve real-world problems.
But causality often requires crucial assumptions which cannot be tested in practice.
We propose a new way of thinking about causality -- we call this causal deep learning.
arXiv Detail & Related papers (2023-03-03T19:19:18Z)
- Quantifying Feature Contributions to Overall Disparity Using Information Theory [24.61791450920249]
When a machine-learning algorithm makes biased decisions, it can be helpful to understand the sources of disparity to explain why the bias exists.
We ask the question: what is the "potential" contribution of each individual feature to the observed disparity in the decisions when the exact decision-making mechanism is not accessible?
When unable to intervene on the inputs, we quantify the "redundant" statistical dependency about the protected attribute that is present in both the final decision and an individual feature.
arXiv Detail & Related papers (2022-06-16T21:27:22Z)
- Causal Discovery in Linear Structural Causal Models with Deterministic Relations [27.06618125828978]
We focus on the task of causal discovery from observational data.
We derive a set of necessary and sufficient conditions for unique identifiability of the causal structure.
arXiv Detail & Related papers (2021-10-30T21:32:42Z)
- Resource theory of causal connection [0.5735035463793007]
We build a fully fledged resource theory of causal connection for all multi-party communication scenarios.
We identify the most resourceful processes in the bipartite and tripartite scenarios.
Finally, we introduce a resource theory of causal non-separability, and show that it is -- in contrast to the case of causal connection -- unique.
arXiv Detail & Related papers (2021-10-07T07:33:39Z)
- Disentangling Observed Causal Effects from Latent Confounders using Method of Moments [67.27068846108047]
We provide guarantees on identifiability and learnability under mild assumptions.
We develop efficient algorithms based on coupled tensor decomposition with linear constraints to obtain scalable and guaranteed solutions.
arXiv Detail & Related papers (2021-01-17T07:48:45Z)
- A Weaker Faithfulness Assumption based on Triple Interactions [89.59955143854556]
We propose a weaker assumption that we call $2$-adjacency faithfulness.
We propose a sound orientation rule for causal discovery that applies under weaker assumptions.
arXiv Detail & Related papers (2020-10-27T13:04:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.