Probability trees and the value of a single intervention
- URL: http://arxiv.org/abs/2205.08779v1
- Date: Wed, 18 May 2022 08:01:33 GMT
- Title: Probability trees and the value of a single intervention
- Authors: Tue Herlau
- Abstract summary: We quantify the information gain from a single intervention and show that both the anticipated information gain, prior to making an intervention, and the expected gain from an intervention have simple expressions.
This results in an active-learning method that simply selects the intervention with the highest anticipated gain.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The most fundamental problem in statistical causality is determining causal
relationships from limited data. Probability trees, which combine prior causal
structures with Bayesian updates, have been suggested as a possible solution.
In this work, we quantify the information gain from a single intervention and
show that both the anticipated information gain, prior to making an
intervention, and the expected gain from an intervention have simple
expressions. This results in an active-learning method that simply selects the
intervention with the highest anticipated gain, which we illustrate through
several examples. Our work demonstrates how probability trees, and Bayesian
estimation of their parameters, offer a simple yet viable approach to fast
causal induction.
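The selection rule described in the abstract, pick the intervention whose anticipated information gain is largest, can be sketched for the simplest possible case: each candidate intervention exposes one Bernoulli branch of the tree, and each branch parameter carries a Beta posterior. The anticipated gain is then the mutual information between the parameter and the next outcome. The variable names, the Beta-Bernoulli model, and the two-intervention setup below are illustrative assumptions, not the paper's actual construction.

```python
import math
import random

def bernoulli_entropy(p: float) -> float:
    """Entropy (nats) of a Bernoulli(p) outcome."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log(p) + (1.0 - p) * math.log(1.0 - p))

def anticipated_gain(a: float, b: float, n_samples: int = 20000,
                     seed: int = 0) -> float:
    """Anticipated information gain (nats) from one more draw of a branch
    whose Bernoulli parameter has a Beta(a, b) posterior, computed as the
    mutual information  H(predictive) - E_theta[H(Bernoulli(theta))],
    with the second term estimated by Monte Carlo over the posterior."""
    rng = random.Random(seed)
    predictive = a / (a + b)  # posterior-predictive P(outcome = 1)
    expected_conditional = sum(
        bernoulli_entropy(rng.betavariate(a, b)) for _ in range(n_samples)
    ) / n_samples
    return bernoulli_entropy(predictive) - expected_conditional

# Hypothetical candidate interventions: do(X=0) exposes a branch we have
# never observed (uniform Beta(1,1) posterior); do(X=1) exposes a branch
# whose parameter is already well estimated (Beta(40,10)).
posteriors = {"do(X=0)": (1.0, 1.0),
              "do(X=1)": (40.0, 10.0)}

# Active-learning step: choose the intervention with the highest anticipated gain.
best = max(posteriors, key=lambda k: anticipated_gain(*posteriors[k]))
```

As expected, the rule prefers the uninformed branch: a Beta(1, 1) posterior leaves the outcome maximally uncertain about the parameter, while the Beta(40, 10) branch has little left to teach us.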
Related papers
- Estimating Causal Effects from Learned Causal Networks [56.14597641617531]
We propose an alternative paradigm for answering causal-effect queries over discrete observable variables.
We learn the causal Bayesian network and its confounding latent variables directly from the observational data.
We show that this "model completion" learning approach can be more effective than estimand approaches.
arXiv Detail & Related papers (2024-08-26T08:39:09Z)
- Bayesian Intervention Optimization for Causal Discovery [23.51328013481865]
Causal discovery is crucial for understanding complex systems and informing decisions.
Current methods, such as Bayesian and graph-theoretical approaches, do not prioritize decision-making.
We propose a novel Bayesian optimization-based method inspired by Bayes factors.
arXiv Detail & Related papers (2024-06-16T12:45:44Z)
- Intervention and Conditioning in Causal Bayesian Networks [23.225006087292765]
We show that by making simple yet often realistic independence assumptions, it is possible to estimate the probability of an interventional formula.
In many cases of interest, when the assumptions are appropriate, these probability estimates can be evaluated using observational data.
arXiv Detail & Related papers (2024-05-23T15:55:38Z)
- Nonparametric Identifiability of Causal Representations from Unknown Interventions [63.1354734978244]
We study causal representation learning, the task of inferring latent causal variables and their causal relations from mixtures of the variables.
Our goal is to identify both the ground truth latents and their causal graph up to a set of ambiguities which we show to be irresolvable from interventional data.
arXiv Detail & Related papers (2023-06-01T10:51:58Z)
- Active Bayesian Causal Inference [72.70593653185078]
We propose Active Bayesian Causal Inference (ABCI), a fully-Bayesian active learning framework for integrated causal discovery and reasoning.
ABCI jointly infers a posterior over causal models and queries of interest.
We show that our approach is more data-efficient than several baselines that only focus on learning the full causal graph.
arXiv Detail & Related papers (2022-06-04T22:38:57Z)
- BaCaDI: Bayesian Causal Discovery with Unknown Interventions [118.93754590721173]
BaCaDI operates in the continuous space of latent probabilistic representations of both causal structures and interventions.
In experiments on synthetic causal discovery tasks and simulated gene-expression data, BaCaDI outperforms related methods in identifying causal structures and intervention targets.
arXiv Detail & Related papers (2022-06-03T16:25:48Z)
- Active learning of causal probability trees [0.0]
We present a method for learning probability trees from a combination of interventional and observational data.
The method quantifies the expected information gain from an intervention, and selects the interventions with the largest gain.
arXiv Detail & Related papers (2022-05-17T08:56:34Z)
- Differentiable Causal Discovery Under Latent Interventions [3.867363075280544]
Recent work has shown promising results in causal discovery by leveraging interventional data with gradient-based methods, even when the intervened variables are unknown.
We envision a scenario with an extensive dataset sampled from multiple intervention distributions and one observation distribution, but where we do not know which distribution originated each sample and how the intervention affected the system.
We propose a method based on neural networks and variational inference that addresses this scenario by framing it as learning a shared causal graph among an infinite mixture.
arXiv Detail & Related papers (2022-03-04T14:21:28Z)
- BayesIMP: Uncertainty Quantification for Causal Data Fusion [52.184885680729224]
We study the causal data fusion problem, where datasets pertaining to multiple causal graphs are combined to estimate the average treatment effect of a target variable.
We introduce a framework which combines ideas from probabilistic integration and kernel mean embeddings to represent interventional distributions in the reproducing kernel Hilbert space.
arXiv Detail & Related papers (2021-06-07T10:14:18Z)
- Efficient Causal Inference from Combined Observational and Interventional Data through Causal Reductions [68.6505592770171]
Unobserved confounding is one of the main challenges when estimating causal effects.
We propose a novel causal reduction method that replaces an arbitrary number of possibly high-dimensional latent confounders.
We propose a learning algorithm to estimate the parameterized reduced model jointly from observational and interventional data.
arXiv Detail & Related papers (2021-03-08T14:29:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.