Causal query in observational data with hidden variables
- URL: http://arxiv.org/abs/2001.10269v4
- Date: Tue, 24 Nov 2020 05:11:56 GMT
- Title: Causal query in observational data with hidden variables
- Authors: Debo Cheng (1), Jiuyong Li (1), Lin Liu (1), Jixue Liu (1), Kui Yu
(2), and Thuc Duy Le (1) ((1) School of Information Technology and
Mathematical Sciences, University of South Australia (2) School of Computer
Science and Information Engineering, Hefei University of Technology)
- Abstract summary: We develop a theorem for using local search to find a superset of the adjustment variables for causal effect estimation from observational data.
Based on the developed theorem, we propose a data-driven algorithm for causal query.
Experiments show that the proposed algorithm is faster and produces more accurate causal effect estimates than an existing data-driven estimation method that allows for hidden variables.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper discusses the problem of causal query in observational data with
hidden variables, with the aim of estimating the change in an outcome when a
variable is "manipulated", given a set of plausible confounding variables that
affect both the manipulated variable and the outcome. Such an "experiment on
data" for estimating the causal effect of the manipulated variable is useful for
validating an experimental design with historical data or for exploring
confounders when studying a new relationship. However, existing data-driven
methods for causal effect estimation face major challenges, including poor
scalability with high-dimensional data, low estimation accuracy due to the
heuristics used by global causal structure learning algorithms, and reliance on
the causal sufficiency assumption even though hidden variables are inevitable in
real data.
In this paper, we develop a theorem for using local search to find a superset
of the adjustment (or confounding) variables for causal effect estimation from
observational data under a realistic pretreatment assumption. The theorem
ensures that an unbiased estimate of the causal effect is included in the set of
causal effects estimated using the superset of adjustment variables. Based on the
developed theorem, we propose a data-driven algorithm for causal query.
Experiments show that the proposed algorithm is faster and produces more
accurate causal effect estimates than an existing data-driven causal effect
estimation method that allows for hidden variables. The causal effects estimated
by the proposed algorithm are as accurate as those obtained by state-of-the-art
methods that use domain knowledge.
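The core idea can be sketched in a few lines of Python. This is a minimal illustration, not the authors' algorithm: it assumes a superset of candidate adjustment (pretreatment) variables has already been found (e.g. by local search around the treatment), uses a linear outcome model for the adjustment step, and all function and variable names are hypothetical. It only shows the property the theorem guarantees: the unbiased estimate is among the effects obtained by adjusting for candidate sets drawn from the superset.

```python
# Minimal sketch (assumptions: linear outcome model, superset of candidate
# adjustment variables already found; names are hypothetical, this is not
# the paper's algorithm).
from itertools import combinations

import pandas as pd
from sklearn.linear_model import LinearRegression


def adjusted_effect(df: pd.DataFrame, treatment: str, outcome: str, adjustment) -> float:
    """Back-door adjustment under a linear outcome model: the coefficient of
    the treatment in a regression of the outcome on treatment + adjustment set."""
    X = df[[treatment, *adjustment]].to_numpy()
    y = df[outcome].to_numpy()
    return float(LinearRegression().fit(X, y).coef_[0])


def candidate_effects(df, treatment, outcome, superset):
    """One estimate per candidate adjustment set drawn from the superset;
    by the paper's theorem, the unbiased estimate is among these values."""
    return {
        Z: adjusted_effect(df, treatment, outcome, Z)
        for k in range(len(superset) + 1)
        for Z in combinations(superset, k)
    }
```

In practice only the adjustment sets licensed by the theorem would be evaluated; the sketch enumerates all subsets purely to keep the illustration short.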
Related papers
- Local Learning for Covariate Selection in Nonparametric Causal Effect Estimation with Latent Variables [13.12743473333296]
Estimating causal effects from nonexperimental data is a fundamental problem in many fields of science.
We propose a novel local learning approach for covariate selection in nonparametric causal effect estimation.
We validate our algorithm through extensive experiments on both synthetic and real-world data.
arXiv Detail & Related papers (2024-11-25T12:08:54Z)
- Estimating Causal Effects from Learned Causal Networks [56.14597641617531]
We propose an alternative paradigm for answering causal-effect queries over discrete observable variables.
We learn the causal Bayesian network and its confounding latent variables directly from the observational data.
We show that this "model completion" learning approach can be more effective than estimand approaches.
arXiv Detail & Related papers (2024-08-26T08:39:09Z)
- Do Finetti: On Causal Effects for Exchangeable Data [45.96632286841583]
We study causal effect estimation in a setting where the data are not i.i.d.
We focus on exchangeable data satisfying an assumption of independent causal mechanisms.
arXiv Detail & Related papers (2024-05-29T07:31:18Z)
- Learning to Bound Counterfactual Inference in Structural Causal Models from Observational and Randomised Data [64.96984404868411]
We derive a likelihood characterisation for the overall data that leads us to extend a previous EM-based algorithm.
The new algorithm learns to approximate the (unidentifiability) region of model parameters from such mixed data sources.
It delivers interval approximations to counterfactual results, which collapse to points in the identifiable case.
arXiv Detail & Related papers (2022-12-06T12:42:11Z)
- Causal Effect Estimation using Variational Information Bottleneck [19.6760527269791]
Causal inference aims to estimate the causal effect in a causal relationship when an intervention is applied.
We propose CEVIB, a method to estimate causal effects using a Variational Information Bottleneck.
arXiv Detail & Related papers (2021-10-26T13:46:12Z)
- Multi-Source Causal Inference Using Control Variates [81.57072928775509]
We propose a general algorithm to estimate causal effects from multiple data sources.
We show theoretically that this reduces the variance of the ATE estimate.
We apply this framework to inference from observational data under an outcome selection bias. (A generic sketch of the control-variate idea appears after this list.)
arXiv Detail & Related papers (2021-03-30T21:20:51Z)
- Efficient Causal Inference from Combined Observational and Interventional Data through Causal Reductions [68.6505592770171]
Unobserved confounding is one of the main challenges when estimating causal effects.
We propose a novel causal reduction method that replaces an arbitrary number of possibly high-dimensional latent confounders with a single latent confounder.
We propose a learning algorithm to estimate the parameterized reduced model jointly from observational and interventional data.
arXiv Detail & Related papers (2021-03-08T14:29:07Z)
- Meta Learning for Causal Direction [29.00522306460408]
We introduce a novel generative model that allows distinguishing cause and effect in the small data setting.
We demonstrate our method on various synthetic as well as real-world data and show that it is able to maintain high accuracy in detecting directions across varying dataset sizes.
arXiv Detail & Related papers (2020-07-06T15:12:05Z)
- Stable Prediction via Leveraging Seed Variable [73.9770220107874]
Previous machine learning methods may exploit subtle spurious correlations in training data, induced by non-causal variables, for prediction.
We propose a conditional-independence-test-based algorithm that separates out the causal variables, given a seed variable as a prior, and adopts them for stable prediction.
Our algorithm outperforms state-of-the-art methods for stable prediction.
arXiv Detail & Related papers (2020-06-09T06:56:31Z)
- Towards unique and unbiased causal effect estimation from data with hidden variables [0.0]
Causal effect estimation from observational data is a crucial but challenging task.
We propose an approach to achieving unique and unbiased estimation of causal effects from data with hidden variables.
Based on the theorems, two algorithms are proposed for finding the proper adjustment sets from data with hidden variables.
arXiv Detail & Related papers (2020-02-24T06:42:32Z)
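As noted in the Multi-Source Causal Inference entry above, here is a generic illustration of the control-variate idea that entry refers to. It is not that paper's estimator: it only shows, with hypothetical inputs, how subtracting a scaled, centred auxiliary statistic with a known expectation (e.g. estimated from a second data source) reduces the variance of an ATE estimate without changing its mean.

```python
# Generic control-variate adjustment (illustration only, not the paper's
# estimator; `aux_mean` is assumed known, e.g. from a second data source).
import numpy as np


def control_variate_adjust(ate_draws: np.ndarray,
                           aux_draws: np.ndarray,
                           aux_mean: float) -> np.ndarray:
    """Return variance-reduced draws of the ATE estimator."""
    # Optimal scaling: Cov(theta_hat, Z) / Var(Z).
    c = np.cov(ate_draws, aux_draws)[0, 1] / np.var(aux_draws, ddof=1)
    # Subtracting the centred control variate leaves the mean unchanged
    # (E[Z - aux_mean] = 0) but cancels shared noise, shrinking the variance.
    return ate_draws - c * (aux_draws - aux_mean)
```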