Distributionally Robust Causal Inference with Observational Data
- URL: http://arxiv.org/abs/2210.08326v1
- Date: Sat, 15 Oct 2022 16:02:33 GMT
- Title: Distributionally Robust Causal Inference with Observational Data
- Authors: Dimitris Bertsimas, Kosuke Imai, Michael Lingzhi Li
- Abstract summary: We consider the estimation of average treatment effects in observational studies without the standard assumption of unconfoundedness.
We propose a new framework of robust causal inference under the general observational study setting with the possible existence of unobserved confounders.
- Score: 4.8986598953553555
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the estimation of average treatment effects in observational
studies without the standard assumption of unconfoundedness. We propose a new
framework of robust causal inference under the general observational study
setting with the possible existence of unobserved confounders. Our approach is
based on the method of distributionally robust optimization and proceeds in two
steps. We first specify the maximal degree to which the distribution of
unobserved potential outcomes may deviate from that of observed outcomes. We
then derive sharp bounds on the average treatment effects under this
assumption. Our framework encompasses the popular marginal sensitivity model as
a special case and can be extended to the difference-in-differences and
regression discontinuity designs as well as instrumental variables. Through
simulation and empirical studies, we demonstrate the applicability of the
proposed methodology to real-world settings.
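The two-step procedure described in the abstract, specialized to the marginal sensitivity model it encompasses, can be sketched as follows. This is a minimal illustration, not the paper's implementation: `msm_bounds` and `gamma` are hypothetical names, the sensitivity assumption is that the true inverse-propensity weights lie within a factor `gamma` of the nominal ones (step one), and sharp Hajek-type bounds on the treated-arm mean then follow from a threshold scan over sorted outcomes (step two), since a linear-fractional objective over box-constrained weights is maximized by a threshold rule.

```python
import numpy as np

def msm_bounds(y, ps, gamma):
    """Sharp bounds on the Hajek (normalized IPW) estimate of E[Y(1)]
    from treated units, under a marginal-sensitivity-style constraint:
    perturbed weights must lie in [w/gamma, w*gamma] for nominal w = 1/ps.

    y     : outcomes of treated units
    ps    : estimated propensity scores for those units
    gamma : sensitivity parameter (gamma = 1 recovers the point estimate)
    """
    w = 1.0 / np.asarray(ps, dtype=float)   # nominal inverse-propensity weights
    lo, hi = w / gamma, w * gamma           # box constraints on perturbed weights
    order = np.argsort(y)                   # sort outcomes ascending
    y_s = np.asarray(y, dtype=float)[order]
    lo_s, hi_s = lo[order], hi[order]
    n = len(y_s)
    best_max, best_min = -np.inf, np.inf
    for k in range(n + 1):
        # upper bound: small weights on low outcomes, large weights on high ones
        a = np.concatenate([lo_s[:k], hi_s[k:]])
        best_max = max(best_max, np.dot(a, y_s) / a.sum())
        # lower bound: large weights on low outcomes, small weights on high ones
        b = np.concatenate([hi_s[:k], lo_s[k:]])
        best_min = min(best_min, np.dot(b, y_s) / b.sum())
    return best_min, best_max
```

Repeating the same computation for the control arm and differencing gives an interval for the ATE; the interval collapses to the usual IPW point estimate at `gamma = 1` and widens monotonically as the allowed deviation grows.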
Related papers
- Model-free Methods for Event History Analysis and Efficient Adjustment (PhD Thesis) [55.2480439325792]
This thesis is a series of independent contributions to statistics unified by a model-free perspective.
The first chapter elaborates on how a model-free perspective can be used to formulate flexible methods that leverage prediction techniques from machine learning.
The second chapter studies the concept of local independence, which describes whether the evolution of one process is directly influenced by another.
arXiv Detail & Related papers (2025-02-11T19:24:09Z) - Causal Lifting of Neural Representations: Zero-Shot Generalization for Causal Inferences [56.23412698865433]
We focus on causal inferences on a target experiment with unlabeled factual outcomes, retrieved by a predictive model fine-tuned on a labeled similar experiment.
First, we show that factual outcome estimation via Empirical Risk Minimization (ERM) may fail to yield valid causal inferences on the target population.
We propose Deconfounded Empirical Risk Minimization (DERM), a new simple learning procedure minimizing the risk over a fictitious target population.
arXiv Detail & Related papers (2025-02-10T10:52:17Z) - Exogenous Matching: Learning Good Proposals for Tractable Counterfactual Estimation [1.9662978733004601]
We propose an importance sampling method for tractable and efficient estimation of counterfactual expressions.
By minimizing a common upper bound of counterfactual estimators, we transform the variance minimization problem into a conditional distribution learning problem.
We validate the theoretical results through experiments under various types and settings of Structural Causal Models (SCMs) and demonstrate superior performance on counterfactual estimation tasks.
arXiv Detail & Related papers (2024-10-17T03:08:28Z) - Conformal Diffusion Models for Individual Treatment Effect Estimation and Inference [6.406853903837333]
The individual treatment effect offers the most granular measure of a treatment's impact, at the level of a single subject.
We propose a novel conformal diffusion model-based approach that addresses those intricate challenges.
arXiv Detail & Related papers (2024-08-02T21:35:08Z) - Model Free Prediction with Uncertainty Assessment [7.524024486998338]
We propose a novel framework that transforms the deep estimation paradigm into a platform conducive to conditional mean estimation.
We develop an end-to-end convergence rate for the conditional diffusion model and establish the normality of the generated samples.
Through numerical experiments, we empirically validate the efficacy of our proposed methodology.
arXiv Detail & Related papers (2024-05-21T11:19:50Z) - Conformal Counterfactual Inference under Hidden Confounding [19.190396053530417]
Predicting potential outcomes along with their uncertainty in a counterfactual world poses a fundamental challenge in causal inference.
Existing methods that construct confidence intervals for counterfactuals typically rely on the assumption of strong ignorability.
We propose a novel approach based on transductive weighted conformal prediction, which provides confidence intervals for counterfactual outcomes with marginal coverage guarantees.
arXiv Detail & Related papers (2024-05-20T21:43:43Z) - Targeted Machine Learning for Average Causal Effect Estimation Using the Front-Door Functional [3.0232957374216953]
Evaluating the average causal effect (ACE) of a treatment on an outcome often involves overcoming the challenges posed by confounding factors in observational studies.
Here, we introduce novel estimation strategies for the front-door criterion based on the targeted minimum loss-based estimation theory.
We demonstrate the applicability of these estimators to analyze the effect of early stage academic performance on future yearly income.
arXiv Detail & Related papers (2023-12-15T22:04:53Z) - A Double Machine Learning Approach to Combining Experimental and Observational Data [59.29868677652324]
We propose a double machine learning approach to combine experimental and observational studies.
Our framework tests for violations of external validity and ignorability under milder assumptions.
arXiv Detail & Related papers (2023-07-04T02:53:11Z) - Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results conducted on multiple datasets offer compelling support for our theoretical assertions.
arXiv Detail & Related papers (2023-06-09T08:30:51Z) - Counterfactual Maximum Likelihood Estimation for Training Deep Networks [83.44219640437657]
Deep learning models are prone to learning spurious correlations that should not be learned as predictive clues.
We propose a causality-based training framework to reduce the spurious correlations caused by observable confounders.
We conduct experiments on two real-world tasks: Natural Language Inference (NLI) and Image Captioning.
arXiv Detail & Related papers (2021-06-07T17:47:16Z) - GenDICE: Generalized Offline Estimation of Stationary Values [108.17309783125398]
We show that effective estimation can still be achieved in important applications.
Our approach is based on estimating a ratio that corrects for the discrepancy between the stationary and empirical distributions.
The resulting algorithm, GenDICE, is straightforward and effective.
arXiv Detail & Related papers (2020-02-21T00:27:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences.