Inverting estimating equations for causal inference on quantiles
- URL: http://arxiv.org/abs/2401.00987v1
- Date: Tue, 2 Jan 2024 01:52:28 GMT
- Title: Inverting estimating equations for causal inference on quantiles
- Authors: Chao Cheng, Fan Li
- Abstract summary: We generalize a class of causal inference solutions from estimating the mean of the potential outcome to its quantiles.
A broad implication of our results is that one can rework existing results for mean causal estimands to facilitate causal inference on quantiles.
- Score: 9.216100284591636
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The causal inference literature frequently focuses on estimating the mean of
the potential outcome, whereas the quantiles of the potential outcome may carry
important additional information. We propose a universal approach, based on the
inverse estimating equations, to generalize a wide class of causal inference
solutions from estimating the mean of the potential outcome to its quantiles.
We assume that an identifying moment function is available to identify the mean
of the threshold-transformed potential outcome, based on which we propose a
convenient construction of the estimating equation for the quantiles of the
potential outcome. In addition, we give a general construction of the efficient
influence functions of the mean and quantiles of potential outcomes, and
identify their connection. We motivate estimators for the quantile estimands
with the efficient influence function, and develop their asymptotic properties
when either parametric models or data-adaptive machine learners are used to
estimate the nuisance functions. A broad implication of our results is that one
can rework existing results for mean causal estimands to facilitate causal
inference on quantiles, rather than starting from scratch. Our results are
illustrated by several examples.
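The core inversion idea can be sketched in code. The abstract's construction identifies the mean of the threshold-transformed outcome 1{Y <= q} and then solves for the q at which that mean equals the target level tau. The sketch below is illustrative only and is not taken from the paper: the `ipw_quantile` helper, the inverse-probability-weighting choice of moment function, and the toy data are all assumptions for the example.

```python
import numpy as np

def ipw_quantile(y, t, propensity, tau):
    """Estimate the tau-quantile of the potential outcome Y(1) by
    inverting an IPW estimating equation: find the smallest q with
    (1/n) * sum_i [T_i / e(X_i)] * 1{Y_i <= q} >= tau."""
    w = t / propensity                  # IPW weights (0 for untreated units)
    order = np.argsort(y)
    cdf = np.cumsum(w[order]) / len(y)  # weighted estimate of P(Y(1) <= q)
    idx = np.searchsorted(cdf, tau)     # first index where the CDF crosses tau
    return y[order][min(idx, len(y) - 1)]

# Toy randomized study: treatment assigned with known propensity 0.5,
# and Y(1) ~ N(0, 1), so the true median of Y(1) is 0.
rng = np.random.default_rng(0)
n = 20000
t = rng.binomial(1, 0.5, n)
y = rng.normal(0.0, 1.0, n)
q50 = ipw_quantile(y, t, np.full(n, 0.5), 0.5)
```

The same routine with a different tau recovers other quantiles of the potential outcome; swapping the IPW weights for another identifying moment function changes only the weighted CDF step, which is the sense in which existing mean-estimand machinery is reused.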
Related papers
- Bayesian Quantile Regression with Subset Selection: A Posterior Summarization Perspective [0.0]
Existing methods estimate conditional quantiles separately for each quantile of interest or estimate the entire conditional distribution using semi- or non-parametric models.
We pose the fundamental problems of linear quantile estimation, uncertainty quantification, and subset selection from a Bayesian decision analysis perspective.
Our approach introduces a quantile-focused squared error loss, which enables efficient, closed-form computing and maintains a close relationship with Wasserstein-based density estimation.
arXiv Detail & Related papers (2023-11-03T17:19:31Z)
- Postselection-free learning of measurement-induced quantum dynamics [0.0]
We introduce a general-purpose scheme that can be used to infer any property of the post-measurement ensemble of states.
As an immediate application, we show that our method can be used to verify the emergence of quantum state designs in experiments.
arXiv Detail & Related papers (2023-10-06T11:06:06Z)
- Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important in forecasting nonstationary processes or with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proved that this structured model can efficiently interpolate this tessellation and approximate the multiple hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z)
- Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results conducted on multiple datasets offer compelling support for our theoretical assertions.
arXiv Detail & Related papers (2023-06-09T08:30:51Z)
- Feature selection in stratification estimators of causal effects: lessons from potential outcomes, causal diagrams, and structural equations [0.456877715768796]
This approach clarifies the fundamental statistical phenomena underlying many widely-cited results.
Our exposition combines insights from three distinct methodological traditions for studying causal effect estimation.
arXiv Detail & Related papers (2022-09-23T04:20:50Z)
- Data-Driven Influence Functions for Optimization-Based Causal Inference [105.5385525290466]
We study a constructive algorithm that approximates Gateaux derivatives for statistical functionals by finite differencing.
We study the case where probability distributions are not known a priori but need to be estimated from data.
arXiv Detail & Related papers (2022-08-29T16:16:22Z)
- Semi-Supervised Quantile Estimation: Robust and Efficient Inference in High Dimensional Settings [0.07031569227782805]
We consider quantile estimation in a semi-supervised setting, characterized by two available data sets.
We propose a family of semi-supervised estimators for the response quantile(s) based on the two data sets.
arXiv Detail & Related papers (2022-01-25T10:02:23Z)
- BayesIMP: Uncertainty Quantification for Causal Data Fusion [52.184885680729224]
We study the causal data fusion problem, where datasets pertaining to multiple causal graphs are combined to estimate the average treatment effect of a target variable.
We introduce a framework which combines ideas from probabilistic integration and kernel mean embeddings to represent interventional distributions in the reproducing kernel Hilbert space.
arXiv Detail & Related papers (2021-06-07T10:14:18Z)
- Loss Bounds for Approximate Influence-Based Abstraction [81.13024471616417]
Influence-based abstraction aims to gain leverage by modeling local subproblems together with the 'influence' that the rest of the system exerts on them.
This paper investigates the performance of such approaches from a theoretical perspective.
We show that neural networks trained with cross entropy are well suited to learn approximate influence representations.
arXiv Detail & Related papers (2020-11-03T15:33:10Z)
- Machine learning for causal inference: on the use of cross-fit estimators [77.34726150561087]
Doubly-robust cross-fit estimators have been proposed to yield better statistical properties.
We conducted a simulation study to assess the performance of several estimators of the average causal effect (ACE).
When used with machine learning, the doubly-robust cross-fit estimators substantially outperformed all of the other estimators in terms of bias, variance, and confidence interval coverage.
arXiv Detail & Related papers (2020-04-21T23:09:55Z)
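The cross-fitting scheme behind these doubly-robust estimators can be sketched briefly: nuisance functions are fit on one fold and evaluated on the held-out fold, so that each observation's influence-function value uses nuisance estimates trained without it. The sketch below is an illustrative assumption rather than the cited paper's implementation: the `crossfit_aipw` helper assumes a known propensity score (as in a randomized study) and fits the outcome regressions by ordinary least squares.

```python
import numpy as np

def crossfit_aipw(x, t, y, e, n_folds=2, seed=0):
    """Cross-fit AIPW (doubly-robust) estimate of the average causal
    effect E[Y(1) - Y(0)] with a known propensity score e. Outcome
    regressions are fit by least squares on the training folds and
    evaluated only on the held-out fold."""
    rng = np.random.default_rng(seed)
    folds = rng.integers(0, n_folds, len(y))          # random fold labels
    xb = np.column_stack([np.ones(len(y)), x])        # add an intercept
    psi = np.empty(len(y))
    for k in range(n_folds):
        tr, te = folds != k, folds == k
        # Outcome models for treated and control, fit on training fold only
        b1, *_ = np.linalg.lstsq(xb[tr & (t == 1)], y[tr & (t == 1)], rcond=None)
        b0, *_ = np.linalg.lstsq(xb[tr & (t == 0)], y[tr & (t == 0)], rcond=None)
        m1, m0 = xb[te] @ b1, xb[te] @ b0
        # AIPW influence-function values on the held-out fold
        psi[te] = (m1 - m0
                   + t[te] * (y[te] - m1) / e[te]
                   - (1 - t[te]) * (y[te] - m0) / (1 - e[te]))
    return psi.mean()

# Toy randomized study with true ACE = 2
rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=(n, 1))
t = rng.binomial(1, 0.5, n)
y = 2.0 * t + x[:, 0] + rng.normal(0.0, 1.0, n)
ate = crossfit_aipw(x, t, y, np.full(n, 0.5))
</imports>

In practice the least-squares fits would be replaced by the data-adaptive machine learners the entry refers to; cross-fitting is what lets such learners be used without invalidating the estimator's asymptotic properties.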
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.