Causal discovery under a confounder blanket
- URL: http://arxiv.org/abs/2205.05715v1
- Date: Wed, 11 May 2022 18:10:45 GMT
- Title: Causal discovery under a confounder blanket
- Authors: David Watson and Ricardo Silva
- Abstract summary: Inferring causal relationships from observational data is rarely straightforward, but the problem is especially difficult in high dimensions.
We relax these assumptions and focus on an important but more specialized problem, namely recovering a directed acyclic subgraph.
We derive a complete algorithm for identifying causal relationships under these conditions and implement testing procedures.
- Score: 9.196779204457059
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inferring causal relationships from observational data is rarely
straightforward, but the problem is especially difficult in high dimensions.
For these applications, causal discovery algorithms typically require
parametric restrictions or extreme sparsity constraints. We relax these
assumptions and focus on an important but more specialized problem, namely
recovering a directed acyclic subgraph of variables known to be causally
descended from some (possibly large) set of confounding covariates, i.e. a
$\textit{confounder blanket}$. This is useful in many settings, for example
when studying a dynamic biomolecular subsystem with genetic data providing
causally relevant background information. Under a structural assumption that,
we argue, must be satisfied in practice if informative answers are to be found,
our method accommodates graphs of low or high sparsity while maintaining
polynomial time complexity. We derive a sound and complete algorithm for
identifying causal relationships under these conditions and implement testing
procedures with provable error control for linear and nonlinear systems. We
demonstrate our approach on a range of simulation settings.
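The adjustment step behind the confounder-blanket setting can be illustrated with a minimal linear-Gaussian sketch (an assumption for illustration only, not the authors' algorithm): regress the foreground variables on the known confounding covariates Z, then inspect which residual dependencies survive. A direct edge between foreground variables leaves residual dependence; a purely confounded pair does not.

```python
import numpy as np

def residualize(y, Z):
    """Regress y on confounders Z via OLS and return the residuals."""
    Zb = np.column_stack([np.ones(len(Z)), Z])      # add intercept
    beta, *_ = np.linalg.lstsq(Zb, y, rcond=None)
    return y - Zb @ beta

rng = np.random.default_rng(0)
n = 5000
Z = rng.normal(size=(n, 3))                         # confounder blanket
X1 = Z @ np.array([1.0, -0.5, 0.3]) + rng.normal(size=n)
X2 = 0.8 * X1 + Z @ np.array([0.4, 0.2, -0.7]) + rng.normal(size=n)
X3 = Z @ np.array([0.6, 0.1, 0.5]) + rng.normal(size=n)   # no edge to X1

r1, r2, r3 = (residualize(v, Z) for v in (X1, X2, X3))
# X1 -> X2 leaves residual dependence; X1 and X3 decouple after adjustment
print(abs(np.corrcoef(r1, r2)[0, 1]), abs(np.corrcoef(r1, r3)[0, 1]))
```

The paper's actual procedure handles nonlinear systems with provable error control; this sketch only shows why conditioning on the blanket separates spurious from direct dependence.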
Related papers
- On the Identification of Temporally Causal Representation with Instantaneous Dependence [50.14432597910128]
Temporally causal representation learning aims to identify the latent causal process from time series observations.
Most methods require the assumption that the latent causal processes do not have instantaneous relations.
We propose an IDentification framework for instantaneOus Latent dynamics (IDOL).
arXiv Detail & Related papers (2024-05-24T08:08:05Z)
- Hybrid Top-Down Global Causal Discovery with Local Search for Linear and Nonlinear Additive Noise Models [2.0738462952016232]
Methods based on functional causal models can identify a unique graph, but suffer from the curse of dimensionality or impose strong parametric assumptions.
We propose a novel hybrid approach for global causal discovery in observational data that leverages local causal substructures.
We provide theoretical guarantees for correctness and worst-case time complexities, with empirical validation on synthetic data.
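A generic additive-noise-model direction test (not the paper's hybrid algorithm, just the family of functional causal models it builds on) can be sketched as follows: fit the effect as a function of the putative cause and check whether the residuals are independent of it, here using sample distance correlation as the dependence measure.

```python
import numpy as np

def dcor(x, y):
    """Sample distance correlation; near 0 for independent variables."""
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    return np.sqrt((A * B).mean() / np.sqrt((A * A).mean() * (B * B).mean()))

def anm_residuals(cause, effect, deg=3):
    """Fit effect = poly(cause) + e and return the residuals e."""
    coef = np.polyfit(cause, effect, deg)
    return effect - np.polyval(coef, cause)

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 400)
y = x ** 3 + rng.uniform(-1, 1, 400)     # true direction: x -> y

forward = dcor(x, anm_residuals(x, y))    # residuals ~ independent of x
backward = dcor(y, anm_residuals(y, x))   # residuals depend on y
print(forward < backward)                  # prefer the x -> y orientation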
arXiv Detail & Related papers (2024-05-23T12:28:16Z)
- Heteroscedastic Causal Structure Learning [2.566492438263125]
We tackle the heteroscedastic causal structure learning problem under Gaussian noise.
By exploiting the normality of the causal mechanisms, we can recover a valid causal ordering.
The result is HOST (Heteroscedastic causal STructure learning), a simple yet effective causal structure learning algorithm.
arXiv Detail & Related papers (2023-07-16T07:53:16Z)
- Causal discovery for time series with constraint-based model and PMIME measure [0.0]
We present a novel approach for discovering causality in time series data that combines a causal discovery algorithm with an information-theoretic measure.
We evaluate the performance of our approach on several simulated data sets, showing promising results.
arXiv Detail & Related papers (2023-05-31T09:38:50Z)
- Improving Efficiency and Accuracy of Causal Discovery Using a Hierarchical Wrapper [7.570246812206772]
Causal discovery from observational data is an important tool in many branches of science.
In the large sample limit, sound and complete causal discovery algorithms have been previously introduced.
However, only finite training data is available, which limits the power of statistical tests used by these algorithms.
arXiv Detail & Related papers (2021-07-11T09:24:49Z)
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
- Deconfounded Score Method: Scoring DAGs with Dense Unobserved Confounding [101.35070661471124]
We show that unobserved confounding leaves a characteristic footprint in the observed data distribution that allows for disentangling spurious and causal effects.
We propose an adjusted score-based causal discovery algorithm that may be implemented with general-purpose solvers and scales to high-dimensional problems.
arXiv Detail & Related papers (2021-03-28T11:07:59Z)
- Disentangling Observed Causal Effects from Latent Confounders using Method of Moments [67.27068846108047]
We provide guarantees on identifiability and learnability under mild assumptions.
We develop efficient algorithms based on coupled tensor decomposition with linear constraints to obtain scalable and guaranteed solutions.
arXiv Detail & Related papers (2021-01-17T07:48:45Z)
- Causal Expectation-Maximisation [70.45873402967297]
We show that causal inference is NP-hard even in models characterised by polytree-shaped graphs.
We introduce the causal EM algorithm to reconstruct the uncertainty about the latent variables from data about categorical manifest variables.
We argue that there appears to be an unnoticed limitation to the trending idea that counterfactual bounds can often be computed without knowledge of the structural equations.
arXiv Detail & Related papers (2020-11-04T10:25:13Z)
- Neural Additive Vector Autoregression Models for Causal Discovery in Time Series [1.160208922584163]
We propose a neural approach to causal structure learning that can discover nonlinear relationships.
We train deep neural networks that extract the (additive) Granger causal influences from the time evolution in time series.
The method achieves state-of-the-art results on various benchmark data sets for causal discovery.
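The (additive) Granger-causal notion the paper generalizes can be sketched with a plain linear VAR baseline (an illustrative stand-in, not the neural model): x Granger-causes y if lagged values of x reduce the prediction error for y beyond y's own lags.

```python
import numpy as np

def lag_matrix(series, p):
    """Stack lags 1..p of a 1-D series as regressor columns."""
    n = len(series)
    return np.column_stack([series[p - k - 1 : n - k - 1] for k in range(p)])

def granger_gain(x, y, p=2):
    """Relative drop in y's RSS when lagged x joins lagged y."""
    Y = y[p:]
    own = lag_matrix(y, p)
    both = np.column_stack([own, lag_matrix(x, p)])
    def rss(X):
        X = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.sum((Y - X @ beta) ** 2)
    return 1 - rss(both) / rss(own)

rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()

print(granger_gain(x, y) > granger_gain(y, x))   # x drives y, not vice versa
```

The neural additive models in the paper replace the linear lag regressions with per-series deep networks, but the comparison of predictive gains is the same idea.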
arXiv Detail & Related papers (2020-10-19T12:44:25Z)
- Computational Barriers to Estimation from Low-Degree Polynomials [81.67886161671379]
We study the power of low-degree polynomials for the task of detecting the presence of hidden structures.
For a large class of "signal plus noise" problems, we give a user-friendly lower bound for the best possible mean squared error achievable by any estimator of bounded degree.
As applications, we give a tight characterization of the low-degree minimum mean squared error for the planted submatrix and planted dense subgraph problems.
arXiv Detail & Related papers (2020-08-05T17:52:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.