Causal Feature Selection via Orthogonal Search
- URL: http://arxiv.org/abs/2007.02938v3
- Date: Sat, 17 Sep 2022 01:10:10 GMT
- Title: Causal Feature Selection via Orthogonal Search
- Authors: Ashkan Soleymani, Anant Raj, Stefan Bauer, Bernhard Schölkopf and
Michel Besserve
- Abstract summary: We study a one-vs.-the-rest feature selection approach to discover the direct causal parent of a response variable.
We propose an algorithm that works for purely observational data while also offering theoretical guarantees.
- Score: 30.120592913076198
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The problem of inferring the direct causal parents of a response variable
among a large set of explanatory variables is of high practical importance in
many disciplines. However, established approaches often scale at least
exponentially with the number of explanatory variables and are difficult to
extend to nonlinear relationships and to cyclic data. Inspired by debiased
machine learning methods, we study a
one-vs.-the-rest feature selection approach to discover the direct causal
parent of the response. We propose an algorithm that works for purely
observational data while also offering theoretical guarantees, including the
case of partially nonlinear relationships, possibly in the presence of cycles.
As it requires only one estimation per variable, our approach is
applicable even to large graphs. We demonstrate significant improvements
compared to established approaches.
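To make the one-vs.-the-rest idea concrete, below is a minimal, hypothetical sketch in the spirit of double/debiased machine learning: for each candidate variable, the remaining variables are partialled out of both the candidate and the response with cross-fitted regressions, and a residual-on-residual test decides whether the candidate is kept as a possible direct causal parent. The function name, the random-forest nuisance learners, the robust standard error, and the significance threshold are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of one-vs.-the-rest orthogonal feature selection
# (double/debiased-ML style); not the authors' code.
import numpy as np
from scipy import stats
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict


def orthogonal_parent_scores(X, y, alpha=0.05, n_splits=5, random_state=0):
    """For each candidate X_j (X assumed to have at least two columns),
    partial out the remaining variables from both X_j and y with
    cross-fitted regressions, then test the residual-on-residual
    association. Variables with a significant debiased coefficient are
    returned as candidate direct causal parents."""
    n, d = X.shape
    selected, pvalues = [], np.empty(d)
    for j in range(d):
        X_rest = np.delete(X, j, axis=1)
        # Cross-fitted nuisance estimates to reduce overfitting bias.
        y_hat = cross_val_predict(
            RandomForestRegressor(random_state=random_state), X_rest, y, cv=n_splits
        )
        xj_hat = cross_val_predict(
            RandomForestRegressor(random_state=random_state), X_rest, X[:, j], cv=n_splits
        )
        u, v = y - y_hat, X[:, j] - xj_hat        # orthogonalized residuals
        theta = (v @ u) / (v @ v)                 # debiased effect of X_j on y
        resid = u - theta * v
        se = np.sqrt(v**2 @ resid**2) / (v @ v)   # heteroskedasticity-robust SE
        pvalues[j] = 2 * stats.norm.sf(abs(theta) / se)
        if pvalues[j] < alpha:
            selected.append(j)
    return selected, pvalues
```

The loop performs one orthogonalized estimate per candidate variable, so the total number of nuisance fits grows only linearly with the number of variables, in line with the scalability claim above.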
Related papers
- Learning Directed Acyclic Graphs from Partial Orderings [9.387234607473054]
Directed acyclic graphs (DAGs) are commonly used to model causal relationships among random variables.
In this paper, we consider the intermediate problem of learning DAGs when a partial causal ordering of variables is available.
We propose a general estimation framework for leveraging the partial ordering and present efficient estimation algorithms for low- and high-dimensional problems.
arXiv Detail & Related papers (2024-03-24T06:14:50Z)
- Learning Linear Causal Representations from Interventions under General Nonlinear Mixing [52.66151568785088]
We prove strong identifiability results given unknown single-node interventions without access to the intervention targets.
This is the first instance of causal identifiability from non-paired interventions for deep neural network embeddings.
arXiv Detail & Related papers (2023-06-04T02:32:12Z)
- Nonparametric Identifiability of Causal Representations from Unknown Interventions [63.1354734978244]
We study causal representation learning, the task of inferring latent causal variables and their causal relations from mixtures of the variables.
Our goal is to identify both the ground truth latents and their causal graph up to a set of ambiguities which we show to be irresolvable from interventional data.
arXiv Detail & Related papers (2023-06-01T10:51:58Z)
- Causal discovery under a confounder blanket [9.196779204457059]
Inferring causal relationships from observational data is rarely straightforward, but the problem is especially difficult in high dimensions.
We relax these assumptions and focus on an important but more specialized problem, namely recovering a directed acyclic subgraph.
We derive a complete algorithm for identifying causal relationships under these conditions and implement testing procedures.
arXiv Detail & Related papers (2022-05-11T18:10:45Z)
- Variational Causal Networks: Approximate Bayesian Inference over Causal Structures [132.74509389517203]
We introduce a parametric variational family modelled by an autoregressive distribution over the space of discrete DAGs.
In experiments, we demonstrate that the proposed variational posterior is able to provide a good approximation of the true posterior.
arXiv Detail & Related papers (2021-06-14T17:52:49Z)
- Deconfounded Score Method: Scoring DAGs with Dense Unobserved Confounding [101.35070661471124]
We show that unobserved confounding leaves a characteristic footprint in the observed data distribution that allows for disentangling spurious and causal effects.
We propose an adjusted score-based causal discovery algorithm that may be implemented with general-purpose solvers and scales to high-dimensional problems.
arXiv Detail & Related papers (2021-03-28T11:07:59Z)
- Nonlinear Invariant Risk Minimization: A Causal Approach [5.63479133344366]
We propose a learning paradigm that enables out-of-distribution generalization in the nonlinear setting.
We show identifiability of the data representation up to very simple transformations.
Extensive experiments on both synthetic and real-world datasets show that our approach significantly outperforms a variety of baseline methods.
arXiv Detail & Related papers (2021-02-24T15:38:41Z)
- Disentangling Observed Causal Effects from Latent Confounders using Method of Moments [67.27068846108047]
We provide guarantees on identifiability and learnability under mild assumptions.
We develop efficient algorithms based on coupled tensor decomposition with linear constraints to obtain scalable and guaranteed solutions.
arXiv Detail & Related papers (2021-01-17T07:48:45Z)
- Differentiable Causal Discovery from Interventional Data [141.41931444927184]
We propose a theoretically-grounded method based on neural networks that can leverage interventional data.
We show that our approach compares favorably to the state of the art in a variety of settings.
arXiv Detail & Related papers (2020-07-03T15:19:17Z)