Hybrid Top-Down Global Causal Discovery with Local Search for Linear and Nonlinear Additive Noise Models
- URL: http://arxiv.org/abs/2405.14496v4
- Date: Thu, 07 Nov 2024 22:37:35 GMT
- Title: Hybrid Top-Down Global Causal Discovery with Local Search for Linear and Nonlinear Additive Noise Models
- Authors: Sujai Hiremath, Jacqueline R. M. A. Maasch, Mengxiao Gao, Promit Ghosal, Kyra Gan
- Abstract summary: Methods based on functional causal models can identify a unique graph, but either suffer from the curse of dimensionality or impose strong parametric assumptions.
We propose a novel hybrid approach for global causal discovery in observational data that leverages local causal substructures.
We provide theoretical guarantees for correctness and worst-case polynomial time complexities, with empirical validation on synthetic data.
- Abstract: Learning the unique directed acyclic graph corresponding to an unknown causal model is a challenging task. Methods based on functional causal models can identify a unique graph, but either suffer from the curse of dimensionality or impose strong parametric assumptions. To address these challenges, we propose a novel hybrid approach for global causal discovery in observational data that leverages local causal substructures. We first present a topological sorting algorithm that leverages ancestral relationships in linear structural equation models to establish a compact top-down hierarchical ordering, encoding more causal information than linear orderings produced by existing methods. We demonstrate that this approach generalizes to nonlinear settings with arbitrary noise. We then introduce a nonparametric constraint-based algorithm that prunes spurious edges by searching for local conditioning sets, achieving greater accuracy than current methods. We provide theoretical guarantees for correctness and worst-case polynomial time complexities, with empirical validation on synthetic data.
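The abstract describes a two-stage pipeline: first build a compact top-down ordering of the variables, then prune spurious edges with local conditional-independence checks. The sketch below is not the authors' algorithm; it only illustrates the same order-then-prune idea under a strong simplifying assumption (a linear Gaussian SEM with equal noise variances, where a greedy conditional-variance rule recovers a topological order) and uses plain regression coefficients as a crude stand-in for the paper's local conditioning-set search. All function names and thresholds are illustrative.

```python
# Toy "order first, then prune" sketch. NOT the paper's algorithm: it assumes
# a linear Gaussian SEM with equal noise variances and uses regression-based
# pruning instead of the paper's nonparametric local conditioning-set search.
import numpy as np

def topological_order(X):
    """Greedy top-down ordering: repeatedly pick the variable with the
    smallest conditional variance given the variables already ordered
    (valid under the equal-noise-variance assumption)."""
    n, d = X.shape
    order, remaining = [], list(range(d))
    while remaining:
        best, best_var = None, np.inf
        for j in remaining:
            if order:
                Z = X[:, order]
                beta, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
                resid = X[:, j] - Z @ beta
            else:
                resid = X[:, j] - X[:, j].mean()
            v = resid.var()
            if v < best_var:
                best, best_var = j, v
        order.append(best)
        remaining.remove(best)
    return order

def prune_edges(X, order, thresh=0.1):
    """Keep edge i -> j only if X_i gets a non-negligible coefficient when
    X_j is regressed on all of its predecessors in the ordering."""
    d = X.shape[1]
    A = np.zeros((d, d), dtype=int)
    for pos, j in enumerate(order):
        preds = order[:pos]
        if not preds:
            continue
        Z = X[:, preds]
        beta, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
        for coef, i in zip(beta, preds):
            if abs(coef) > thresh:
                A[i, j] = 1
    return A

# Toy data from the chain 0 -> 1 -> 2 with equal noise variances.
rng = np.random.default_rng(0)
n = 5000
x0 = rng.normal(size=n)
x1 = 0.8 * x0 + rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)
X = np.column_stack([x0, x1, x2])
X -= X.mean(axis=0)

order = topological_order(X)
print("estimated order:", order)          # expected: [0, 1, 2]
print("estimated edges:\n", prune_edges(X, order))
```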
Related papers
- Induced Covariance for Causal Discovery in Linear Sparse Structures [55.2480439325792]
Causal models seek to unravel the cause-effect relationships among variables from observed data.
This paper introduces a novel causal discovery algorithm designed for settings in which variables exhibit linearly sparse relationships.
arXiv Detail & Related papers (2024-10-02T04:01:38Z)
- Kernel-Based Differentiable Learning of Non-Parametric Directed Acyclic Graphical Models [17.52142371968811]
Causal discovery amounts to learning a directed acyclic graph (DAG) that encodes a causal model.
Recent research has sought to bypass the search by reformulating causal discovery as a continuous optimization problem.
arXiv Detail & Related papers (2024-08-20T16:09:40Z)
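The entry above mentions reformulating causal discovery as a continuous optimization problem. A common ingredient in such reformulations (e.g., NOTEARS-style methods) is a differentiable acyclicity function h(W) = tr(exp(W ∘ W)) − d, which is zero exactly when the weighted adjacency matrix W encodes a DAG. The snippet below only illustrates that generic characterization, not the kernel-based method of the cited paper.

```python
# Differentiable acyclicity check used by NOTEARS-style continuous DAG
# learners: h(W) = tr(exp(W * W)) - d equals 0 iff W has no directed cycles.
# Generic illustration only, not the cited paper's kernel-based method.
import numpy as np
from scipy.linalg import expm

def acyclicity(W):
    d = W.shape[0]
    return np.trace(expm(W * W)) - d   # elementwise square keeps entries >= 0

dag    = np.array([[0.0, 1.5], [0.0, 0.0]])   # 0 -> 1 only
cyclic = np.array([[0.0, 1.5], [0.7, 0.0]])   # 0 -> 1 and 1 -> 0

print(acyclicity(dag))      # ~0: acyclic
print(acyclicity(cyclic))   # > 0: penalizes the 2-cycle
```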
- Global Convergence of Online Identification for Mixed Linear Regression [1.9295130374196499]
Mixed linear regression (MLR) is a powerful model for characterizing nonlinear relationships.
This paper investigates the online identification and data clustering problems for two basic classes of MLRs.
It introduces two corresponding new online identification algorithms based on the expectation-maximization principle.
arXiv Detail & Related papers (2023-11-30T12:30:42Z)
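The entry above refers to identification algorithms for mixed linear regression built on the expectation-maximization principle. As a generic batch illustration of EM for a two-component MLR with a known noise scale, one might write something like the sketch below; it is not the cited paper's online procedure, and all names are illustrative.

```python
# Generic batch EM for a two-component mixed linear regression
# y = x . beta_k + noise, k in {0, 1}. Illustrative only; the cited paper
# studies *online* identification, which this sketch does not implement.
import numpy as np

def em_mlr(X, y, n_iter=100, sigma=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    betas = rng.normal(size=(2, d))          # random init for both components
    pi = np.array([0.5, 0.5])                # mixing weights
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample.
        resid = y[:, None] - X @ betas.T                     # shape (n, 2)
        logp = -0.5 * (resid / sigma) ** 2 + np.log(pi)      # unnormalized log-lik
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares per component, then mixing weights.
        for k in range(2):
            w = r[:, k]
            betas[k] = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        pi = r.mean(axis=0)
    return betas, pi

# Toy data: half the samples follow beta = [2, -1], the other half [-1, 3].
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 2))
z = rng.integers(0, 2, size=2000)
true = np.array([[2.0, -1.0], [-1.0, 3.0]])
y = np.einsum("ij,ij->i", X, true[z]) + 0.5 * rng.normal(size=2000)

betas, pi = em_mlr(X, y)
print(np.round(betas, 2), np.round(pi, 2))   # recovers both betas up to label swap
```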
- Amortized Inference for Causal Structure Learning [72.84105256353801]
Learning causal structure poses a search problem that typically involves evaluating structures using a score or independence test.
We train a variational inference model to predict the causal structure from observational/interventional data.
Our models exhibit robust generalization capabilities under substantial distribution shift.
arXiv Detail & Related papers (2022-05-25T17:37:08Z)
- Score matching enables causal discovery of nonlinear additive noise models [63.93669924730725]
We show how to design a new generation of scalable causal discovery methods.
We propose a new efficient method for approximating the score's Jacobian, enabling recovery of the causal graph.
arXiv Detail & Related papers (2022-03-08T21:34:46Z)
- Sequential Learning of the Topological Ordering for the Linear Non-Gaussian Acyclic Model with Parametric Noise [6.866717993664787]
We develop a novel sequential approach to estimate the causal ordering of a DAG.
We provide extensive numerical evidence to demonstrate that our procedure is scalable to cases with possibly thousands of nodes.
arXiv Detail & Related papers (2022-02-03T18:15:48Z)
- A Priori Denoising Strategies for Sparse Identification of Nonlinear Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
- Random Feature Approximation for Online Nonlinear Graph Topology Identification [7.992550355579789]
We propose a kernel-based algorithm for graph topology estimation.
We exploit the fact that real-world networks often exhibit sparse topologies.
The experiments conducted on real and synthetic data show that the proposed method outperforms its competitors.
arXiv Detail & Related papers (2021-10-19T12:48:12Z)
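The entry above relies on kernel methods made scalable through random feature approximation. The standard random Fourier feature construction of Rahimi and Recht approximates a Gaussian kernel by an explicit finite-dimensional map; the sketch below shows that construction in isolation, not the paper's online graph-topology estimator.

```python
# Random Fourier features (Rahimi & Recht): approximate a Gaussian kernel
# k(x, y) = exp(-||x - y||^2 / (2 sigma^2)) with an explicit map z(x) such
# that z(x) . z(y) ~ k(x, y). Shown in isolation; this is not the cited
# paper's online graph-topology estimator.
import numpy as np

def rff_map(X, n_features=2000, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)       # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))
Z = rff_map(X)

exact = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / 2.0)
approx = Z @ Z.T
print(np.abs(exact - approx).max())   # small approximation error
```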
- Partial Counterfactual Identification from Observational and Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z)
- Regularization of Mixture Models for Robust Principal Graph Learning [0.0]
A regularized version of Mixture Models is proposed to learn a principal graph from a distribution of $D$-dimensional data points.
Parameters of the model are iteratively estimated through an Expectation-Maximization procedure.
arXiv Detail & Related papers (2021-06-16T18:00:02Z)
- Learning Mixtures of Low-Rank Models [89.39877968115833]
We study the problem of learning mixtures of low-rank models.
We develop an algorithm that is guaranteed to recover the unknown matrices with near-optimal sample complexity.
In addition, the proposed algorithm is provably stable against random noise.
arXiv Detail & Related papers (2020-09-23T17:53:48Z)