NODAGS-Flow: Nonlinear Cyclic Causal Structure Learning
- URL: http://arxiv.org/abs/2301.01849v1
- Date: Wed, 4 Jan 2023 23:28:18 GMT
- Title: NODAGS-Flow: Nonlinear Cyclic Causal Structure Learning
- Authors: Muralikrishnna G. Sethuraman, Romain Lopez, Rahul Mohan, Faramarz Fekri, Tommaso Biancalani, Jan-Christian Hütter
- Abstract summary: We propose a novel framework for learning nonlinear cyclic causal models from interventional data, called NODAGS-Flow.
We show significant performance improvements with our approach compared to state-of-the-art methods with respect to structure recovery and predictive performance.
- Score: 8.20217860574125
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning causal relationships between variables is a well-studied problem in
statistics, with many important applications in science. However, modeling
real-world systems remains challenging, as most existing algorithms assume that
the underlying causal graph is acyclic. While this is a convenient framework
for developing theory about causal reasoning and inference,
the underlying modeling assumption is likely to be violated in real systems,
because feedback loops are common (e.g., in biological systems). Although a few
methods search for cyclic causal models, they usually rely on some form of
linearity, which is also limiting, or lack a clear underlying probabilistic
model. In this work, we propose a novel framework for learning nonlinear cyclic
causal graphical models from interventional data, called NODAGS-Flow. We
perform inference via direct likelihood optimization, employing techniques from
residual normalizing flows for likelihood estimation. Through synthetic
experiments and an application to single-cell high-content perturbation
screening data, we show significant performance improvements with our approach
compared to state-of-the-art methods with respect to structure recovery and
predictive performance.
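The core modeling idea in the abstract — treating a cyclic structural equation model x = f(x) + e with a contractive mechanism f, and evaluating the likelihood via the change of variables used in residual normalizing flows — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the mechanism f(x) = W tanh(x), the spectral-norm scaling, and the exact log-determinant (where NODAGS-Flow uses a power-series estimator) are all simplifying assumptions for small dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

# Hypothetical nonlinear cyclic mechanism f(x) = W @ tanh(x).
# Scaling W so its spectral norm is below 1 makes f a contraction,
# so the cyclic system x = f(x) + e has a unique solution for each e.
W = rng.normal(size=(d, d))
W *= 0.9 / np.linalg.norm(W, 2)

def f(x):
    return W @ np.tanh(x)

def sample(e, iters=400):
    """Solve the fixed point x = f(x) + e by Banach iteration."""
    x = np.zeros_like(e)
    for _ in range(iters):
        x = f(x) + e
    return x

def log_likelihood(x):
    """log p(x) = log N(x - f(x); 0, I) + log|det(I - J_f(x))|.

    Exact log-det is fine for small d; residual-flow methods such as
    NODAGS-Flow instead estimate it with a truncated power series.
    """
    e = x - f(x)
    log_pe = -0.5 * (e @ e + d * np.log(2 * np.pi))
    J = W * (1 - np.tanh(x) ** 2)   # J_f(x) = W @ diag(tanh'(x))
    sign, logdet = np.linalg.slogdet(np.eye(d) - J)
    return log_pe + logdet

e = rng.normal(size=d)
x = sample(e)
print(np.allclose(x - f(x), e))   # the noise is recovered from x
print(log_likelihood(x))
```

In a learning setting, W (or a neural network in its place) would be trained by maximizing this log-likelihood over observational and interventional data, with sparsity regularization to recover the graph structure.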
Related papers
- No Equations Needed: Learning System Dynamics Without Relying on Closed-Form ODEs [56.78271181959529]
This paper proposes a conceptual shift to modeling low-dimensional dynamical systems by departing from the traditional two-step modeling process.
Instead of first discovering a closed-form equation and then analyzing it, our approach, direct semantic modeling, predicts the semantic representation of the dynamical system.
Our approach not only simplifies the modeling pipeline but also enhances the transparency and flexibility of the resulting models.
arXiv Detail & Related papers (2025-01-30T18:36:48Z) - Learning Structural Causal Models from Ordering: Identifiable Flow Models [19.99352354910655]
We introduce a set of flow models that can recover component-wise, invertible transformation of variables.
We propose design improvements that enable simultaneous learning of all causal mechanisms.
Our method achieves a significant reduction in computational time compared to existing diffusion-based techniques.
arXiv Detail & Related papers (2024-12-13T04:25:56Z) - Differentiable Causal Discovery For Latent Hierarchical Causal Models [19.373348700715578]
We present new theoretical results on the identifiability of nonlinear latent hierarchical causal models.
We develop a novel differentiable causal discovery algorithm that efficiently estimates the structure of such models.
arXiv Detail & Related papers (2024-11-29T09:08:20Z) - Learning Cyclic Causal Models from Incomplete Data [13.69726643902085]
We propose a novel framework, named MissNODAGS, for learning cyclic causal graphs from partially missing data.
Under the additive noise model, MissNODAGS learns the causal graph by alternating between imputing the missing data and maximizing the expected log-likelihood of the visible part of the data.
We demonstrate improved performance when compared to using state-of-the-art imputation techniques followed by causal learning on partially missing interventional data.
arXiv Detail & Related papers (2024-02-23T22:03:12Z) - A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error for overparameterized models achieving effectively zero training error depends on the quality of the implicit regularization imposed by e.g. the combination of model, parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z) - SLEM: Machine Learning for Path Modeling and Causal Inference with Super Learner Equation Modeling [3.988614978933934]
Causal inference is a crucial goal of science, enabling researchers to arrive at meaningful conclusions using observational data.
Path models, Structural Equation Models (SEMs) and Directed Acyclic Graphs (DAGs) provide a means to unambiguously specify assumptions regarding the causal structure underlying a phenomenon.
We propose Super Learner Equation Modeling, a path modeling technique integrating machine learning Super Learner ensembles.
arXiv Detail & Related papers (2023-08-08T16:04:42Z) - Advancing Counterfactual Inference through Nonlinear Quantile Regression [77.28323341329461]
We propose a framework for efficient and effective counterfactual inference implemented with neural networks.
The proposed approach enhances the capacity to generalize estimated counterfactual outcomes to unseen data.
Empirical results conducted on multiple datasets offer compelling support for our theoretical assertions.
arXiv Detail & Related papers (2023-06-09T08:30:51Z) - Estimation of Bivariate Structural Causal Models by Variational Gaussian Process Regression Under Likelihoods Parametrised by Normalising Flows [74.85071867225533]
Causal mechanisms can be described by structural causal models.
One major drawback of state-of-the-art artificial intelligence is its lack of explainability.
arXiv Detail & Related papers (2021-09-06T14:52:58Z) - Learning Neural Causal Models with Active Interventions [83.44636110899742]
We introduce an active intervention-targeting mechanism which enables a quick identification of the underlying causal structure of the data-generating process.
Our method significantly reduces the required number of interactions compared with random intervention targeting.
We demonstrate superior performance on multiple benchmarks from simulated to real-world data.
arXiv Detail & Related papers (2021-09-06T13:10:37Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of its stochasticity in that success is still unclear.
We show that multiplicative noise commonly arises in the parameter updates and can induce heavy-tailed behavior.
A detailed analysis describes how key factors, including step size and data, shape this behavior, with consistent results on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - Structure Learning for Cyclic Linear Causal Models [5.567377163246147]
We consider the problem of structure learning for linear causal models based on observational data.
We treat models given by possibly cyclic mixed graphs, which allow for feedback loops and effects of latent confounders.
arXiv Detail & Related papers (2020-06-10T17:47:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.