DAG-WGAN: Causal Structure Learning With Wasserstein Generative
Adversarial Networks
- URL: http://arxiv.org/abs/2204.00387v1
- Date: Fri, 1 Apr 2022 12:27:27 GMT
- Title: DAG-WGAN: Causal Structure Learning With Wasserstein Generative
Adversarial Networks
- Authors: Hristo Petkov, Colin Hanley and Feng Dong
- Abstract summary: This paper proposes DAG-WGAN, which combines a Wasserstein-based adversarial loss and an auto-encoder architecture with an acyclicity constraint.
It simultaneously learns causal structures and improves its data generation capability by leveraging the strength of the Wasserstein distance metric.
Our experiments evaluate DAG-WGAN against the state of the art and demonstrate that it performs well.
- Score: 2.492300648514129
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The combinatorial search space presents a significant challenge to learning
causality from data. Recently, the problem has been formulated into a
continuous optimization framework with an acyclicity constraint, allowing for
the exploration of deep generative models to better capture data sample
distributions and support the discovery of Directed Acyclic Graphs (DAGs) that
faithfully represent the underlying data distribution. However, so far no study
has investigated the use of Wasserstein distance for causal structure learning
via generative models. This paper proposes a new model, DAG-WGAN, which
combines a Wasserstein-based adversarial loss and an auto-encoder architecture
with an acyclicity constraint. DAG-WGAN simultaneously learns causal
structures and improves its data generation capability by leveraging the
strength of the Wasserstein distance metric. Compared with other models, it
scales well and handles both continuous and discrete data. Our experiments
evaluate DAG-WGAN against the state of the art and demonstrate that it
performs well.
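The abstract does not spell out the acyclicity constraint it refers to. A common choice in this line of work (e.g. the polynomial form used by DAG-GNN-style auto-encoders) is h(W) = tr[(I + W∘W/d)^d] − d, which is zero exactly when the weighted adjacency matrix W encodes a DAG. A minimal NumPy sketch of that constraint, as an illustration rather than the paper's exact formulation:

```python
import numpy as np

def acyclicity_penalty(W: np.ndarray) -> float:
    """Polynomial acyclicity measure: h(W) = tr[(I + W*W/d)^d] - d.

    h(W) == 0 exactly when the weighted adjacency matrix W encodes a
    DAG; it grows with the strength of any cycles present.
    """
    d = W.shape[0]
    M = np.eye(d) + (W * W) / d  # elementwise square keeps the penalty >= 0
    return float(np.trace(np.linalg.matrix_power(M, d)) - d)

# The chain 0 -> 1 -> 2 is acyclic, so the penalty vanishes ...
dag = np.array([[0., 1., 0.],
                [0., 0., 1.],
                [0., 0., 0.]])
# ... while adding the edge 2 -> 0 closes a 3-cycle.
cyclic = dag.copy()
cyclic[2, 0] = 1.

print(acyclicity_penalty(dag))     # 0.0
print(acyclicity_penalty(cyclic))  # positive: a cycle is present
```

Optimizers in this framework add h(W) as a penalty (or Lagrangian) term, driving the learned adjacency matrix toward a valid DAG during training.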
Related papers
- SeaDAG: Semi-autoregressive Diffusion for Conditional Directed Acyclic Graph Generation [83.52157311471693]
We introduce SeaDAG, a semi-autoregressive diffusion model for conditional generation of Directed Acyclic Graphs (DAGs)
Unlike conventional autoregressive generation that lacks a global graph structure view, our method maintains a complete graph structure at each diffusion step.
We explicitly train the model to learn graph conditioning with a condition loss, which enhances the diffusion model's capacity to generate realistic DAGs.
arXiv Detail & Related papers (2024-10-21T15:47:03Z)
- Tree Search in DAG Space with Model-based Reinforcement Learning for Causal Discovery [6.772856304452474]
CD-UCT is a model-based reinforcement learning method for causal discovery based on tree search.
We formalize and prove the correctness of an efficient algorithm for excluding edges that would introduce cycles.
The proposed method can be applied broadly to causal Bayesian networks with both discrete and continuous random variables.
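The edge-exclusion step this entry describes can be illustrated with a simple reachability test: adding the directed edge u → v to a DAG creates a cycle exactly when u is already reachable from v. A hypothetical sketch of that check (not the paper's CD-UCT algorithm itself):

```python
from collections import defaultdict

def would_create_cycle(edges, u, v):
    """Return True if adding directed edge u -> v would close a cycle,
    i.e. if u is already reachable from v via the existing edges."""
    adj = defaultdict(list)
    for a, b in edges:
        adj[a].append(b)
    stack, seen = [v], set()
    while stack:  # iterative DFS starting from v
        node = stack.pop()
        if node == u:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(adj[node])
    return False

edges = [(0, 1), (1, 2)]                # chain 0 -> 1 -> 2
print(would_create_cycle(edges, 2, 0))  # True: closes 0 -> 1 -> 2 -> 0
print(would_create_cycle(edges, 0, 2))  # False: a shortcut edge stays acyclic
```

Pruning such edges up front keeps the tree search inside DAG space, so no candidate structure ever has to be rejected after the fact.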
arXiv Detail & Related papers (2023-10-20T15:14:18Z)
- Discovering Dynamic Causal Space for DAG Structure Learning [64.763763417533]
We propose a dynamic causal space for DAG structure learning, coined CASPER.
It integrates the graph structure into the score function as a new measure in the causal space, to faithfully reflect the causal distance between the estimated and ground-truth DAGs.
arXiv Detail & Related papers (2023-06-05T12:20:40Z)
- Causality Learning With Wasserstein Generative Adversarial Networks [2.492300648514129]
A model named DAG-WGAN combines the Wasserstein-based adversarial loss with an acyclicity constraint in an auto-encoder architecture.
It simultaneously learns causal structures while improving its data generation capability.
We compare the performance of DAG-WGAN with other models that do not involve the Wasserstein metric in order to identify its contribution to causal structure learning.
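To give a concrete sense of the metric this comparison isolates: for one-dimensional empirical distributions with equally many samples, the Wasserstein-1 distance reduces to the mean absolute difference of the sorted samples. A self-contained sketch of the metric itself (not DAG-WGAN's adversarial estimator, which approximates this distance with a critic network):

```python
import numpy as np

def wasserstein1_1d(x: np.ndarray, y: np.ndarray) -> float:
    """Exact Wasserstein-1 distance between two 1-D empirical
    distributions with the same sample count: the optimal coupling
    matches sorted samples pairwise."""
    assert len(x) == len(y)
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

x = np.array([0., 1., 2.])
y = np.array([1., 2., 3.])          # x shifted by 1
print(wasserstein1_1d(x, y))        # 1.0

# Unlike f-divergences, W1 varies smoothly under shifts even when the
# supports do not overlap, which is what gives Wasserstein-based
# adversarial training its useful gradient signal.
print(wasserstein1_1d(x, x + 5.))   # 5.0
```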
arXiv Detail & Related papers (2022-06-03T10:45:47Z)
- Amortized Inference for Causal Structure Learning [72.84105256353801]
Learning causal structure poses a search problem that typically involves evaluating structures using a score or independence test.
We train a variational inference model to predict the causal structure from observational/interventional data.
Our models exhibit robust generalization capabilities under substantial distribution shift.
arXiv Detail & Related papers (2022-05-25T17:37:08Z)
- DATGAN: Integrating expert knowledge into deep learning for synthetic tabular data [0.0]
Synthetic data can be used in various applications, such as correcting biased datasets or replacing scarce original data for simulation purposes.
However, deep learning models are data-driven, and it is difficult to control the generation process.
This article presents the Directed Acyclic Tabular GAN (DATGAN) to address these limitations.
arXiv Detail & Related papers (2022-03-07T16:09:03Z)
- BCDAG: An R package for Bayesian structure and Causal learning of Gaussian DAGs [77.34726150561087]
We introduce the R package BCDAG for causal discovery and causal effect estimation from observational data.
Our implementation scales efficiently with the number of observations and, whenever the DAGs are sufficiently sparse, the number of variables in the dataset.
We then illustrate the main functions and algorithms on both real and simulated datasets.
arXiv Detail & Related papers (2022-01-28T09:30:32Z)
- Federated Causal Discovery [74.37739054932733]
This paper develops a gradient-based learning framework named DAG-Shared Federated Causal Discovery (DS-FCD)
It can learn the causal graph without directly touching local data and naturally handle the data heterogeneity.
Extensive experiments on both synthetic and real-world datasets verify the efficacy of the proposed method.
arXiv Detail & Related papers (2021-12-07T08:04:12Z)
- BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery [97.79015388276483]
A structural equation model (SEM) is an effective framework to reason over causal relationships represented via a directed acyclic graph (DAG)
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
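The linear-Gaussian SEM mentioned here has a closed-form sampling step: with weighted adjacency W (where W[i, j] is the weight of edge i → j) and Gaussian noise ε, the model x = Wᵀx + ε solves to x = (I − Wᵀ)⁻¹ε. A small sketch under these standard conventions (the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def sample_linear_sem(W, n, noise_scale=1.0, seed=0):
    """Draw n samples from the linear-Gaussian SEM x = W^T x + eps.

    Because W encodes a DAG, (I - W^T) is invertible, and each
    column-vector sample is x = (I - W^T)^{-1} eps; with row-vector
    samples this becomes eps @ (I - W)^{-1}."""
    rng = np.random.default_rng(seed)
    d = W.shape[0]
    eps = rng.normal(scale=noise_scale, size=(n, d))
    return eps @ np.linalg.inv(np.eye(d) - W)

# Chain SEM x0 -> x1 -> x2 with unit edge weights.
W = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
X = sample_linear_sem(W, n=5)   # every row satisfies x = W^T x + eps
```

Maximum-likelihood point estimation inverts this generative direction, searching for the W (subject to acyclicity) under which the observed data are most probable; BCD Nets instead place a variational distribution over W.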
arXiv Detail & Related papers (2021-12-06T03:35:21Z)
- Disentangled Recurrent Wasserstein Autoencoder [17.769077848342334]
The Recurrent Wasserstein Autoencoder (R-WAE) is a new framework for generative modeling of sequential data.
R-WAE disentangles the representation of an input sequence into static and dynamic factors.
Our models outperform other baselines with the same settings in terms of disentanglement and unconditional video generation.
arXiv Detail & Related papers (2021-01-19T07:43:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.