Differentiable DAG Sampling
- URL: http://arxiv.org/abs/2203.08509v1
- Date: Wed, 16 Mar 2022 10:14:49 GMT
- Title: Differentiable DAG Sampling
- Authors: Bertrand Charpentier, Simon Kibler, Stephan Günnemann
- Abstract summary: We propose a new differentiable probabilistic model over DAGs (DP-DAG)
DP-DAG allows fast and differentiable DAG sampling suited to continuous optimization.
We propose VI-DP-DAG, a new method for DAG learning from observational data.
- Score: 33.45069308137142
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new differentiable probabilistic model over DAGs (DP-DAG).
DP-DAG allows fast and differentiable DAG sampling suited to continuous
optimization. To this end, DP-DAG samples a DAG by successively (1) sampling a
linear ordering of the nodes and (2) sampling edges consistent with the sampled
linear ordering. We further propose VI-DP-DAG, a new method for DAG learning
from observational data which combines DP-DAG with variational inference.
Hence, VI-DP-DAG approximates the posterior probability over DAG edges given the
observed data. VI-DP-DAG is guaranteed to output a valid DAG at any time during
training and does not require any complex augmented Lagrangian optimization
scheme in contrast to existing differentiable DAG learning approaches. In our
extensive experiments, we compare VI-DP-DAG to other differentiable DAG
learning baselines on synthetic and real datasets. VI-DP-DAG significantly
improves DAG structure and causal mechanism learning while training faster than
competitors.
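The two-step sampling scheme described in the abstract can be illustrated with a minimal NumPy sketch. This is a hedged, non-differentiable illustration only: the function name `sample_dag` and the `edge_probs` parameterization are assumptions for illustration, and the paper's actual method replaces these hard samples with differentiable relaxations so gradients can flow through the sampling.

```python
import numpy as np

def sample_dag(edge_probs, rng=None):
    """Sample a DAG in two steps, as described in the abstract:
    (1) draw a uniform linear ordering of the nodes, then
    (2) draw each edge i -> j independently, but only when i precedes j,
    so the result is acyclic by construction.

    edge_probs: (n, n) matrix; edge_probs[i, j] is the probability of an
    edge i -> j given that i precedes j in the sampled ordering (assumed
    parameterization, for illustration only).
    """
    rng = np.random.default_rng(rng)
    n = edge_probs.shape[0]
    order = rng.permutation(n)            # step 1: linear ordering of nodes
    rank = np.empty(n, dtype=int)
    rank[order] = np.arange(n)            # rank[i] = position of node i
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(n):
            if rank[i] < rank[j]:         # step 2: only "forward" edges
                adj[i, j] = int(rng.random() < edge_probs[i, j])
    return adj, order
```

Because edges only ever point from earlier to later positions in the ordering, any sampled graph is guaranteed acyclic, which is why VI-DP-DAG can output a valid DAG at any time during training without an augmented Lagrangian acyclicity penalty.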
Related papers
- LayerDAG: A Layerwise Autoregressive Diffusion Model for Directed Acyclic Graph Generation [17.94316378710172]
This paper introduces LayerDAG, an autoregressive diffusion model, to generate realistic directed acyclic graphs (DAGs)
By interpreting the partial order of nodes as a sequence of bipartite graphs, LayerDAG decouples the strong node dependencies into manageable units that can be processed sequentially.
Experiments on both synthetic and real-world flow graphs from various computing platforms show that LayerDAG generates valid DAGs with superior statistical properties and benchmarking performance.
arXiv Detail & Related papers (2024-11-04T17:47:15Z)
- Graph Adversarial Diffusion Convolution [49.974206213411904]
This paper introduces a min-max optimization formulation for the Graph Signal Denoising (GSD) problem.
We derive a new variant of the Graph Diffusion Convolution architecture, called Graph Adversarial Diffusion Convolution (GADC)
arXiv Detail & Related papers (2024-06-04T07:43:04Z)
- Convolutional Learning on Directed Acyclic Graphs [10.282099295800322]
We develop a novel convolutional architecture tailored for learning from data defined over directed acyclic graphs (DAGs)
We develop a novel convolutional graph neural network that integrates learnable DAG filters to account for the partial ordering induced by the graph topology.
arXiv Detail & Related papers (2024-05-05T21:30:18Z)
- GE-AdvGAN: Improving the transferability of adversarial samples by gradient editing-based adversarial generative model [69.71629949747884]
Adversarial generative models, such as Generative Adversarial Networks (GANs), are widely applied for generating various types of data.
In this work, we propose a novel algorithm named GE-AdvGAN to enhance the transferability of adversarial samples.
arXiv Detail & Related papers (2024-01-11T16:43:16Z)
- Structure Learning with Adaptive Random Neighborhood Informed MCMC [0.0]
We introduce a novel MCMC sampler, PARNI-DAG, for the problem of structure learning under observational data.
Under the assumption of causal sufficiency, the algorithm allows for approximate sampling directly from the posterior distribution over Directed Acyclic Graphs (DAGs).
We empirically demonstrate its mixing efficiency and accuracy in learning DAG structures on a variety of experiments.
arXiv Detail & Related papers (2023-11-01T15:47:18Z)
- Structural transfer learning of non-Gaussian DAG [24.11895013147964]
Directed acyclic graphs (DAGs) have been widely employed to represent directional relationships among a set of collected nodes.
It remains an open question how to pool the heterogeneous data together for better DAG structure reconstruction in the target study.
We introduce a novel set of structural similarity measures for DAG and then present a transfer DAG learning framework.
arXiv Detail & Related papers (2023-10-16T10:01:27Z)
- Learning Better with Less: Effective Augmentation for Sample-Efficient Visual Reinforcement Learning [57.83232242068982]
Data augmentation (DA) is a crucial technique for enhancing the sample efficiency of visual reinforcement learning (RL) algorithms.
It remains unclear which attributes of DA account for its effectiveness in achieving sample-efficient visual RL.
This work conducts comprehensive experiments to assess the impact of DA's attributes on its efficacy.
arXiv Detail & Related papers (2023-05-25T15:46:20Z)
- DAPDAG: Domain Adaptation via Perturbed DAG Reconstruction [78.76115370275733]
We learn an auto-encoder that undertakes inference on population statistics given features and reconstruct a directed acyclic graph (DAG) as an auxiliary task.
The underlying DAG structure is assumed invariant among observed variables whose conditional distributions are allowed to vary across domains led by a latent environmental variable $E$.
We train the encoder and decoder jointly in an end-to-end manner and conduct experiments on synthetic and real datasets with mixed variables.
arXiv Detail & Related papers (2022-08-02T11:43:03Z)
- BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery [97.79015388276483]
A structural equation model (SEM) is an effective framework to reason over causal relationships represented via a directed acyclic graph (DAG)
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
arXiv Detail & Related papers (2021-12-06T03:35:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.