Learning Discrete Directed Acyclic Graphs via Backpropagation
- URL: http://arxiv.org/abs/2210.15353v1
- Date: Thu, 27 Oct 2022 12:03:55 GMT
- Title: Learning Discrete Directed Acyclic Graphs via Backpropagation
- Authors: Andrew J. Wren, Pasquale Minervini, Luca Franceschi and Valentina Zantedeschi
- Abstract summary: Recently, continuous relaxations have been proposed to learn Directed Acyclic Graphs (DAGs) from data by backpropagation.
We propose DAG-DB, a framework for learning DAGs by Discrete Backpropagation.
- Score: 16.823075878437493
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, continuous relaxations have been proposed in order to learn Directed
Acyclic Graphs (DAGs) from data by backpropagation, instead of using
combinatorial optimization. However, a number of techniques for fully discrete
backpropagation could instead be applied. In this paper, we explore that
direction and propose DAG-DB, a framework for learning DAGs by Discrete
Backpropagation. Based on the architecture of Implicit Maximum Likelihood
Estimation [I-MLE, arXiv:2106.01798], DAG-DB adopts a probabilistic approach to
the problem, sampling binary adjacency matrices from an implicit probability
distribution. DAG-DB learns a parameter for the distribution from the loss
incurred by each sample, performing competitively using either of two fully
discrete backpropagation techniques, namely I-MLE and Straight-Through
Estimation.
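As a concrete illustration of the mechanism the abstract describes, the sketch below shows Straight-Through Estimation, one of the two discrete backpropagation techniques named, applied to sampling a binary adjacency matrix from edge logits. This is a minimal sketch and not the authors' DAG-DB implementation: the node count, the toy data, the linear reconstruction loss, and all variable names are assumptions for illustration, and the acyclicity handling a real DAG learner needs is omitted.

```python
# Minimal sketch (not the authors' DAG-DB code) of straight-through estimation
# for learning a distribution over binary adjacency matrices by backpropagation.
import torch

d = 5                                            # number of nodes (illustrative)
theta = torch.zeros(d, d, requires_grad=True)    # logits of the edge distribution
X = torch.randn(128, d)                          # toy observational data
opt = torch.optim.Adam([theta], lr=1e-2)

def sample_adjacency_ste(logits):
    """Sample a binary matrix; the backward pass treats sampling as identity
    and routes gradients through the Bernoulli probabilities instead."""
    probs = torch.sigmoid(logits)
    hard = torch.bernoulli(probs)                # discrete {0,1} sample
    return (hard - probs).detach() + probs       # forward: hard, backward: probs

for step in range(200):
    A = sample_adjacency_ste(theta)
    A = A * (1.0 - torch.eye(d))                 # drop self-loops
    # Toy linear reconstruction loss X ≈ X @ A, standing in for the paper's
    # actual objective (which also has to discourage cycles).
    loss = ((X - X @ A) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Roughly speaking, I-MLE (the other technique named in the abstract) replaces this identity-style backward pass with a gradient estimate obtained by perturbing the logits and comparing maximum-a-posteriori configurations; in both cases only discrete adjacency matrices are ever scored by the loss.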
Related papers
- Induced Covariance for Causal Discovery in Linear Sparse Structures [55.2480439325792]
Causal models seek to unravel the cause-effect relationships among variables from observed data.
This paper introduces a novel causal discovery algorithm designed for settings in which variables exhibit linearly sparse relationships.
arXiv Detail & Related papers (2024-10-02T04:01:38Z)
- Scalable Variational Causal Discovery Unconstrained by Acyclicity [6.954510776782872]
We propose a scalable Bayesian approach to learn the posterior distribution over causal graphs given observational data.
We introduce a novel differentiable DAG sampling method that can generate a valid acyclic causal graph.
We are able to model the posterior distribution over causal graphs using a simple variational distribution over a continuous domain.
arXiv Detail & Related papers (2024-07-06T07:56:23Z)
- ProDAG: Projection-Induced Variational Inference for Directed Acyclic Graphs [8.556906995059324]
Directed acyclic graph (DAG) learning is a rapidly expanding field of research.
It remains statistically and computationally challenging to learn a single (point estimate) DAG from data, let alone provide uncertainty quantification.
Our article addresses the difficult task of quantifying graph uncertainty by developing a Bayesian variational inference framework based on novel distributions that have support directly on the space of DAGs.
arXiv Detail & Related papers (2024-05-24T03:04:28Z)
- Collaborative Heterogeneous Causal Inference Beyond Meta-analysis [68.4474531911361]
We propose a collaborative inverse propensity score estimator for causal inference with heterogeneous data.
Our method shows significant improvements over the methods based on meta-analysis when heterogeneity increases.
arXiv Detail & Related papers (2024-04-24T09:04:36Z)
- Synergistic eigenanalysis of covariance and Hessian matrices for enhanced binary classification [72.77513633290056]
We present a novel approach that combines the eigenanalysis of a covariance matrix evaluated on a training set with a Hessian matrix evaluated on a deep learning model.
Our method captures intricate patterns and relationships, enhancing classification performance.
arXiv Detail & Related papers (2024-02-14T16:10:42Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Inspired by this finding, we propose to improve recent methods by using more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- BayesDAG: Gradient-Based Posterior Inference for Causal Discovery [30.027520859604955]
We introduce a scalable causal discovery framework based on a combination of Markov Chain Monte Carlo and Variational Inference.
Our approach directly samples DAGs from the posterior without requiring any DAG regularization.
We derive a novel equivalence to the permutation-based DAG learning, which opens up possibilities of using any relaxed estimator defined over permutations.
arXiv Detail & Related papers (2023-07-26T02:34:13Z)
- BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery [97.79015388276483]
A structural equation model (SEM) is an effective framework to reason over causal relationships represented via a directed acyclic graph (DAG).
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
arXiv Detail & Related papers (2021-12-06T03:35:21Z)
- Learning linear non-Gaussian directed acyclic graph with diverging number of nodes [12.49848873864773]
Acyclic models, often depicted as directed acyclic graphs (DAGs), have been widely employed to represent directional causal relations among collected nodes.
We propose an efficient method to learn linear non-Gaussian DAG in high dimensional cases, where the noises can be of any continuous non-Gaussian distribution.
arXiv Detail & Related papers (2021-11-01T07:34:53Z)
- DAGs with No Curl: An Efficient DAG Structure Learning Approach [62.885572432958504]
Recently, directed acyclic graph (DAG) structure learning has been formulated as a constrained continuous optimization problem with continuous acyclicity constraints (a short sketch of such an acyclicity function appears after this list).
We propose a novel learning framework to model and learn the weighted adjacency matrices in the DAG space directly.
We show that our method provides comparable accuracy but better efficiency than baseline DAG structure learning methods on both linear and generalized structural equation models.
arXiv Detail & Related papers (2021-06-14T07:11:36Z)
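For background on the "continuous acyclicity constraints" mentioned in the DAGs with No Curl entry above, the sketch below shows the widely used NOTEARS-style acyclicity function h(W) = tr(exp(W ∘ W)) - d, which is zero exactly when the weighted adjacency matrix W contains no directed cycles. This is general background rather than code from any paper listed here, and the example matrices are assumptions for illustration.

```python
# Background sketch (NOTEARS-style, not code from any paper listed here) of the
# continuous acyclicity function h(W) = tr(exp(W ∘ W)) - d.
import torch

def acyclicity(W: torch.Tensor) -> torch.Tensor:
    """Differentiable score that is 0 iff W has no directed cycles."""
    d = W.shape[0]
    return torch.trace(torch.linalg.matrix_exp(W * W)) - d

# A 3-node chain 0 -> 1 -> 2 is acyclic (h ≈ 0); adding the edge 2 -> 0 closes
# a cycle and makes h strictly positive, which is what continuous methods penalize.
W_dag = torch.tensor([[0., 1., 0.],
                      [0., 0., 1.],
                      [0., 0., 0.]])
W_cyc = W_dag.clone()
W_cyc[2, 0] = 1.0
print(acyclicity(W_dag).item())   # ~0.0
print(acyclicity(W_cyc).item())   # > 0
```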
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.