DiBS: Differentiable Bayesian Structure Learning
- URL: http://arxiv.org/abs/2105.11839v1
- Date: Tue, 25 May 2021 11:23:08 GMT
- Title: DiBS: Differentiable Bayesian Structure Learning
- Authors: Lars Lorch, Jonas Rothfuss, Bernhard Schölkopf, Andreas Krause
- Abstract summary: We propose a general, fully differentiable framework for Bayesian structure learning (DiBS)
DiBS operates in the continuous space of a latent probabilistic graph representation.
Contrary to existing work, DiBS is agnostic to the form of the local conditional distributions.
- Score: 38.01659425023988
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian structure learning allows inferring Bayesian network structure from
data while reasoning about the epistemic uncertainty -- a key element towards
enabling active causal discovery and designing interventions in real world
systems. In this work, we propose a general, fully differentiable framework for
Bayesian structure learning (DiBS) that operates in the continuous space of a
latent probabilistic graph representation. Building on recent advances in
variational inference, we use DiBS to devise an efficient method for
approximating posteriors over structural models. Contrary to existing work,
DiBS is agnostic to the form of the local conditional distributions and allows
for joint posterior inference of both the graph structure and the conditional
distribution parameters. This makes our method directly applicable to posterior
inference of nonstandard Bayesian network models, e.g., with nonlinear
dependencies encoded by neural networks. In evaluations on simulated and
real-world data, DiBS significantly outperforms related approaches to joint
posterior inference.
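The "continuous space of a latent probabilistic graph representation" in the abstract can be illustrated with a minimal sketch: each of the d variables is assigned two latent vectors, and the probability of an edge i→j is a sigmoid of a scaled inner product of those vectors, so the distribution over graphs becomes a smooth function of the latent matrix. The NumPy implementation below is an illustrative assumption based on the abstract, not the paper's actual code; in particular the names `edge_probabilities` and `sample_graph`, the temperature `alpha`, and the use of independent Bernoulli edges are sketch choices, and the real method additionally performs variational inference (e.g., particle-based) over the latent representation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def edge_probabilities(U, V, alpha=1.0):
    """Map a continuous latent representation Z = (U, V) to edge probabilities:
    p(g_ij = 1 | Z) = sigmoid(alpha * <u_i, v_j>).
    The diagonal is zeroed since a DAG has no self-loops."""
    scores = alpha * (U @ V.T)      # (d, d) matrix of inner-product scores
    probs = sigmoid(scores)
    np.fill_diagonal(probs, 0.0)    # mask self-edges
    return probs

def sample_graph(probs, rng):
    """Draw one hard adjacency matrix from the independent edge probabilities."""
    return (rng.random(probs.shape) < probs).astype(int)

rng = np.random.default_rng(0)
d, k = 4, 2                         # d variables, latent dimension k
U = rng.normal(size=(d, k))
V = rng.normal(size=(d, k))

probs = edge_probabilities(U, V, alpha=2.0)  # smooth in (U, V), hence differentiable
G = sample_graph(probs, rng)                 # one discrete graph sample
```

Because `probs` is a differentiable function of `(U, V)`, gradients of a score or likelihood with respect to the latent vectors can be used to move probability mass over graph structures, which is what makes the framework "fully differentiable".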
Related papers
- Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose in this paper to approximate the joint posterior over the structure and parameters of a Bayesian Network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z)
- Variational Inference for Bayesian Neural Networks under Model and Parameter Uncertainty [12.211659310564425]
We apply the concept of model uncertainty as a framework for structural learning in BNNs.
We suggest an adaptation of a scalable variational inference approach with reparametrization of marginal inclusion probabilities.
arXiv Detail & Related papers (2023-05-01T16:38:17Z)
- Bayesian learning of Causal Structure and Mechanisms with GFlowNets and Variational Bayes [51.84122462615402]
We introduce a novel method to learn the structure and mechanisms of the causal model using Variational Bayes-DAG-GFlowNet.
We extend the method of Bayesian causal structure learning using GFlowNets to learn the parameters of a linear-Gaussian model.
arXiv Detail & Related papers (2022-11-04T21:57:39Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- BCDAG: An R package for Bayesian structure and Causal learning of Gaussian DAGs [77.34726150561087]
We introduce BCDAG, an R package for causal discovery and causal effect estimation from observational data.
Our implementation scales efficiently with the number of observations and, whenever the DAGs are sufficiently sparse, the number of variables in the dataset.
We then illustrate the main functions and algorithms on both real and simulated datasets.
arXiv Detail & Related papers (2022-01-28T09:30:32Z)
- Hybrid Bayesian network discovery with latent variables by scoring multiple interventions [5.994412766684843]
We present the hybrid mFGS-BS (majority rule and Fast Greedy Equivalence Search with Bayesian Scoring) algorithm for structure learning from discrete data.
The algorithm assumes causal insufficiency in the presence of latent variables and produces a Partial Ancestral Graph (PAG).
Experimental results show that mFGS-BS improves structure learning accuracy relative to the state of the art while remaining computationally efficient.
arXiv Detail & Related papers (2021-12-20T14:54:41Z)
- Prequential MDL for Causal Structure Learning with Neural Networks [9.669269791955012]
We show that the prequential minimum description length principle can be used to derive a practical scoring function for Bayesian networks.
We obtain plausible and parsimonious graph structures without relying on sparsity inducing priors or other regularizers which must be tuned.
We discuss how the prequential score relates to recent work that infers causal structure from the speed of adaptation when the observations come from a source undergoing distributional shift.
arXiv Detail & Related papers (2021-07-02T22:35:21Z)
- Structural Causal Models Are (Solvable by) Credal Networks [70.45873402967297]
Causal inferences can be obtained by standard algorithms for the updating of credal nets.
This contribution should be regarded as a systematic approach to represent structural causal models by credal networks.
Experiments show that approximate algorithms for credal networks can immediately be used to do causal inference in real-size problems.
arXiv Detail & Related papers (2020-08-02T11:19:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.