Bayesian Structure Learning with Generative Flow Networks
- URL: http://arxiv.org/abs/2202.13903v1
- Date: Mon, 28 Feb 2022 15:53:10 GMT
- Title: Bayesian Structure Learning with Generative Flow Networks
- Authors: Tristan Deleu, António Góis, Chris Emezue, Mansi Rankawat, Simon
Lacoste-Julien, Stefan Bauer, Yoshua Bengio
- Abstract summary: In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), has been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In Bayesian structure learning, we are interested in inferring a distribution
over the directed acyclic graph (DAG) structure of Bayesian networks, from
data. Defining such a distribution is very challenging, due to the
combinatorially large sample space, and approximations based on MCMC are often
required. Recently, a novel class of probabilistic models, called Generative
Flow Networks (GFlowNets), has been introduced as a general framework for
generative modeling of discrete and composite objects, such as graphs. In this
work, we propose to use a GFlowNet as an alternative to MCMC for approximating
the posterior distribution over the structure of Bayesian networks, given a
dataset of observations. Generating a sample DAG from this approximate
distribution is viewed as a sequential decision problem, where the graph is
constructed one edge at a time, based on learned transition probabilities.
Through evaluation on both simulated and real data, we show that our approach,
called DAG-GFlowNet, provides an accurate approximation of the posterior over
DAGs, and it compares favorably against other methods based on MCMC or
variational inference.
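The abstract describes sampling a DAG as a sequential decision process: starting from an empty graph, edges are added one at a time according to learned transition probabilities, with a terminal action that returns the current graph. A minimal sketch of that sampling loop is below, assuming a table of per-edge logits; the names `edge_logits`, `stop_logit`, and `sample_dag` are illustrative placeholders, not the paper's actual implementation, and the acyclicity mask stands in for the action masking a trained DAG-GFlowNet policy would provide.

```python
import numpy as np

def would_create_cycle(adj, u, v):
    # Adding edge u -> v creates a cycle iff v already reaches u.
    stack, seen = [v], {v}
    while stack:
        x = stack.pop()
        if x == u:
            return True
        for y in np.nonzero(adj[x])[0]:
            if y not in seen:
                seen.add(y)
                stack.append(y)
    return False

def sample_dag(n, edge_logits, rng, stop_logit=0.0):
    """Sample a DAG one edge at a time from (hypothetical) learned logits."""
    adj = np.zeros((n, n), dtype=int)
    while True:
        # Valid actions: absent edges whose addition keeps the graph acyclic.
        actions = [(u, v) for u in range(n) for v in range(n)
                   if u != v and adj[u, v] == 0
                   and not would_create_cycle(adj, u, v)]
        # Softmax over valid edge additions plus the terminal "stop" action.
        logits = np.array([edge_logits[u, v] for u, v in actions] + [stop_logit])
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(actions):  # terminal action: return the completed graph
            return adj
        u, v = actions[k]
        adj[u, v] = 1
```

Because invalid actions (duplicate edges and cycle-creating edges) are masked at every step, every trajectory of this process terminates in a valid DAG; a trained GFlowNet would make the terminal probabilities of these trajectories approximate the posterior over structures.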
Related papers
- Self-Supervised Contrastive Graph Clustering Network via Structural Information Fusion [15.293684479404092]
We propose a novel deep graph clustering method called CGCN.
Our approach introduces contrastive signals and deep structural information into the pre-training process.
Our method has been experimentally validated on multiple real-world graph datasets.
arXiv Detail & Related papers (2024-08-08T09:49:26Z)
- Fisher Flow Matching for Generative Modeling over Discrete Data [12.69975914345141]
We introduce Fisher-Flow, a novel flow-matching model for discrete data.
Fisher-Flow takes a manifestly geometric perspective by considering categorical distributions over discrete data.
We prove that the gradient flow induced by Fisher-Flow is optimal in reducing the forward KL divergence.
arXiv Detail & Related papers (2024-05-23T15:02:11Z)
- Generative Flow Networks: a Markov Chain Perspective [93.9910025411313]
We propose a new perspective for GFlowNets using Markov chains, showing a unifying view for GFlowNets regardless of the nature of the state space.
Positioning GFlowNets under the same theoretical framework as MCMC methods also allows us to identify the similarities between both frameworks.
arXiv Detail & Related papers (2023-07-04T01:28:02Z)
- A Bayesian Take on Gaussian Process Networks [1.7188280334580197]
This work implements Monte Carlo and Markov Chain Monte Carlo methods to sample from the posterior distribution of network structures.
We show that our method outperforms state-of-the-art algorithms in recovering the graphical structure of the network.
arXiv Detail & Related papers (2023-06-20T08:38:31Z)
- Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose to approximate the joint posterior over both the structure and the parameters of a Bayesian network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z)
- Generative Flow Networks for Discrete Probabilistic Modeling [118.81967600750428]
We present energy-based generative flow networks (EB-GFN).
EB-GFN is a novel probabilistic modeling algorithm for high-dimensional discrete data.
We show how GFlowNets can approximately perform large-block Gibbs sampling to mix between modes.
arXiv Detail & Related papers (2022-02-03T01:27:11Z)
- Bayesian structure learning and sampling of Bayesian networks with the R package BiDAG [0.0]
BiDAG implements Markov chain Monte Carlo (MCMC) methods for structure learning and sampling of Bayesian networks.
The package includes tools to search for a maximum a posteriori (MAP) graph and to sample graphs from the posterior distribution given the data.
arXiv Detail & Related papers (2021-05-02T14:42:32Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.