GFlowNets and variational inference
- URL: http://arxiv.org/abs/2210.00580v1
- Date: Sun, 2 Oct 2022 17:41:01 GMT
- Title: GFlowNets and variational inference
- Authors: Nikolay Malkin, Salem Lahlou, Tristan Deleu, Xu Ji, Edward Hu, Katie
Everett, Dinghuai Zhang, Yoshua Bengio
- Abstract summary: This paper builds bridges between two families of probabilistic algorithms: hierarchical variational inference (VI) and generative flow networks (GFlowNets).
We demonstrate that, in certain cases, VI algorithms are equivalent to special cases of GFlowNets in the sense of equality of expected gradients of their learning objectives.
- Score: 64.22223306224903
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper builds bridges between two families of probabilistic algorithms:
(hierarchical) variational inference (VI), which is typically used to model
distributions over continuous spaces, and generative flow networks (GFlowNets),
which have been used for distributions over discrete structures such as graphs.
We demonstrate that, in certain cases, VI algorithms are equivalent to special
cases of GFlowNets in the sense of equality of expected gradients of their
learning objectives. We then point out the differences between the two families
and show how these differences emerge experimentally. Notably, GFlowNets, which
borrow ideas from reinforcement learning, are more amenable than VI to
off-policy training without the cost of high gradient variance induced by
importance sampling. We argue that this property of GFlowNets can provide
advantages for capturing diversity in multimodal target distributions.
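The off-policy property highlighted above can be illustrated with the trajectory balance objective from the GFlowNet literature: the loss is a squared log-space residual over a trajectory, so it can be evaluated on trajectories drawn from any sampler without an importance weight. A minimal numerical sketch (function name and toy values are illustrative, not from the paper):

```python
import numpy as np

def trajectory_balance_loss(log_Z, log_pf, log_pb, log_reward):
    """Trajectory balance loss for a single trajectory.

    log_pf / log_pb: per-step forward/backward log-probabilities.
    The trajectory may come from ANY behavior policy (off-policy):
    no importance-sampling weight appears in the objective.
    """
    return (log_Z + np.sum(log_pf) - log_reward - np.sum(log_pb)) ** 2

# A perfectly balanced sampler gives zero loss: log Z + sum(log P_F)
# exactly matches log R(x) + sum(log P_B), so the residual vanishes.
loss = trajectory_balance_loss(
    log_Z=np.log(2.0),
    log_pf=np.log([0.5, 1.0]),   # forward policy steps
    log_pb=np.log([1.0, 1.0]),   # backward policy (deterministic)
    log_reward=np.log(1.0),      # terminal reward R(x) = 1
)
print(round(loss, 6))  # → 0.0
```

Contrast this with importance-weighted VI gradients, whose variance grows as the behavior policy drifts away from the model.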
Related papers
- Stochastic Generative Flow Networks [89.34644133901647]
Generative Flow Networks (or GFlowNets) learn to sample complex structures through the lens of "inference as control".
Existing GFlowNets can be applied only to deterministic environments, and fail in more general tasks with stochastic dynamics.
This paper introduces Stochastic GFlowNets, a new algorithm that extends GFlowNets to stochastic environments.
arXiv Detail & Related papers (2023-02-19T03:19:40Z)
- Distributional GFlowNets with Quantile Flows [73.73721901056662]
Generative Flow Networks (GFlowNets) are a new family of probabilistic samplers where an agent learns a policy for generating complex structure through a series of decision-making steps.
In this work, we adopt a distributional paradigm for GFlowNets, turning each flow function into a distribution, thus providing more informative learning signals during training.
Our proposed quantile matching GFlowNet learning algorithm is able to learn a risk-sensitive policy, an essential component for handling scenarios with risk uncertainty.
arXiv Detail & Related papers (2023-02-11T22:06:17Z)
- A theory of continuous generative flow networks [104.93913776866195]
Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions.
We present a theory for generalized GFlowNets, which encompasses both existing discrete GFlowNets and ones with continuous or hybrid state spaces.
arXiv Detail & Related papers (2023-01-30T00:37:56Z)
- Learning GFlowNets from partial episodes for improved convergence and stability [56.99229746004125]
Generative flow networks (GFlowNets) are algorithms for training a sequential sampler of discrete objects under an unnormalized target density.
Existing training objectives for GFlowNets are either local to states or transitions, or propagate a reward signal over an entire sampling trajectory.
Inspired by the TD(λ) algorithm in reinforcement learning, we introduce subtrajectory balance or SubTB(λ), a GFlowNet training objective that can learn from partial action subsequences of varying lengths.
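The idea of learning from partial subsequences can be sketched as a λ-weighted average of squared balance residuals over all subtrajectories, echoing TD(λ)'s geometric weighting. A toy sketch under those assumptions (names and the exact weighting are illustrative, not the paper's definition):

```python
import numpy as np

def subtb_lambda_loss(log_F, log_pf, log_pb, lam=0.9):
    """SubTB(lambda)-style loss sketch: average the squared balance
    residual over every subtrajectory (i, j), weighting a length-(j-i)
    subtrajectory by lam**(j-i).

    log_F[t]: log state flow at step t (log_F[-1] plays the role of log R(x)).
    log_pf[t], log_pb[t]: log P_F / log P_B for the transition t -> t+1.
    """
    T = len(log_pf)
    cum_pf = np.concatenate([[0.0], np.cumsum(log_pf)])
    cum_pb = np.concatenate([[0.0], np.cumsum(log_pb)])
    num, den = 0.0, 0.0
    for i in range(T):
        for j in range(i + 1, T + 1):
            w = lam ** (j - i)
            resid = (log_F[i] + (cum_pf[j] - cum_pf[i])
                     - log_F[j] - (cum_pb[j] - cum_pb[i]))
            num += w * resid ** 2
            den += w
    return num / den

# Perfectly balanced flows give zero loss on every subtrajectory:
loss = subtb_lambda_loss(
    log_F=np.array([np.log(2.0), 0.0, 0.0]),
    log_pf=np.array([np.log(0.5), 0.0]),
    log_pb=np.array([0.0, 0.0]),
)
print(round(loss, 6))  # → 0.0
```

Because short subtrajectories contribute residuals before a full trajectory is completed, this interpolates between purely local and whole-trajectory objectives, which is the convergence/stability trade-off the title refers to.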
arXiv Detail & Related papers (2022-09-26T15:44:24Z)
- GFlowNet Foundations [66.69854262276391]
Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of candidates in an active learning context.
We show a number of additional theoretical properties of GFlowNets.
arXiv Detail & Related papers (2021-11-17T17:59:54Z)
- Variational Inference with Continuously-Indexed Normalizing Flows [29.95927906900098]
Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks.
We show here how CIFs can be used as part of an auxiliary variational inference scheme to formulate and train expressive posterior approximations.
arXiv Detail & Related papers (2020-07-10T15:00:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.