Variational Inference with Continuously-Indexed Normalizing Flows
- URL: http://arxiv.org/abs/2007.05426v2
- Date: Mon, 14 Jun 2021 18:20:21 GMT
- Title: Variational Inference with Continuously-Indexed Normalizing Flows
- Authors: Anthony Caterini and Rob Cornish and Dino Sejdinovic and Arnaud Doucet
- Abstract summary: Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks.
We show here how CIFs can be used as part of an auxiliary variational inference scheme to formulate and train expressive posterior approximations.
- Score: 29.95927906900098
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Continuously-indexed flows (CIFs) have recently achieved improvements over
baseline normalizing flows on a variety of density estimation tasks. CIFs do
not possess a closed-form marginal density and so, unlike standard flows,
cannot be plugged directly into a variational inference (VI) scheme to
produce a more expressive family of approximate posteriors. However, we show
here how CIFs can be used as part of an auxiliary VI scheme to formulate and
train expressive posterior approximations in a natural way. We exploit the
conditional independence structure of multi-layer CIFs to build the required
auxiliary inference models, which we show empirically yield low-variance
estimators of the model evidence. We then demonstrate the advantages of CIFs
over baseline flows in VI problems when the posterior distribution of interest
possesses a complicated topology, obtaining improved results in both the
Bayesian inference and surrogate maximum likelihood settings.
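The auxiliary bound used here has a simple Monte Carlo form: log p(x) >= E_{q(z,u)}[log p(x,z) + log r(u|z) - log q(z,u)], where u is the flow's continuous index and r(u|z) is the auxiliary inference model. The following is a minimal sketch of that estimator with a single continuously-indexed affine layer; the index distribution, the bijection, and the Gaussian r(u|z) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of auxiliary VI with one continuously-indexed affine
# layer; all modeling choices here are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def log_normal(x, mean, std):
    return -0.5 * np.log(2 * np.pi) - np.log(std) - 0.5 * ((x - mean) / std) ** 2

def log_target(z):
    # unnormalized log p(x, z): a bimodal posterior standing in for a
    # distribution with complicated topology
    return np.logaddexp(log_normal(z, -2.0, 0.5), log_normal(z, 2.0, 0.5)) - np.log(2)

def sample_q(n):
    # q(z, u): draw the index u, then push a base sample through a
    # u-indexed affine bijection z = exp(s(u)) * w + t(u)
    u = rng.normal(size=n)                               # q(u) = N(0, 1)
    w = rng.normal(size=n)                               # base sample
    scale, shift = np.exp(0.3 * u), 2.0 * np.tanh(u)
    z = scale * w + shift
    log_qz_u = log_normal(w, 0.0, 1.0) - np.log(scale)   # change of variables
    return z, u, log_normal(u, 0.0, 1.0) + log_qz_u

def log_r(u, z):
    # auxiliary inference model r(u | z), here a fixed-width Gaussian
    return log_normal(u, 0.5 * z, 1.0)

z, u, log_q = sample_q(100_000)
elbo_terms = log_target(z) + log_r(u, z) - log_q   # one-sample bound estimates
print("auxiliary ELBO estimate:", elbo_terms.mean())
print("per-sample std dev:", elbo_terms.std())
```

Training would optimize the flow, q(u), and r(u|z) jointly to tighten the bound; in the multi-layer case, the paper exploits the layers' conditional independence structure to build the required auxiliary inference models.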
Related papers
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, offering a speedup in computation compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- GFlowNets and variational inference [64.22223306224903]
This paper builds bridges between two families of probabilistic algorithms: hierarchical variational inference (VI) and generative flow networks (GFlowNets).
We demonstrate that, in certain cases, VI algorithms are equivalent to special cases of GFlowNets in the sense of equality of expected gradients of their learning objectives.
arXiv Detail & Related papers (2022-10-02T17:41:01Z)
- Discretely Indexed Flows [1.0079626733116611]
We propose Discretely Indexed Flows (DIF) as a new tool for solving variational estimation problems.
DIF are built as an extension of Normalizing Flows (NF), in which the deterministic transport becomes discretely indexed.
They benefit from both a tractable density and a straightforward sampling scheme, and can thus be used for the dual problems of variational inference (VI) and variational density estimation (VDE).
arXiv Detail & Related papers (2022-04-04T10:13:43Z)
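A toy illustration of the tractability claim: with a finite index set, the marginal density is an exact K-term mixture, whereas a continuous index would require an intractable integral over the index. Uniform index weights and affine maps are simplifying assumptions here, not the paper's exact construction.

```python
# Hedged sketch: a discretely indexed flow with K = 3 affine bijections.
import numpy as np

def log_normal(x, mean, std):
    return -0.5 * np.log(2 * np.pi) - np.log(std) - 0.5 * ((x - mean) / std) ** 2

scales = np.array([0.5, 1.0, 2.0])    # bijection k: z = scales[k] * w + shifts[k]
shifts = np.array([-2.0, 0.0, 2.0])

def log_density(z):
    # invert each indexed bijection, apply change of variables, mix exactly
    w = (z[:, None] - shifts) / scales                 # shape (n, K)
    comp = log_normal(w, 0.0, 1.0) - np.log(scales)
    return np.logaddexp.reduce(comp, axis=1) - np.log(len(scales))

def sample(n, rng=np.random.default_rng(0)):
    k = rng.integers(len(scales), size=n)              # draw a discrete index
    return scales[k] * rng.normal(size=n) + shifts[k]

z = sample(5)
print(log_density(z))    # exact log-density: no marginalization gap
```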
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
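For scale, the "traditional estimators" referenced above amount to naive Monte Carlo: sample the flow and count the fraction of points falling inside the region. A minimal sketch, with a stand-in sampler in place of a trained flow:

```python
# Naive Monte Carlo region-probability estimate, the baseline whose
# sample efficiency the paper improves on; flow_sample is a placeholder.
import numpy as np

rng = np.random.default_rng(0)

def flow_sample(n):
    # stand-in for drawing samples from a trained normalizing flow
    return rng.normal(size=(n, 2))

def mc_region_probability(lo, hi, n=100_000):
    x = flow_sample(n)
    inside = np.all((x >= lo) & (x <= hi), axis=1)
    return inside.mean(), inside.std() / np.sqrt(n)   # estimate, standard error

est, se = mc_region_probability(np.array([-1.0, -1.0]), np.array([1.0, 1.0]))
print(f"P(region) ~= {est:.4f} +/- {se:.4f}")
```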
- Regularizing Variational Autoencoder with Diversity and Uncertainty Awareness [61.827054365139645]
The Variational Autoencoder (VAE) approximates the posterior of latent variables based on amortized variational inference.
We propose an alternative model, DU-VAE, for learning a more Diverse and less Uncertain latent space.
arXiv Detail & Related papers (2021-10-24T07:58:13Z)
- Attentive Contractive Flow with Lipschitz-constrained Self-Attention [25.84621883831624]
We introduce a novel approach called Attentive Contractive Flow (ACF).
ACF utilizes a special category of flow-based generative models: contractive flows.
We demonstrate that ACF can be introduced into a variety of state-of-the-art flow models in a plug-and-play manner.
arXiv Detail & Related papers (2021-09-24T18:02:49Z)
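Background on the contractive flows that ACF builds on (in the style of invertible residual networks): if the residual branch g has Lipschitz constant below one, y = x + g(x) is invertible and the inverse is computable by fixed-point iteration. A sketch with a scalar g standing in for a Lipschitz-constrained self-attention block:

```python
# Contractive residual map: Lip(g) < 1 guarantees invertibility via the
# Banach fixed-point theorem; g here is an illustrative stand-in.
import numpy as np

def g(x):
    return 0.5 * np.tanh(x)       # Lipschitz constant 0.5 < 1

def forward(x):
    return x + g(x)

def inverse(y, n_iters=50):
    x = np.copy(y)                # iterate x <- y - g(x) until convergence
    for _ in range(n_iters):
        x = y - g(x)
    return x

x = np.linspace(-3.0, 3.0, 7)
print(np.max(np.abs(inverse(forward(x)) - x)))   # ~0: inversion succeeds
```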
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the mapping from the base density to the output space is conditioned on an input x, in order to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
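The construction admits a compact sketch: condition the parameters of the bijection on the input x, then apply the usual change-of-variables formula to evaluate p(y|x). The linear conditioner and affine bijection below are hypothetical choices for illustration:

```python
# Minimal conditional flow: the base-to-output bijection depends on x.
import numpy as np

def log_normal(x, mean, std):
    return -0.5 * np.log(2 * np.pi) - np.log(std) - 0.5 * ((x - mean) / std) ** 2

def conditioner(x):
    # maps the conditioning input x to bijection parameters (assumed form)
    return 2.0 * x, np.exp(0.1 * x)                    # shift(x), scale(x)

def log_p_y_given_x(y, x):
    shift, scale = conditioner(x)
    eps = (y - shift) / scale                          # invert y = shift + scale * eps
    return log_normal(eps, 0.0, 1.0) - np.log(scale)   # change of variables

def sample_y_given_x(x, rng=np.random.default_rng(0)):
    shift, scale = conditioner(x)
    return shift + scale * rng.normal(size=np.shape(x))

x = np.array([0.0, 1.0, 2.0])
y = sample_y_given_x(x)
print(log_p_y_given_x(y, x))
```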
This list is automatically generated from the titles and abstracts of the papers on this site.