Projected Latent Markov Chain Monte Carlo: Conditional Sampling of
Normalizing Flows
- URL: http://arxiv.org/abs/2007.06140v4
- Date: Fri, 26 Feb 2021 16:52:28 GMT
- Title: Projected Latent Markov Chain Monte Carlo: Conditional Sampling of
Normalizing Flows
- Authors: Chris Cannella, Mohammadreza Soltani, Vahid Tarokh
- Abstract summary: Projected Latent Markov Chain Monte Carlo (PL-MCMC) is a technique for sampling from the high-dimensional conditional distributions learned by a normalizing flow.
As a conditional sampling method, PL-MCMC enables Monte Carlo Expectation Maximization (MC-EM) training of normalizing flows from incomplete data.
- Score: 37.87437571724747
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Projected Latent Markov Chain Monte Carlo (PL-MCMC), a technique
for sampling from the high-dimensional conditional distributions learned by a
normalizing flow. We prove that a Metropolis-Hastings implementation of PL-MCMC
asymptotically samples from the exact conditional distributions associated with
a normalizing flow. As a conditional sampling method, PL-MCMC enables Monte
Carlo Expectation Maximization (MC-EM) training of normalizing flows from
incomplete data. Through experimental tests applying normalizing flows to
missing data tasks for a variety of data sets, we demonstrate the efficacy of
PL-MCMC for conditional sampling from normalizing flows.
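The core idea, sampling latent variables whose decoding is projected onto the observed coordinates and running Metropolis-Hastings in latent space, can be sketched on a toy affine "flow". This is a minimal illustration only: the affine map, the proposal scale, and the simplified acceptance rule are all assumptions, and the paper's exact Metropolis-Hastings correction additionally accounts for the proposal and auxiliary densities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy invertible "flow": an affine map x = A z + b with a standard normal
# latent, so x is exactly Gaussian. A real application would plug in a
# trained normalizing flow (e.g., RealNVP) and its inverse.
A = np.array([[1.0, 0.5],
              [0.0, 1.0]])
b = np.array([0.2, -0.1])
A_inv = np.linalg.inv(A)

def flow(z):                       # latent -> data
    return A @ z + b

def flow_inv(x):                   # data -> latent
    return A_inv @ (x - b)

def log_prior(z):                  # standard normal latent, up to a constant
    return -0.5 * float(z @ z)

# Conditioning: coordinate 0 of x is observed, coordinate 1 is missing.
x_obs = 1.3
OBS, MIS = 0, 1

def project(z):
    """Decode z, clamp the observed coordinate to x_obs, re-encode."""
    x = flow(z)
    x[OBS] = x_obs
    return flow_inv(x)

# Latent-space Metropolis-Hastings with projection: perturb the latent,
# project the proposal so its decoding agrees with the observation, and
# accept or reject under the latent prior. (This keeps only the
# propose/project/accept structure of PL-MCMC, not its exact acceptance
# ratio.)
step = 0.5                         # hypothetical proposal scale
z = project(rng.standard_normal(2))
missing_samples = []
for _ in range(5000):
    z_prop = project(z + step * rng.standard_normal(2))
    if np.log(rng.uniform()) < log_prior(z_prop) - log_prior(z):
        z = z_prop
    missing_samples.append(flow(z)[MIS])

missing_samples = np.asarray(missing_samples)
print(missing_samples.shape, bool(np.isfinite(missing_samples).all()))
```

Every state of the chain decodes to a sample whose observed coordinate equals `x_obs` exactly, so the recorded values of the missing coordinate approximate draws from the learned conditional.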
Related papers
- Reverse Diffusion Monte Carlo [19.35592726471155]
We propose a novel Monte Carlo sampling algorithm called reverse diffusion Monte Carlo (rdMC).
rdMC is distinct from Markov chain Monte Carlo (MCMC) methods.
arXiv Detail & Related papers (2023-07-05T05:42:03Z)
- Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories [6.222204646855336]
We study the consequences of mode-collapse of normalizing flows in the context of lattice field theory.
We propose a metric to quantify the degree of mode-collapse and derive a bound on the resulting bias.
arXiv Detail & Related papers (2023-02-27T19:00:22Z)
- Continual Repeated Annealed Flow Transport Monte Carlo [93.98285297760671]
We propose Continual Repeated Annealed Flow Transport Monte Carlo (CRAFT).
It combines a sequential Monte Carlo sampler with variational inference using normalizing flows.
We show that CRAFT can achieve impressively accurate results on a lattice field example.
arXiv Detail & Related papers (2022-01-31T10:58:31Z)
- Deterministic Gibbs Sampling via Ordinary Differential Equations [77.42706423573573]
This paper presents a general construction of deterministic measure-preserving dynamics using autonomous ODEs and tools from differential geometry.
We show how Hybrid Monte Carlo and other deterministic samplers follow as special cases of our theory.
arXiv Detail & Related papers (2021-06-18T15:36:09Z)
- Invertible Flow Non Equilibrium Sampling [10.068677972360318]
We introduce Invertible Flow Non Equilibrium Sampling (InFine).
InFine constructs unbiased estimators of expectations and in particular of normalizing constants.
It can be used to construct an Evidence Lower Bound (ELBO), leading to a new class of Variational AutoEncoders (VAEs).
arXiv Detail & Related papers (2021-03-17T09:09:06Z)
- Annealed Flow Transport Monte Carlo [91.20263039913912]
Annealed Flow Transport (AFT) builds upon Annealed Importance Sampling (AIS) and Sequential Monte Carlo (SMC).
AFT relies on normalizing flows that are learned sequentially to push particles towards the successive targets.
We show that a continuous-time scaling limit of the population version of AFT is given by a Feynman--Kac measure.
arXiv Detail & Related papers (2021-02-15T12:05:56Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.