Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint
- URL: http://arxiv.org/abs/2109.11375v1
- Date: Thu, 23 Sep 2021 13:44:36 GMT
- Title: Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint
- Authors: Paul Hagemann, Johannes Hertrich, Gabriele Steidl
- Abstract summary: We consider normalizing flows from a Markov chain point of view.
We replace transition densities by general Markov kernels and establish proofs via Radon-Nikodym derivatives.
The performance of the proposed conditional stochastic normalizing flow is demonstrated by numerical examples.
- Score: 0.45119235878273
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To overcome topological constraints and improve the expressiveness of
normalizing flow architectures, Wu, Köhler and Noé introduced stochastic
normalizing flows, which combine deterministic, learnable flow transformations
with stochastic sampling methods. In this paper, we consider stochastic
normalizing flows from a Markov chain point of view. In particular, we replace
transition densities by general Markov kernels and establish proofs via
Radon-Nikodym derivatives, which allows us to incorporate distributions without
densities in a sound way. Further, we generalize the results for sampling from
posterior distributions as required in inverse problems. The performance of the
proposed conditional stochastic normalizing flow is demonstrated by numerical
examples.
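To make the construction concrete, below is a minimal numerical sketch (not the authors' code) of the kind of layer composition the abstract describes: a deterministic invertible transformation interleaved with a stochastic, Langevin-type sampling layer, each contributing a correction term to a running log importance weight. All names and the specific layer choices are illustrative assumptions.

```python
# Minimal sketch of one stochastic-normalizing-flow pass, assuming the
# construction of Wu, Koehler and Noe referred to in the abstract:
# deterministic invertible layers interleaved with stochastic sampling
# layers, each adding a correction to a running log importance weight.
import numpy as np

rng = np.random.default_rng(0)

def deterministic_layer(x, scale, shift):
    """Invertible affine map y = scale * x + shift.

    Contributes log |det Jacobian| to the log weight.
    """
    y = scale * x + shift
    return y, np.sum(np.log(np.abs(scale)))

def langevin_layer(x, grad_log_target, step=1e-2):
    """One unadjusted Langevin step as a stochastic layer.

    Contributes log q(x | y) - log q(y | x): the log ratio of the
    backward to the forward Gaussian transition density (their shared
    normalizing constant cancels in the difference).
    """
    y = x + step * grad_log_target(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    log_fwd = -np.sum((y - x - step * grad_log_target(x)) ** 2) / (4.0 * step)
    log_bwd = -np.sum((x - y - step * grad_log_target(y)) ** 2) / (4.0 * step)
    return y, log_bwd - log_fwd

# Toy usage: push a latent prior sample toward N(3, 1) and accumulate the
# per-layer weight corrections (the full importance weight additionally
# involves the prior and target densities at the two endpoints).
grad_log_target = lambda x: -(x - 3.0)   # d/dx log N(x; 3, 1)
x, log_w = rng.standard_normal(2), 0.0
x, dlw = deterministic_layer(x, scale=np.full(2, 1.5), shift=np.full(2, 1.0))
log_w += dlw
x, dlw = langevin_layer(x, grad_log_target)
log_w += dlw
print(x, log_w)
```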
Related papers
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories [6.222204646855336]
We study the consequences of mode-collapse of normalizing flows in the context of lattice field theory.
We propose a metric to quantify the degree of mode-collapse and derive a bound on the resulting bias.
arXiv Detail & Related papers (2023-02-27T19:00:22Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Resampling Base Distributions of Normalizing Flows [0.0]
We introduce a base distribution for normalizing flows based on learned rejection sampling.
We develop suitable learning algorithms based on both maximizing the log-likelihood and optimizing the reverse Kullback-Leibler divergence.
arXiv Detail & Related papers (2021-10-29T14:44:44Z)
- Invertible Flow Non Equilibrium sampling [10.068677972360318]
We introduce Invertible Flow Non Equilibrium Sampling (InFine).
InFine constructs unbiased estimators of expectations and, in particular, of normalizing constants.
It can be used to construct an evidence lower bound (ELBO), leading to a new class of Variational AutoEncoders (VAEs).
arXiv Detail & Related papers (2021-03-17T09:09:06Z)
- Integrable Nonparametric Flows [5.9774834479750805]
We introduce a method for reconstructing an infinitesimal normalizing flow given only an infinitesimal change to a probability distribution.
This reverses the conventional task of normalizing flows.
We discuss potential applications to problems in quantum Monte Carlo and machine learning.
arXiv Detail & Related papers (2020-12-03T16:19:52Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Gaussianization flows are universal approximators; because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Composing Normalizing Flows for Inverse Problems [89.06155049265641]
We propose a framework for approximate inference that estimates the target conditional as a composition of two flow models.
Our method is evaluated on a variety of inverse problems and is shown to produce high-quality samples with uncertainty quantification.
arXiv Detail & Related papers (2020-02-26T19:01:11Z)
- Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows can be combined with stochastic sampling to learn the transformation of a simple prior distribution into a complex target distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
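The last entry above (Wu, Köhler and Noé) is the construction the present paper builds on. As a reference point, here is a hedged reconstruction of the usual SNF path weight and reverse Kullback-Leibler training objective; the notation ($\mu$ for the latent prior, $\nu$ for the target, $T_k$ and $q_k$ for the layers) is illustrative rather than taken from either paper.

```latex
% Hedged reconstruction of the SNF path weight and training objective;
% notation is illustrative, not copied from either paper.
\[
  \log w(z_0 \to z_K)
  = \log \nu(z_K) - \log \mu(z_0) + \sum_{k=1}^{K} \Delta S_k ,
\]
\[
  \Delta S_k =
  \begin{cases}
    \log \bigl\lvert \det J_{T_k}(z_{k-1}) \bigr\rvert ,
      & \text{deterministic layer } z_k = T_k(z_{k-1}), \\[4pt]
    \log \dfrac{\tilde{q}_k(z_{k-1} \mid z_k)}{q_k(z_k \mid z_{k-1})} ,
      & \text{stochastic layer with forward kernel } q_k ,
  \end{cases}
\]
% Training minimizes the reverse-KL bound
% E_{forward paths}[ -log w ].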
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.