Stochastic Normalizing Flows
- URL: http://arxiv.org/abs/2002.06707v3
- Date: Mon, 26 Oct 2020 11:28:47 GMT
- Title: Stochastic Normalizing Flows
- Authors: Hao Wu, Jonas Köhler and Frank Noé
- Abstract summary: We show that normalizing flows can be used to learn the transformation of a simple prior distribution into a given target distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
- Score: 2.323220706791067
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The sampling of probability distributions specified up to a normalization
constant is an important problem in both machine learning and statistical
mechanics. While classical stochastic sampling methods such as Markov Chain
Monte Carlo (MCMC) or Langevin Dynamics (LD) can suffer from slow mixing times,
there is a growing interest in using normalizing flows in order to learn the
transformation of a simple prior distribution to the given target distribution.
Here we propose a generalized and combined approach to sample target densities:
Stochastic Normalizing Flows (SNF) -- an arbitrary sequence of deterministic
invertible functions and stochastic sampling blocks. We show that stochasticity
overcomes expressivity limitations of normalizing flows resulting from the
invertibility constraint, whereas trainable transformations between sampling
steps improve efficiency of pure MCMC/LD along the flow. By invoking ideas from
non-equilibrium statistical mechanics we derive an efficient training procedure
by which both the sampler's and the flow's parameters can be optimized
end-to-end, and by which we can compute exact importance weights without having
to marginalize out the randomness of the stochastic blocks. We illustrate the
representational power, sampling efficiency and asymptotic correctness of SNFs
on several benchmarks including applications to sampling molecular systems in
equilibrium.
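In code terms, the forward pass of an SNF alternates deterministic invertible layers with stochastic sampling blocks and accumulates a log importance weight along each sample path: deterministic blocks contribute their log-Jacobian determinants, while a stochastic block in detailed balance with an annealed intermediate density adds no term beyond the density-ratio increment between stages. The following is a minimal Python sketch of this accounting; the fixed elementwise affine layers, the random-walk Metropolis kernel, the two-stage annealing schedule, and the double-well `log_target` are illustrative assumptions, not the paper's trained architecture or benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):
    # standard normal prior (up to an additive constant)
    return -0.5 * np.sum(x**2, axis=-1)

def log_target(x):
    # toy unnormalized target: a separable double well, bimodal per coordinate
    return -np.sum((x**2 - 1.0)**2, axis=-1)

def affine_layer(x, log_scale, shift):
    """Deterministic invertible block: y = x * exp(log_scale) + shift.
    Returns y and the per-sample log |det Jacobian|."""
    y = x * np.exp(log_scale) + shift
    return y, np.full(x.shape[0], np.sum(log_scale))

def metropolis_block(x, log_pi, n_steps=10, eps=0.3):
    """Stochastic block: random-walk Metropolis targeting log_pi.
    Being in detailed balance with log_pi, it adds no weight term."""
    for _ in range(n_steps):
        prop = x + eps * rng.standard_normal(x.shape)
        accept = np.log(rng.random(x.shape[0])) < log_pi(prop) - log_pi(x)
        x = np.where(accept[:, None], prop, x)
    return x

def snf_sample(n_samples=1000):
    x = rng.standard_normal((n_samples, 2))          # x_0 ~ prior
    log_w = np.zeros(n_samples)
    # annealing schedule from prior (lambda = 0) to target (lambda = 1)
    log_pis = [log_prior] + [
        lambda z, l=l: (1 - l) * log_prior(z) + l * log_target(z)
        for l in (0.5, 1.0)
    ]
    log_scales = [np.array([0.3, 0.3]), np.array([0.1, 0.1])]  # stand-ins for trained parameters
    for t in range(1, len(log_pis)):
        y, log_det = affine_layer(x, log_scales[t - 1], shift=0.0)
        # path-weight increment: density ratio across the stage plus the Jacobian
        log_w += log_pis[t](y) - log_pis[t - 1](x) + log_det
        x = metropolis_block(y, log_pis[t])          # invariant under log_pis[t]
    return x, log_w

if __name__ == "__main__":
    x, log_w = snf_sample()
    w = np.exp(log_w - log_w.max())                  # self-normalized importance weights
    print("E[x^2] under target ~", (w[:, None] * x**2).sum(0) / w.sum())
```

With identity deterministic layers this reduces to plain annealed importance sampling; dropping the stochastic blocks recovers an ordinary normalizing flow. In the paper, the flow (and, where applicable, sampler) parameters are optimized end-to-end rather than fixed as here.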
Related papers
- Asymptotically Optimal Change Detection for Unnormalized Pre- and Post-Change Distributions [65.38208224389027]
This paper addresses the problem of detecting changes when only unnormalized pre- and post-change distributions are accessible.
Our approach is based on estimating the Cumulative Sum (CUSUM) statistic, which is known to produce optimal performance.
arXiv Detail & Related papers (2024-10-18T17:13:29Z)
- NETS: A Non-Equilibrium Transport Sampler [15.58993313831079]
We propose an algorithm, termed the Non-Equilibrium Transport Sampler (NETS).
NETS can be viewed as a variant of annealed importance sampling (AIS) based on Jarzynski's equality, in which the non-equilibrium sampling dynamics are augmented with an additional learned drift term.
We show that this drift is the minimizer of a variety of objective functions, which can all be estimated in an unbiased fashion.
arXiv Detail & Related papers (2024-10-03T17:35:38Z)
- Model-Free Stochastic Process Modeling and Optimization using Normalizing Flows [0.0]
This work proposes using conditional normalizing flows as discrete-time models to learn the dynamics of chemical processes.
The normalizing flow yields stable simulations over long time horizons and high-quality results in probabilistic and model predictive control (MPC) formulations for open-loop control.
arXiv Detail & Related papers (2024-09-26T08:28:14Z)
- Weak Generative Sampler to Efficiently Sample Invariant Distribution of Stochastic Differential Equation [8.67581853745823]
Current deep learning-based methods solve the stationary Fokker--Planck equation to determine the invariant probability density function in the form of deep neural networks.
We introduce a framework that employs a weak generative sampler (WGS) to directly generate independent and identically distributed (iid) samples.
Our proposed loss function is based on the weak form of the Fokker--Planck equation, integrating normalizing flows to characterize the invariant distribution.
arXiv Detail & Related papers (2024-05-29T16:41:42Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences (a minimal AIS sketch appears after this list).
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Equivariant flow matching [0.9208007322096533]
We introduce equivariant flow matching, a new training objective for equivariant continuous normalizing flows (CNFs).
Equivariant flow matching exploits the physical symmetries of the target energy for efficient, simulation-free training of equivariant CNFs.
Our results show that the equivariant flow matching objective yields flows with shorter integration paths, improved sampling efficiency, and higher scalability compared to existing methods.
arXiv Detail & Related papers (2023-06-26T19:40:10Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint [0.45119235878273]
We consider normalizing flows from a Markov chain point of view.
We replace transition densities by general Markov kernels and establish proofs via Radon-Nikodym derivatives.
The performance of the proposed conditional normalizing flow is demonstrated by numerical examples.
arXiv Detail & Related papers (2021-09-23T13:44:36Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because their expressivity is guaranteed, these flows can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
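Since several entries above (NETS, Constant Rate AIS) build on annealed importance sampling, which SNFs also generalize, here is a minimal sketch of plain AIS with a fixed linear schedule; the toy densities, the schedule, and the Metropolis kernel are illustrative assumptions and do not reproduce the cited papers' constant-rate or learned-drift constructions.

```python
import numpy as np

rng = np.random.default_rng(1)
log_p0 = lambda x: -0.5 * np.sum(x**2, axis=-1)        # tractable base density
log_p1 = lambda x: -np.sum((x**2 - 1.0)**2, axis=-1)   # unnormalized target

def ais(n=2000, betas=np.linspace(0.0, 1.0, 11), n_mcmc=5, eps=0.3):
    x = rng.standard_normal((n, 2))                    # x ~ p0
    log_w = np.zeros(n)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # weight increment: log pi_b(x) - log pi_{b_prev}(x)
        log_w += (b - b_prev) * (log_p1(x) - log_p0(x))
        log_pi = lambda z, b=b: (1 - b) * log_p0(z) + b * log_p1(z)
        for _ in range(n_mcmc):                        # Metropolis moves invariant under pi_b
            prop = x + eps * rng.standard_normal(x.shape)
            acc = np.log(rng.random(n)) < log_pi(prop) - log_pi(x)
            x = np.where(acc[:, None], prop, x)
    return x, log_w
```

Averaging exp(log_w) estimates the ratio of normalizing constants (a discrete-time instance of Jarzynski's equality), while self-normalizing the weights yields consistent estimates of expectations under the target.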
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.