Invertible Flow Non Equilibrium Sampling
- URL: http://arxiv.org/abs/2103.10943v1
- Date: Wed, 17 Mar 2021 09:09:06 GMT
- Title: Invertible Flow Non Equilibrium Sampling
- Authors: Achille Thin (CMAP), Yazid Janati (IP Paris, TIPIC-SAMOVAR, CITI),
Sylvain Le Corff (IP Paris, TIPIC-SAMOVAR, CITI), Charles Ollion (CMAP),
Arnaud Doucet, Alain Durmus (CMLA), Eric Moulines (CMAP), Christian Robert
(CEREMADE)
- Abstract summary: Invertible Flow Non Equilibrium Sampling (InFine)
constructs unbiased estimators of expectations and, in particular, of normalizing constants.
It can also be used to construct an Evidence Lower Bound (ELBO), leading to a new class of Variational AutoEncoders (VAE).
- Score: 10.068677972360318
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Simultaneously sampling from a complex distribution with intractable
normalizing constant and approximating expectations under this distribution is
a notoriously challenging problem. We introduce a novel scheme, Invertible Flow
Non Equilibrium Sampling (InFine), which departs from classical Sequential
Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) approaches. InFine
constructs unbiased estimators of expectations and in particular of normalizing
constants by combining the orbits of a deterministic transform started from
random initializations. When this transform is chosen as an appropriate
integrator of a conformal Hamiltonian system, these orbits are optimization
paths. InFine is also naturally suited to design new MCMC sampling schemes by
selecting samples on the optimization paths. Additionally, InFine can be used to
construct an Evidence Lower Bound (ELBO) leading to a new class of Variational
AutoEncoders (VAE).
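The core mechanism — pushing random initializations through a deterministic invertible transform and correcting the importance weight by the Jacobian — can be illustrated with a minimal one-step toy. This is a hedged sketch, not the paper's algorithm: InFine combines whole orbits of a conformal Hamiltonian integrator, whereas here `T` is a simple affine map chosen so the estimator can be checked against a known normalizing constant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized 1-D target p_tilde(x) = exp(-x^2 / 2); true Z = sqrt(2*pi).
def log_p_tilde(x):
    return -0.5 * x**2

# Simple proposal q = N(0, s^2) for the random initializations.
s = 2.0
def log_q(x):
    return -0.5 * (x / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))

# Deterministic invertible transform T(x) = a*x + b; log|det J_T| = log(a).
a, b = 0.7, 0.3

x0 = rng.normal(0.0, s, size=200_000)            # random initializations
x1 = a * x0 + b                                  # one deterministic step
log_w = log_p_tilde(x1) + np.log(a) - log_q(x0)  # Jacobian-corrected weight

Z_hat = np.exp(log_w).mean()                     # unbiased: E[w] = Z
print(Z_hat, np.sqrt(2 * np.pi))                 # both close to 2.5066
```

The estimator is unbiased for any fixed invertible `T`; the method's point is that choosing `T` as an optimization-like integrator makes the weights concentrate where the target has mass.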
Related papers
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
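As a point of reference, vanilla AIS (not the constant-rate variant proposed in that paper) can be sketched in a few lines; the target, geometric schedule, and Metropolis kernel below are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Anneal from q0 = N(0, 1) to the unnormalized target
# p_tilde(x) = exp(-(x - 3)^2 / 2), whose true Z is sqrt(2*pi).
def log_q0(x):
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_p_tilde(x):
    return -0.5 * (x - 3.0) ** 2

def log_gamma(x, beta):
    # Geometric path gamma_beta = q0^(1 - beta) * p_tilde^beta.
    return (1.0 - beta) * log_q0(x) + beta * log_p_tilde(x)

betas = np.linspace(0.0, 1.0, 51)
n = 20_000
x = rng.normal(size=n)
log_w = np.zeros(n)
for b_prev, b in zip(betas[:-1], betas[1:]):
    # Accumulate the incremental importance weight at the new temperature...
    log_w += log_gamma(x, b) - log_gamma(x, b_prev)
    # ...then move with one Metropolis step invariant for gamma_b.
    prop = x + rng.normal(scale=0.5, size=n)
    accept = np.log(rng.uniform(size=n)) < log_gamma(prop, b) - log_gamma(x, b)
    x = np.where(accept, prop, x)

Z_hat = np.exp(log_w).mean()   # unbiased estimate of Z, approx 2.5066
```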
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Reconstructing the Universe with Variational self-Boosted Sampling [7.922637707393503]
Traditional algorithms such as Hamiltonian Monte Carlo (HMC) are computationally inefficient due to generating correlated samples.
Here we develop a hybrid scheme called variational self-boosted sampling (VBS) to mitigate the drawbacks of both algorithms.
VBS generates better quality of samples than simple VI approaches and reduces the correlation length in the sampling phase by a factor of 10-50 over using only HMC.
arXiv Detail & Related papers (2022-06-28T21:30:32Z)
- Continual Repeated Annealed Flow Transport Monte Carlo [93.98285297760671]
We propose Continual Repeated Annealed Flow Transport Monte Carlo (CRAFT)
It combines a sequential Monte Carlo sampler with variational inference using normalizing flows.
We show that CRAFT can achieve impressively accurate results on a lattice field example.
arXiv Detail & Related papers (2022-01-31T10:58:31Z)
- Stochastic Normalizing Flows for Inverse Problems: a Markov Chains Viewpoint [0.45119235878273]
We consider normalizing flows from a Markov chain point of view.
We replace transition densities by general Markov kernels and establish proofs via Radon-Nikodym derivatives.
The performance of the proposed conditional normalizing flow is demonstrated by numerical examples.
arXiv Detail & Related papers (2021-09-23T13:44:36Z)
- Deterministic Gibbs Sampling via Ordinary Differential Equations [77.42706423573573]
This paper presents a general construction of deterministic measure-preserving dynamics using autonomous ODEs and tools from differential geometry.
We show how Hybrid Monte Carlo and other deterministic samplers follow as special cases of our theory.
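The deterministic building block shared by these samplers can be illustrated with the leapfrog integrator used in Hybrid Monte Carlo: a volume-preserving, time-reversible map. This is a hedged toy for a standard-normal target, not the paper's general measure-preserving construction.

```python
# Leapfrog integrator for H(x, v) = U(x) + v^2/2 with U(x) = x^2/2
# (standard-normal target): a deterministic, volume-preserving,
# time-reversible map -- the Hybrid Monte Carlo special case.
def grad_U(x):
    return x

def leapfrog(x, v, step=0.1, n_steps=10):
    v = v - 0.5 * step * grad_U(x)   # initial half-step on momentum
    for _ in range(n_steps - 1):
        x = x + step * v             # full position step
        v = v - step * grad_U(x)     # full momentum step
    x = x + step * v
    v = v - 0.5 * step * grad_U(x)   # final half-step on momentum
    return x, v

# Time reversibility: flip the momentum, integrate again, and you
# return exactly to the starting point (up to floating point).
x1, v1 = leapfrog(1.0, 0.5)
x0, v0 = leapfrog(x1, -v1)
```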
arXiv Detail & Related papers (2021-06-18T15:36:09Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Efficient MCMC Sampling for Bayesian Matrix Factorization by Breaking Posterior Symmetries [1.3858051019755282]
We propose a simple modification to the prior choice that provably breaks these symmetries and maintains/improves accuracy.
We show that using non-zero linearly independent prior means significantly lowers the autocorrelation of MCMC samples, and can also lead to lower reconstruction errors.
arXiv Detail & Related papers (2020-06-08T00:25:48Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
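The change-of-variables identity underlying such flow models — exact likelihoods via the Jacobian, sampling via inversion — in a minimal sketch, with a single affine layer standing in for a learned Gaussianization transform (illustrative only, not the paper's architecture):

```python
import numpy as np

# Change of variables: if z = f(x) is invertible and z ~ N(0, 1), then
# log p(x) = log N(f(x); 0, 1) + log |f'(x)|.
a, b = 2.0, -1.0               # parameters of the toy affine layer

def f(x):                      # forward pass: data -> base space
    return a * x + b

def f_inv(z):                  # exact inverse: base -> data space
    return (z - b) / a

def log_prob(x):               # exact likelihood via the Jacobian term
    z = f(x)
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi) + np.log(abs(a))

# Sampling is just inverting the flow on base samples.
rng = np.random.default_rng(0)
x_samples = f_inv(rng.normal(size=5))
```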
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows can be used to learn the transformation of a simple prior distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks including applications to molecular sampling systems in equilibrium.
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.