Combining Normalizing Flows and Quasi-Monte Carlo
- URL: http://arxiv.org/abs/2401.05934v1
- Date: Thu, 11 Jan 2024 14:17:06 GMT
- Title: Combining Normalizing Flows and Quasi-Monte Carlo
- Authors: Charly Andral
- Abstract summary: Recent advances in machine learning have led to the development of new methods for enhancing Monte Carlo methods.
We demonstrate through numerical experiments that this combination can lead to an estimator with significantly lower variance than if the flow were sampled with classic Monte Carlo.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advances in machine learning have led to the development of new
methods for enhancing Monte Carlo methods such as Markov chain Monte Carlo
(MCMC) and importance sampling (IS). One such method is normalizing flows,
which use a neural network to approximate a distribution by evaluating it
pointwise. Normalizing flows have been shown to improve the performance of MCMC
and IS. On the other hand, (randomized) quasi-Monte Carlo methods are used to
perform numerical integration. They replace the random sampling of Monte Carlo
with a sequence that covers the hypercube more uniformly, resulting in better
convergence rates for the error than plain Monte Carlo. In this work, we
combine these two methods by using quasi-Monte Carlo to sample the initial
distribution that is transported by the flow. We demonstrate through numerical
experiments that this combination can lead to an estimator with significantly
lower variance than if the flow were sampled with classic Monte Carlo.
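The core idea lends itself to a short sketch: draw a scrambled Sobol' sequence in the unit hypercube, map it to the flow's Gaussian base distribution via the inverse CDF, and push it through the flow before averaging the integrand. The "flow" below is a toy affine map standing in for a trained normalizing flow, and the integrand is illustrative; neither comes from the paper.

```python
import numpy as np
from scipy.stats import norm, qmc

rng = np.random.default_rng(0)

# Toy "flow": an affine map z -> mu + sigma * z, a hypothetical stand-in
# for a trained normalizing flow pushing the Gaussian base forward.
mu, sigma = 2.0, 0.5
flow = lambda z: mu + sigma * z

def estimate(base_samples):
    # Push base samples through the flow and average a test integrand.
    return np.mean(np.sin(flow(base_samples)))

n = 2 ** 12  # power of two keeps the Sobol' sequence balanced

# Plain Monte Carlo: i.i.d. Gaussian base samples.
mc_est = estimate(rng.standard_normal(n))

# Randomized quasi-Monte Carlo: scrambled Sobol' points in [0, 1),
# mapped to the Gaussian base via the inverse CDF.
sobol = qmc.Sobol(d=1, scramble=True, seed=0)
qmc_est = estimate(norm.ppf(sobol.random(n)).ravel())

print(mc_est, qmc_est)
```

Repeating the experiment over many scramblings and seeds and comparing empirical variances would reproduce the paper's kind of comparison; a single run only illustrates the sampling pipeline.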
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Reverse Diffusion Monte Carlo [19.35592726471155]
We propose a novel Monte Carlo sampling algorithm called reverse diffusion Monte Carlo (rdMC)
rdMC is distinct from the Markov chain Monte Carlo (MCMC) methods.
arXiv Detail & Related papers (2023-07-05T05:42:03Z) - Low-variance estimation in the Plackett-Luce model via quasi-Monte Carlo sampling [58.14878401145309]
We develop a novel approach to producing more sample-efficient estimators of expectations in the PL model.
We illustrate our findings both theoretically and empirically using real-world recommendation data from Amazon Music and the Yahoo learning-to-rank challenge.
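The Plackett-Luce entry above can be sketched with the Gumbel trick: perturbing item utilities with Gumbel noise and sorting yields a PL-distributed ranking, and the uniform variates feeding the Gumbel inverse CDF can be drawn from a scrambled Sobol' sequence instead of a pseudo-random generator. The utilities and item count below are made up for illustration; this is not that paper's estimator.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical utilities (log-weights) for 4 items in a Plackett-Luce model.
logits = np.array([1.0, 0.5, 0.0, -0.5])

def sample_rankings(u):
    # Gumbel trick: sorting logits + Gumbel noise in decreasing order
    # yields a ranking distributed according to the PL model.
    gumbel = -np.log(-np.log(u))          # inverse CDF of Gumbel(0, 1)
    return np.argsort(-(logits + gumbel), axis=1)

# Feed the Gumbel inverse CDF with scrambled Sobol' points instead of
# pseudo-random uniforms.
sobol = qmc.Sobol(d=4, scramble=True, seed=1)
rankings = sample_rankings(sobol.random(2 ** 10))

# Estimate the probability that item 0 is ranked first; in the PL model
# this is exp(logits[0]) / sum(exp(logits)).
p_hat = np.mean(rankings[:, 0] == 0)
print(p_hat)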
arXiv Detail & Related papers (2022-05-12T11:15:47Z) - Continual Repeated Annealed Flow Transport Monte Carlo [93.98285297760671]
We propose Continual Repeated Annealed Flow Transport Monte Carlo (CRAFT)
It combines a sequential Monte Carlo sampler with variational inference using normalizing flows.
We show that CRAFT can achieve impressively accurate results on a lattice field example.
arXiv Detail & Related papers (2022-01-31T10:58:31Z) - Antithetic Riemannian Manifold And Quantum-Inspired Hamiltonian Monte Carlo [3.686886131767452]
We present new algorithms which are antithetic versions of Hamiltonian Monte Carlo and Quantum-Inspired Hamiltonian Monte Carlo.
Adding antithetic sampling to Hamiltonian Monte Carlo has been previously shown to produce higher effective sample rates compared to vanilla Hamiltonian Monte Carlo.
The analysis is performed on jump diffusion process using real world financial market data, as well as on real world benchmark classification tasks using Bayesian logistic regression.
arXiv Detail & Related papers (2021-07-05T15:03:07Z) - Deterministic Gibbs Sampling via Ordinary Differential Equations [77.42706423573573]
This paper presents a general construction of deterministic measure-preserving dynamics using autonomous ODEs and tools from differential geometry.
We show how Hybrid Monte Carlo and other deterministic samplers follow as special cases of our theory.
arXiv Detail & Related papers (2021-06-18T15:36:09Z) - Annealed Flow Transport Monte Carlo [91.20263039913912]
Annealed Flow Transport (AFT) builds upon Annealed Importance Sampling (AIS) and Sequential Monte Carlo (SMC)
AFT relies on normalizing flows (NF), which are learned sequentially to push particles towards the successive targets.
We show that a continuous-time scaling limit of the population version of AFT is given by a Feynman--Kac measure.
arXiv Detail & Related papers (2021-02-15T12:05:56Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Connecting the Dots: Numerical Randomized Hamiltonian Monte Carlo with State-Dependent Event Rates [0.0]
We introduce a robust, easy to use and computationally fast alternative to conventional Markov chain Monte Carlo methods for continuous target distributions.
The proposed algorithm may yield large speedups and improvements in stability relative to relevant benchmarks.
Granted access to a high-quality ODE code, the proposed methodology is both easy to implement and use, even for highly challenging and high-dimensional target distributions.
arXiv Detail & Related papers (2020-05-04T06:23:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.