Reverse Diffusion Monte Carlo
- URL: http://arxiv.org/abs/2307.02037v3
- Date: Wed, 13 Mar 2024 08:11:18 GMT
- Title: Reverse Diffusion Monte Carlo
- Authors: Xunpeng Huang, Hanze Dong, Yifan Hao, Yi-An Ma, Tong Zhang
- Abstract summary: We propose a novel Monte Carlo sampling algorithm called reverse diffusion Monte Carlo (rdMC).
rdMC is distinct from Markov chain Monte Carlo (MCMC) methods.
- Score: 19.35592726471155
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a Monte Carlo sampler from the reverse diffusion process. Unlike
the practice of diffusion models, where the intermediary updates -- the score
functions -- are learned with a neural network, we transform the score matching
problem into a mean estimation one. By estimating the means of the regularized
posterior distributions, we derive a novel Monte Carlo sampling algorithm
called reverse diffusion Monte Carlo (rdMC), which is distinct from the Markov
chain Monte Carlo (MCMC) methods. We determine the sample size from the error
tolerance and the properties of the posterior distribution to yield an
algorithm that can approximately sample the target distribution with any
desired accuracy. Additionally, we demonstrate and prove under suitable
conditions that sampling with rdMC can be significantly faster than that with
MCMC. For multi-modal target distributions such as those in Gaussian mixture
models, rdMC greatly improves over the Langevin-style MCMC sampling methods
both theoretically and in practice. The proposed rdMC method offers a new
perspective and a solution beyond classical MCMC algorithms for challenging
complex distributions.
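The pipeline described in the abstract, estimate the mean of the regularized posterior by Monte Carlo, convert it to a score, and plug that score into the reverse diffusion, can be sketched in a few lines. The following is a minimal 1-D illustration under stated assumptions, not the paper's implementation: the Ornstein-Uhlenbeck forward process, the toy two-mode target, the self-normalized importance-sampling estimator, and all step sizes are choices made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy target: a 1-D two-mode Gaussian mixture pi(x) ∝ exp(-f(x)),
# with modes at x = ±3 (stands in for the multi-modal targets in the abstract).
def neg_log_target(x):
    return -np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def score_estimate(x, t, n_mc=512):
    """Monte Carlo estimate of the score of the OU marginal p_t at x.

    The regularized posterior of X_0 given X_t = x is proportional to
    pi(x0) * N(x0; e^t x, e^{2t} - 1), so we propose from that Gaussian
    and reweight by exp(-f(x0)) (self-normalized importance sampling).
    """
    mean = np.exp(t) * x
    std = np.sqrt(np.expm1(2.0 * t))
    x0 = mean + std * rng.standard_normal(n_mc)
    logw = -neg_log_target(x0)
    w = np.exp(logw - logw.max())
    m = np.sum(w * x0) / np.sum(w)            # posterior-mean estimate
    # Tweedie-style identity: score = (e^{-t} E[X_0|X_t=x] - x) / (1 - e^{-2t})
    return (np.exp(-t) * m - x) / -np.expm1(-2.0 * t)

def rdmc_sample(T=4.0, n_steps=200):
    """Euler-Maruyama discretization of the reverse OU SDE."""
    x = rng.standard_normal()                 # X_T ≈ N(0, 1), the OU stationary law
    dt = T / n_steps
    for k in range(n_steps):
        t = T - k * dt                        # stops at t = dt > 0 to avoid blow-up
        x = x + (x + 2.0 * score_estimate(x, t)) * dt \
              + np.sqrt(2.0 * dt) * rng.standard_normal()
    return x

samples = np.array([rdmc_sample() for _ in range(50)])
```

Swapping `neg_log_target` for any unnormalized negative log-density leaves the sampler unchanged, which is the sense in which rdMC replaces score learning with mean estimation.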
Related papers
- Convergence Bounds for Sequential Monte Carlo on Multimodal Distributions using Soft Decomposition [6.872242798058046]
We prove bounds on the variance of a function $f$ under the empirical measure of the samples obtained by the Sequential Monte Carlo (SMC) algorithm.
We show that bounds can be obtained in the truly multi-modal setting, with mixing times that depend on local MCMC dynamics.
arXiv Detail & Related papers (2024-05-29T22:43:45Z)
- Zeroth-Order Sampling Methods for Non-Log-Concave Distributions: Alleviating Metastability by Denoising Diffusion [13.832359344815043]
This paper considers the problem of sampling from a non-log-concave distribution, based on queries of its unnormalized density.
It first describes a framework, Denoising Diffusion Monte Carlo (DDMC), based on the simulation of a denoising diffusion process with its score approximated by a Monte Carlo estimator that requires a sampling oracle.
We provide an implementation of this oracle based on rejection sampling, which turns DDMC into a true algorithm, Zeroth-Order Diffusion Monte Carlo (ZOD-MC).
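A rejection-sampling oracle of this kind can be sketched generically. The toy density `rho`, the crude bound `RHO_MAX`, and the Ornstein-Uhlenbeck noising schedule below are illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Unnormalized target density (toy 1-D bimodal example, modes near ±2).
def rho(x):
    return np.exp(-0.5 * (x - 2.0) ** 2) + np.exp(-0.5 * (x + 2.0) ** 2)

RHO_MAX = 2.0  # crude but valid upper bound on rho (each term is at most 1)

def posterior_oracle(x, t):
    """One exact draw from the denoising posterior p(x0 | X_t = x).

    Under an OU forward process, the Gaussian factor of the posterior is
    N(e^t x, e^{2t} - 1); proposing from it and accepting with probability
    rho(x0) / RHO_MAX targets rho(x0) * Gaussian, i.e. the posterior.
    """
    mean = np.exp(t) * x
    std = np.sqrt(np.expm1(2.0 * t))
    while True:
        x0 = mean + std * rng.standard_normal()
        if rng.uniform() < rho(x0) / RHO_MAX:
            return x0

draws = np.array([posterior_oracle(0.0, 1.0) for _ in range(200)])
```

The oracle is "zeroth-order" in the sense that it only evaluates `rho`, never its gradient.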
arXiv Detail & Related papers (2024-02-27T21:00:00Z)
- Combining Normalizing Flows and Quasi-Monte Carlo [0.0]
Recent advances in machine learning have led to the development of new methods for enhancing Monte Carlo methods.
We demonstrate through numerical experiments that this combination can lead to an estimator with significantly lower variance than if the flow were sampled with classic Monte Carlo.
arXiv Detail & Related papers (2024-01-11T14:17:06Z)
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Accelerating Markov Chain Monte Carlo sampling with diffusion models [0.0]
We introduce a novel method for accelerating Markov Chain Monte Carlo (MCMC) sampling by pairing a Metropolis-Hastings algorithm with a diffusion model.
We briefly review diffusion models in the context of image synthesis before providing a streamlined diffusion model tailored towards low-dimensional data arrays.
Our approach leads to a significant reduction in the number of likelihood evaluations required to obtain an accurate representation of the posterior.
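This style of pairing, using a generative model as an independence proposal inside Metropolis-Hastings, can be sketched generically. The Gaussian stand-in for the diffusion model and the toy target below are hypothetical; a trained model would supply both the sampler and an (approximate) log-density:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Unnormalized log posterior (toy stand-in for an expensive likelihood)
    return -0.5 * x ** 2

# Hypothetical stand-in for a trained diffusion model: a wide Gaussian whose
# density we can evaluate. A real model would replace both functions.
def proposal_sample():
    return 2.0 * rng.standard_normal()

def proposal_logpdf(x):
    return -0.5 * (x / 2.0) ** 2 - np.log(2.0 * np.sqrt(2.0 * np.pi))

def independence_mh(n_steps=2000):
    x, out = 0.0, []
    for _ in range(n_steps):
        y = proposal_sample()
        # Independence-sampler acceptance ratio: pi(y) q(x) / (pi(x) q(y))
        log_alpha = (log_target(y) - log_target(x)
                     + proposal_logpdf(x) - proposal_logpdf(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        out.append(x)
    return np.array(out)

chain = independence_mh()
```

The better the proposal matches the posterior, the higher the acceptance rate, which is how a good generative proposal cuts the number of likelihood evaluations.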
arXiv Detail & Related papers (2023-09-04T09:03:41Z)
- Bayesian Decision Trees Inspired from Evolutionary Algorithms [64.80360020499555]
We propose replacing Markov Chain Monte Carlo (MCMC) with an inherently parallel algorithm, Sequential Monte Carlo (SMC).
Experiments show that SMC combined with Evolutionary Algorithms (EA) can produce more accurate results than MCMC in 100 times fewer iterations.
arXiv Detail & Related papers (2023-05-30T06:17:35Z)
- Low-variance estimation in the Plackett-Luce model via quasi-Monte Carlo sampling [58.14878401145309]
We develop a novel approach to producing more sample-efficient estimators of expectations in the PL model.
We illustrate our findings both theoretically and empirically using real-world recommendation data from Amazon Music and the Yahoo learning-to-rank challenge.
arXiv Detail & Related papers (2022-05-12T11:15:47Z)
- Continual Repeated Annealed Flow Transport Monte Carlo [93.98285297760671]
We propose Continual Repeated Annealed Flow Transport Monte Carlo (CRAFT).
It combines a sequential Monte Carlo sampler with variational inference using normalizing flows.
We show that CRAFT can achieve impressively accurate results on a lattice field example.
arXiv Detail & Related papers (2022-01-31T10:58:31Z)
- Annealed Flow Transport Monte Carlo [91.20263039913912]
Annealed Flow Transport (AFT) builds upon Annealed Importance Sampling (AIS) and Sequential Monte Carlo (SMC).
AFT relies on normalizing flows (NF), which are learned sequentially to push particles towards the successive targets.
We show that a continuous-time scaling limit of the population version of AFT is given by a Feynman--Kac measure.
arXiv Detail & Related papers (2021-02-15T12:05:56Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Non-convex Learning via Replica Exchange Stochastic Gradient MCMC [25.47669573608621]
We propose an adaptive replica exchange SGMCMC (reSGMCMC) to automatically correct the bias and study the corresponding properties.
Empirically, we test the algorithm through extensive experiments on various setups.
arXiv Detail & Related papers (2020-08-12T15:02:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.