Accelerating Markov Chain Monte Carlo sampling with diffusion models
- URL: http://arxiv.org/abs/2309.01454v1
- Date: Mon, 4 Sep 2023 09:03:41 GMT
- Title: Accelerating Markov Chain Monte Carlo sampling with diffusion models
- Authors: N. T. Hunt-Smith, W. Melnitchouk, F. Ringer, N. Sato, A. W. Thomas, M. J. White
- Abstract summary: We introduce a novel method for accelerating Markov Chain Monte Carlo (MCMC) sampling by pairing a Metropolis-Hastings algorithm with a diffusion model.
We briefly review diffusion models in the context of image synthesis before providing a streamlined diffusion model tailored towards low-dimensional data arrays.
Our approach leads to a significant reduction in the number of likelihood evaluations required to obtain an accurate representation of the posterior.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Global fits of physics models require efficient methods for exploring
high-dimensional and/or multimodal posterior functions. We introduce a novel
method for accelerating Markov Chain Monte Carlo (MCMC) sampling by pairing a
Metropolis-Hastings algorithm with a diffusion model that can draw global
samples with the aim of approximating the posterior. We briefly review
diffusion models in the context of image synthesis before providing a
streamlined diffusion model tailored towards low-dimensional data arrays. We
then present our adapted Metropolis-Hastings algorithm which combines local
proposals with global proposals taken from a diffusion model that is regularly
trained on the samples produced during the MCMC run. Our approach leads to a
significant reduction in the number of likelihood evaluations required to
obtain an accurate representation of the Bayesian posterior across several
analytic functions, as well as for a physical example based on a global
analysis of parton distribution functions. Our method is extensible to other
MCMC techniques, and we briefly compare our method to similar approaches based
on normalizing flows. A code implementation can be found at
https://github.com/NickHunt-Smith/MCMC-diffusion.
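As a rough illustration of the sampling loop described in the abstract, the sketch below pairs local random-walk Metropolis-Hastings proposals with occasional global proposals drawn from a model that is periodically refit on the chain's history. This is not the authors' implementation (see the repository above for that): the Gaussian target, the PlaceholderGlobalModel standing in for the trained diffusion model, and the plain Metropolis acceptance ratio applied to global proposals are all simplifying assumptions.
```python
import numpy as np

def log_posterior(x):
    # Hypothetical target for the sketch: a standard multivariate normal.
    return -0.5 * float(np.sum(x * x))

class PlaceholderGlobalModel:
    """Stands in for the trained diffusion model: it simply resamples stored
    chain states, which preserves the control flow of the method."""
    def __init__(self):
        self.samples = None

    def fit(self, samples):
        self.samples = np.asarray(samples)

    def sample(self, rng):
        return self.samples[rng.integers(len(self.samples))]

def run_chain(dim=2, n_steps=5000, step_size=0.5, p_global=0.1,
              retrain_every=1000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    logp = log_posterior(x)
    model = PlaceholderGlobalModel()
    history = []
    for step in range(n_steps):
        if model.samples is not None and rng.random() < p_global:
            x_new = model.sample(rng)                         # global proposal
        else:
            x_new = x + step_size * rng.standard_normal(dim)  # local proposal
        logp_new = log_posterior(x_new)
        # Plain Metropolis ratio; correcting for the (intractable) density of
        # the global proposal is glossed over in this simplified sketch.
        if rng.random() < np.exp(min(0.0, logp_new - logp)):
            x, logp = x_new, logp_new
        history.append(np.array(x, copy=True))
        if (step + 1) % retrain_every == 0:
            model.fit(history)    # periodically "retrain" on the chain so far
    return np.array(history)

samples = run_chain()
print(samples.mean(axis=0), samples.std(axis=0))
```
The design point is that global proposals let the chain jump directly between well-separated posterior modes that a local random walk would cross only rarely, which is where the reduction in likelihood evaluations comes from.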
Related papers
- Understanding Reinforcement Learning-Based Fine-Tuning of Diffusion Models: A Tutorial and Review [63.31328039424469]
This tutorial provides a comprehensive survey of methods for fine-tuning diffusion models to optimize downstream reward functions.
We explain the application of various RL algorithms, including PPO, differentiable optimization, reward-weighted MLE, value-weighted sampling, and path consistency learning.
arXiv Detail & Related papers (2024-07-18T17:35:32Z)
- The Poisson Midpoint Method for Langevin Dynamics: Provably Efficient Discretization for Diffusion Models [9.392691963008385]
Langevin Monte Carlo (LMC) is the simplest and most studied algorithm for sampling with Langevin dynamics; a minimal update rule is sketched below.
We propose the Poisson Midpoint Method, which approximates small step-size LMC with large step sizes.
We show that it matches the quality of DDPM with 1000 neural network calls using just 50-80 neural network calls, and outperforms ODE-based methods at similar compute.
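For orientation, here is the plain unadjusted LMC update that the Poisson Midpoint Method accelerates, i.e., the small step-size baseline it approximates with fewer, larger steps. The Gaussian target and step size are assumptions made for the sketch, not details from the paper.
```python
import numpy as np

def grad_log_p(x):
    # Hypothetical target: a standard Gaussian, so grad log p(x) = -x.
    return -x

def lmc(x0, step=0.01, n_steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    traj = []
    for _ in range(n_steps):
        # x_{k+1} = x_k + eta * grad log p(x_k) + sqrt(2 * eta) * noise
        x = x + step * grad_log_p(x) \
              + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        traj.append(x.copy())
    return np.array(traj)

print(lmc([0.0, 0.0]).std(axis=0))  # close to 1 for the Gaussian target
```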
arXiv Detail & Related papers (2024-05-27T11:40:42Z)
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function.
We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods.
Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z)
- Reverse Diffusion Monte Carlo [19.35592726471155]
We propose a novel Monte Carlo sampling algorithm called reverse diffusion Monte Carlo (rdMC).
rdMC is distinct from Markov chain Monte Carlo (MCMC) methods.
arXiv Detail & Related papers (2023-07-05T05:42:03Z)
- Object based Bayesian full-waveform inversion for shear elastography [0.0]
We develop a computational framework to quantify uncertainty in shear elastography imaging of anomalies in tissues.
We find the posterior probability of parameter fields representing the geometry of the anomalies and their shear moduli.
We demonstrate the approach on synthetic two-dimensional tests with smooth and irregular shapes.
arXiv Detail & Related papers (2023-05-11T08:25:25Z)
- Fast Inference in Denoising Diffusion Models via MMD Finetuning [23.779985842891705]
We present MMD-DDM, a novel method for fast sampling of diffusion models.
Our approach is based on the idea of using the Maximum Mean Discrepancy (MMD) to finetune the learned distribution within a given budget of timesteps.
Our findings show that the proposed method produces high-quality samples in a fraction of the time required by widely used diffusion models.
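For context, the sketch below is a standalone unbiased estimator of squared MMD between two sample sets with an RBF kernel. It only illustrates the statistic itself; how MMD-DDM optimizes it through the diffusion model's sampling path is not reproduced here, and the kernel choice and bandwidth are assumptions.
```python
import numpy as np

def rbf_kernel(a, b, bandwidth=1.0):
    # Pairwise squared distances, then the Gaussian (RBF) kernel.
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Unbiased estimate of squared MMD between sample sets x and y.
    k_xx = rbf_kernel(x, x, bandwidth)
    k_yy = rbf_kernel(y, y, bandwidth)
    k_xy = rbf_kernel(x, y, bandwidth)
    n, m = len(x), len(y)
    return ((k_xx.sum() - np.trace(k_xx)) / (n * (n - 1))
            + (k_yy.sum() - np.trace(k_yy)) / (m * (m - 1))
            - 2.0 * k_xy.mean())

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 2))
y = rng.normal(size=(200, 2))
print(mmd2(x, y))  # near zero when both sets share a distribution
```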
arXiv Detail & Related papers (2023-01-19T09:48:07Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Oops I Took A Gradient: Scalable Sampling for Discrete Distributions [53.3142984019796]
We show that this gradient-based approach outperforms generic samplers in a number of difficult settings.
We also demonstrate the use of our improved sampler for training deep energy-based models on high dimensional discrete data.
arXiv Detail & Related papers (2021-02-08T20:08:50Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
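As background for this entry, here is a bare-bones Hamiltonian (Hybrid) Monte Carlo step for a continuous target; the SurVAE-flow augmentation that carries such samplers over to discrete spaces is not reproduced here. The standard-Gaussian target and leapfrog settings are illustrative assumptions.
```python
import numpy as np

def log_p(x):
    # Illustrative target: a standard Gaussian.
    return -0.5 * float(x @ x)

def grad_log_p(x):
    return -x

def hmc_step(x, rng, step=0.1, n_leapfrog=20):
    p = rng.standard_normal(x.shape)             # resample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * step * grad_log_p(x_new)      # initial half kick
    for _ in range(n_leapfrog - 1):
        x_new += step * p_new                    # drift
        p_new += step * grad_log_p(x_new)        # full kick
    x_new += step * p_new                        # final drift
    p_new += 0.5 * step * grad_log_p(x_new)      # final half kick
    # Metropolis correction on the joint energy H(x, p) = -log p(x) + |p|^2/2.
    log_accept = (log_p(x_new) - 0.5 * p_new @ p_new) \
               - (log_p(x) - 0.5 * p @ p)
    return x_new if rng.random() < np.exp(min(0.0, log_accept)) else x

rng = np.random.default_rng(0)
x, samples = np.zeros(2), []
for _ in range(2000):
    x = hmc_step(x, rng)
    samples.append(x)
print(np.std(samples, axis=0))  # close to 1 for the Gaussian target
```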
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Model Fusion with Kullback-Leibler Divergence [58.20269014662046]
We propose a method to fuse posterior distributions learned from heterogeneous datasets.
Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors.
arXiv Detail & Related papers (2020-07-13T03:27:45Z)