Continuously-Tempered PDMP Samplers
- URL: http://arxiv.org/abs/2205.09559v1
- Date: Thu, 19 May 2022 13:32:44 GMT
- Title: Continuously-Tempered PDMP Samplers
- Authors: Matthew Sutton, Robert Salomone, Augustin Chevallier, Paul Fearnhead
- Abstract summary: We show how tempering ideas can improve the mixing of piecewise deterministic Markov processes (PDMPs).
We introduce an extended distribution defined jointly over the state of the posterior distribution and an inverse temperature.
We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution.
- Score: 2.294014185517203
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: New sampling algorithms based on simulating continuous-time stochastic
processes called piecewise deterministic Markov processes (PDMPs) have shown
considerable promise. However, these methods can struggle to sample from
multi-modal or heavy-tailed distributions. We show how tempering ideas can
improve the mixing of PDMPs in such cases. We introduce an extended
distribution defined over the state of the posterior distribution and an
inverse temperature, which interpolates between a tractable distribution when
the inverse temperature is 0 and the posterior when the inverse temperature is
1. The marginal distribution of the inverse temperature is a mixture of a
continuous distribution on [0,1) and a point mass at 1: samples obtained when
the inverse temperature equals 1 are draws from the posterior, while the
sampler also explores distributions at lower temperatures, which improves
mixing. We show how PDMPs, and particularly the
Zig-Zag sampler, can be implemented to sample from such an extended
distribution. The resulting algorithm is easy to implement and we show
empirically that it can outperform existing PDMP-based samplers on challenging
multimodal posteriors.
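To make the base algorithm concrete, below is a minimal sketch of a Zig-Zag sampler using constant-bound Poisson thinning. All names (grad_U, rate_bound) and the Gaussian example are illustrative assumptions, not the authors' code. The paper's tempered sampler additionally augments the state with an inverse temperature coordinate and targets the extended distribution described above; a common tempering path is pi_beta(x) ∝ pi_0(x)^(1-beta) pi(x)^beta with pi_0 tractable, but see the paper for its exact construction, which this sketch omits.

```python
import numpy as np

def zigzag(grad_U, x0, T, rate_bound, rng=None):
    """Minimal Zig-Zag sampler via Poisson thinning (illustrative sketch).

    grad_U     : callable returning the gradient of U(x) = -log pi(x)
    x0         : initial position, shape (d,)
    T          : continuous-time horizon
    rate_bound : constant M with M >= max(0, v_i * dU_i(x)) along the
                 trajectory; practical implementations use tighter,
                 state-dependent bounds
    Returns the event skeleton as a list of (time, position, velocity).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    d = x.size
    v = rng.choice([-1.0, 1.0], size=d)   # Zig-Zag velocities are +/-1
    t = 0.0
    skeleton = [(t, x.copy(), v.copy())]
    while t < T:
        # Candidate events from d independent Poisson clocks of rate M.
        taus = rng.exponential(1.0 / rate_bound, size=d)
        i = int(np.argmin(taus))
        x = x + v * taus[i]               # deterministic linear drift
        t += taus[i]
        # Thinning: accept the velocity flip with probability true_rate / M.
        true_rate = max(0.0, v[i] * grad_U(x)[i])
        if rng.uniform() < true_rate / rate_bound:
            v[i] = -v[i]
            skeleton.append((t, x.copy(), v.copy()))
    return skeleton

# Example: 2-d standard Gaussian, U(x) = ||x||^2 / 2, so grad_U(x) = x.
# The constant bound 10.0 is only valid while |x_i| <= 10.
skel = zigzag(grad_U=lambda x: x, x0=np.zeros(2), T=50.0, rate_bound=10.0)
```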
Related papers
- Enhancing Diffusion Posterior Sampling for Inverse Problems by Integrating Crafted Measurements [45.70011319850862]
Diffusion models have emerged as powerful foundation models for visual generation.
Current posterior-sampling-based methods incorporate the measurement into posterior sampling to infer the distribution of the target data.
We show that high-frequency information can be prematurely introduced during the early stages, which could induce larger posterior estimate errors.
We propose a novel diffusion posterior sampling method DPS-CM, which incorporates a Crafted Measurement.
arXiv Detail & Related papers (2024-11-15T00:06:57Z)
- Think Twice Before You Act: Improving Inverse Problem Solving With MCMC [40.5682961122897]
We propose Diffusion Posterior MCMC (DPMC) to solve inverse problems with pretrained diffusion models.
Our algorithm outperforms DPS with fewer evaluations across nearly all tasks, and is competitive among existing approaches.
arXiv Detail & Related papers (2024-09-13T06:10:54Z)
- Piecewise deterministic generative models [35.23259982653664]
We introduce a class of generative models based on piecewise deterministic Markov processes (PDMPs).
We show that jump rates and kernels of the corresponding time reversals admit explicit expressions depending on some conditional densities of the PDMP.
arXiv Detail & Related papers (2024-07-28T09:53:02Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on manifolds.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Stochastic Gradient Piecewise Deterministic Monte Carlo Samplers [3.487370856323828]
Recent work has suggested using Monte Carlo methods based on piecewise deterministic Markov processes (PDMPs) to sample from target distributions of interest.
We propose approximate simulation of PDMPs with sub-sampling for scalable sampling from posterior distributions.
We show these methods are easy to implement, present results on their approximation error, and demonstrate numerically that this class of algorithms has efficiency similar to gradient Langevin dynamics; the sub-sampling idea is sketched below.
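A rough, hypothetical illustration of the sub-sampling idea (names such as grad_neg_loglik_single are assumptions, not the authors' API): for a posterior over n data points, one uniformly chosen observation yields an unbiased estimate of the gradient entering a PDMP switching rate.

```python
import numpy as np

# Hypothetical sketch: U(x) = -log prior(x) - sum_j log p(y_j | x).
# A single random data point gives an unbiased estimate of grad U(x),
# which is plugged into the Zig-Zag switching rate for coordinate i.
# (Using an estimated rate inside thinning introduces an approximation
# error, which is what this line of work quantifies.)
def subsampled_rate(i, x, v, data, grad_neg_logprior,
                    grad_neg_loglik_single, rng):
    j = rng.integers(len(data))  # uniform random observation index
    grad_est = grad_neg_logprior(x) \
        + len(data) * grad_neg_loglik_single(x, data[j])
    return max(0.0, v[i] * grad_est[i])
```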
arXiv Detail & Related papers (2024-06-27T09:59:28Z)
- Boosting Diffusion Models with Moving Average Sampling in Frequency Domain [101.43824674873508]
Diffusion models rely on the current sample to denoise the next one, possibly resulting in denoising instability.
In this paper, we reinterpret the iterative denoising process as model optimization and leverage a moving average mechanism to ensemble all the prior samples.
We name the complete approach "Moving Average Sampling in Frequency domain" (MASF).
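Purely as a generic illustration of ensembling denoising iterates with a moving average (MASF itself performs the averaging in the frequency domain with its own weighting, which this toy sketch does not reproduce; denoise_step is hypothetical):

```python
def moving_average_sampling(x_T, denoise_step, timesteps, decay=0.9):
    """Generic exponential moving average over denoising iterates."""
    x, avg = x_T, x_T
    for t in timesteps:                      # e.g. T-1, ..., 0
        x = denoise_step(avg, t)             # denoise from the averaged state
        avg = decay * avg + (1 - decay) * x  # ensemble all prior iterates
    return avg
```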
arXiv Detail & Related papers (2024-03-26T16:57:55Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM) alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
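For context, a bare-bones version of standard AIS along a geometric path is sketched below; all function names are assumptions, and the paper's actual contribution, choosing the schedule adaptively at a constant rate of progress, is not implemented here.

```python
import numpy as np

def ais(sample_p0, log_p0, log_p1, betas, transition, n_particles, rng):
    """Vanilla AIS along the geometric path pi_b ∝ p0^(1-b) * p1^b.

    betas      : increasing schedule 0 = b_0 < ... < b_K = 1
    transition : MCMC kernel leaving pi_b invariant (user-supplied)
    Returns particles and log importance weights targeting p1.
    """
    x = sample_p0(n_particles, rng)
    log_w = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental importance weight between successive annealed targets.
        log_w += (b - b_prev) * (log_p1(x) - log_p0(x))
        x = transition(x, b, rng)  # move under the new target pi_b
    return x, log_w
```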
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers [90.45898746733397]
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current gradient.
arXiv Detail & Related papers (2023-03-06T18:59:19Z)
- Denoising Diffusion Samplers [41.796349001299156]
Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains.
We explore a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants.
While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling.
arXiv Detail & Related papers (2023-02-27T14:37:16Z)
- Learning Energy-Based Models by Diffusion Recovery Likelihood [61.069760183331745]
We present a diffusion recovery likelihood method to tractably learn and sample from a sequence of energy-based models.
After training, synthesized images can be generated by a sampling process that initializes from a Gaussian white noise distribution.
On unconditional CIFAR-10 our method achieves FID 9.58 and inception score 8.30, superior to the majority of GANs.
arXiv Detail & Related papers (2020-12-15T07:09:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.