Diffusive Gibbs Sampling
- URL: http://arxiv.org/abs/2402.03008v5
- Date: Wed, 29 May 2024 10:20:04 GMT
- Title: Diffusive Gibbs Sampling
- Authors: Wenlin Chen, Mingtian Zhang, Brooks Paige, José Miguel Hernández-Lobato, David Barber
- Abstract summary: We propose Diffusive Gibbs Sampling (DiGS) for effective sampling from distributions characterized by distant and disconnected modes.
DiGS integrates recent developments in diffusion models, leveraging Gaussian convolution to create an auxiliary noisy distribution.
A novel Metropolis-within-Gibbs scheme is proposed to enhance mixing in the denoising sampling step.
- Score: 40.1197715949575
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The inadequate mixing of conventional Markov Chain Monte Carlo (MCMC) methods for multi-modal distributions presents a significant challenge in practical applications such as Bayesian inference and molecular dynamics. Addressing this, we propose Diffusive Gibbs Sampling (DiGS), an innovative family of sampling methods designed for effective sampling from distributions characterized by distant and disconnected modes. DiGS integrates recent developments in diffusion models, leveraging Gaussian convolution to create an auxiliary noisy distribution that bridges isolated modes in the original space and applying Gibbs sampling to alternately draw samples from both spaces. A novel Metropolis-within-Gibbs scheme is proposed to enhance mixing in the denoising sampling step. DiGS exhibits a better mixing property for sampling multi-modal distributions than state-of-the-art methods such as parallel tempering, attaining substantially improved performance across various tasks, including mixtures of Gaussians, Bayesian neural networks and molecular dynamics.
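The alternating scheme described in the abstract (a Gaussian-convolution step that bridges modes, followed by a Metropolis-within-Gibbs denoising step) can be illustrated with a self-contained sketch on a 1D bimodal target. This is a rough illustration only: the noise scale, the number of inner Metropolis steps, and the independence proposal centred at the noisy sample are assumed settings for the toy example, not the paper's actual choices.

```python
import math
import random

random.seed(0)

# Target: a well-separated 1D Gaussian mixture (modes at -10 and +10),
# a setting where plain random-walk MCMC mixes poorly.
MODES, S = (-10.0, 10.0), 1.0

def log_p(x):
    terms = [-0.5 * ((x - m) / S) ** 2 for m in MODES]
    mx = max(terms)
    return mx + math.log(sum(math.exp(t - mx) for t in terms))

ALPHA, SIGMA = 1.0, 6.0  # convolution scale and noise level (assumed values)

def digs_step(x):
    # Gibbs step 1: sample the auxiliary noisy variable
    # x_tilde ~ N(ALPHA * x, SIGMA^2)  (Gaussian convolution).
    x_tilde = ALPHA * x + SIGMA * random.gauss(0.0, 1.0)

    # Gibbs step 2 (denoising): Metropolis-within-Gibbs targeting
    # p(x | x_tilde) ∝ p(x) * N(x_tilde; ALPHA * x, SIGMA^2),
    # with an independence proposal centred at x_tilde / ALPHA.
    def log_post(z):
        return log_p(z) - 0.5 * ((x_tilde - ALPHA * z) / SIGMA) ** 2

    def log_q(z):  # unnormalised proposal density
        return -0.5 * ((z - x_tilde / ALPHA) / (SIGMA / ALPHA)) ** 2

    for _ in range(5):  # a few inner MH steps per denoising update
        xp = x_tilde / ALPHA + (SIGMA / ALPHA) * random.gauss(0.0, 1.0)
        log_acc = (log_post(xp) - log_post(x)) + (log_q(x) - log_q(xp))
        if math.log(random.random()) < log_acc:
            x = xp
    return x

x, samples = -10.0, []
for _ in range(5000):
    x = digs_step(x)
    samples.append(x)

# Despite starting in the left mode, the chain should visit both modes.
frac_right = sum(s > 0 for s in samples) / len(samples)
print(f"fraction of samples near the right mode: {frac_right:.2f}")
```

The key mechanism is visible in the denoising acceptance ratio: when the proposal lands in the other mode, the likelihood and proposal terms cancel and acceptance is governed by the prior ratio, so mode switches are cheap whenever the noisy variable drifts between modes.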
Related papers
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM)
iDEM alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains 2-5$\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Enhancing Score-Based Sampling Methods with Ensembles [0.0]
We introduce the underlying methodology, emphasizing its relationship with generative diffusion models and the previously introduced Föllmer sampler.
We demonstrate the efficacy of ensemble strategies through various examples, including low- to medium-dimensionality sampling problems.
Our findings highlight the potential of ensemble strategies for modeling complex probability distributions in situations where gradients are unavailable.
arXiv Detail & Related papers (2024-01-31T01:51:29Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Moment Matching Denoising Gibbs Sampling [14.75945343063504]
Energy-Based Models (EBMs) offer a versatile framework for modeling complex data distributions.
The widely-used Denoising Score Matching (DSM) method for scalable EBM training suffers from inconsistency issues.
We propose an efficient sampling framework: (pseudo)-Gibbs sampling with moment matching.
arXiv Detail & Related papers (2023-05-19T12:58:25Z)
- Jump-Diffusion Langevin Dynamics for Multimodal Posterior Sampling [3.4483987421251516]
We investigate the performance of a hybrid Metropolis and Langevin sampling method akin to Jump Diffusion on a range of synthetic and real data.
We find that carefully calibrating the mix of sampling jumps with gradient-based chains significantly outperforms both pure gradient-based and pure sampling-based schemes.
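The hybrid this entry describes, local gradient (Langevin) moves interleaved with occasional long-range Metropolis jumps, can be sketched in a few lines. All tuning constants (step size, jump probability, jump scale) and the bimodal toy target below are illustrative assumptions, not the paper's setup.

```python
import math
import random

random.seed(1)

# Toy target: equal-weight Gaussian mixture with modes at -4 and +4.
MODES, S = (-4.0, 4.0), 1.0

def log_p(x):
    terms = [-0.5 * ((x - m) / S) ** 2 for m in MODES]
    mx = max(terms)
    return mx + math.log(sum(math.exp(t - mx) for t in terms))

def grad_log_p(x):
    # Gradient of log p for a mixture: responsibility-weighted pull to each mode.
    ws = [math.exp(-0.5 * ((x - m) / S) ** 2) for m in MODES]
    return sum(w * (m - x) / S ** 2 for w, m in zip(ws, MODES)) / sum(ws)

EPS, JUMP_PROB, JUMP_SCALE = 0.05, 0.1, 8.0  # assumed calibration

x, samples = -4.0, []
for _ in range(20000):
    if random.random() < JUMP_PROB:
        # Long-range Metropolis "jump": symmetric random-walk proposal,
        # accepted with the standard MH ratio.
        xp = x + JUMP_SCALE * random.gauss(0.0, 1.0)
        if math.log(random.random()) < log_p(xp) - log_p(x):
            x = xp
    else:
        # Unadjusted Langevin step for local, gradient-based exploration.
        x += EPS * grad_log_p(x) + math.sqrt(2 * EPS) * random.gauss(0.0, 1.0)
    samples.append(x)

frac_right = sum(s > 0 for s in samples) / len(samples)
print(f"fraction of samples in the right mode: {frac_right:.2f}")
```

The Langevin steps alone would rarely cross the low-density region between the modes; the occasional symmetric jumps provide the global moves, which is the calibration trade-off the entry refers to.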
arXiv Detail & Related papers (2022-11-02T17:35:04Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.