Sampling from multimodal distributions using tempered Hamiltonian
transitions
- URL: http://arxiv.org/abs/2111.06871v1
- Date: Fri, 12 Nov 2021 18:48:36 GMT
- Title: Sampling from multimodal distributions using tempered Hamiltonian
transitions
- Authors: Joonha Park
- Abstract summary: Hamiltonian Monte Carlo methods are widely used to draw samples from unnormalized target densities.
We develop a Hamiltonian Monte Carlo method where the constructed paths can travel across high potential energy barriers.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hamiltonian Monte Carlo (HMC) methods are widely used to draw samples from
unnormalized target densities due to their high efficiency and favorable scalability
with respect to increasing space dimensions. However, HMC struggles when the
target distribution is multimodal, because the maximum increase in the
potential energy function (i.e., the negative log density function) along the
simulated path is bounded by the initial kinetic energy, which is distributed
as one half of a $\chi_d^2$ random variable, where $d$ is the space dimension. In this paper,
we develop a Hamiltonian Monte Carlo method where the constructed paths can
travel across high potential energy barriers. This method does not require the
modes of the target distribution to be known in advance. Our approach enables
frequent jumps between the isolated modes of the target density by continuously
varying the mass of the simulated particle while the Hamiltonian path is
constructed. Thus, this method can be considered a combination of HMC and
the tempered transitions method. Compared to other tempering methods, our
method has a distinctive advantage in the Gibbs sampler settings, where the
target distribution changes at each step. We develop a practical tuning
strategy for our method and demonstrate that it can construct globally mixing
Markov chains targeting high-dimensional, multimodal distributions, using
mixtures of normals and a sensor network localization problem.
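The barrier-crossing mechanism described in the abstract can be illustrated with a minimal sketch in the spirit of Neal's tempered trajectories, in which multiplying the momentum by $\sqrt{\alpha}$ is equivalent to decreasing the particle's mass. The function name, the fixed two-phase schedule, and all parameter values below are illustrative assumptions, not the paper's continuously-varying-mass algorithm.

```python
import numpy as np

def tempered_hmc_step(q, log_prob, grad_log_prob, step_size=0.1,
                      n_steps=20, alpha=1.02, rng=None):
    """One HMC transition whose trajectory is 'heated' then 'cooled'.

    During the first half of the path the momentum is inflated by sqrt(alpha)
    before each leapfrog step (raising the kinetic-energy budget so the path
    can climb potential barriers); during the second half it is deflated by
    sqrt(alpha) after each step. With an even n_steps the volume changes
    cancel, so the plain Metropolis test below remains valid.
    """
    assert n_steps % 2 == 0, "need equal numbers of heatings and coolings"
    rng = np.random.default_rng() if rng is None else rng
    p = rng.standard_normal(q.shape)
    current_h = -log_prob(q) + 0.5 * (p @ p)      # initial total energy
    q_new, p_new = q.copy(), p.copy()
    for i in range(n_steps):
        if i < n_steps // 2:                      # heating phase
            p_new = p_new * np.sqrt(alpha)
        # standard leapfrog step
        p_new = p_new + 0.5 * step_size * grad_log_prob(q_new)
        q_new = q_new + step_size * p_new
        p_new = p_new + 0.5 * step_size * grad_log_prob(q_new)
        if i >= n_steps // 2:                     # cooling phase
            p_new = p_new / np.sqrt(alpha)
    proposed_h = -log_prob(q_new) + 0.5 * (p_new @ p_new)
    if np.log(rng.uniform()) < current_h - proposed_h:
        return q_new, True                        # accept proposal
    return q, False                               # reject, keep current state
```

With $\alpha > 1$ the mid-path kinetic energy exceeds the initial draw, which is exactly the $\chi_d^2/2$ budget that the abstract identifies as the obstacle for plain HMC.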
Related papers
- Progressive Inference-Time Annealing of Diffusion Models for Sampling from Boltzmann Densities [85.83359661628575]
We propose Progressive Inference-Time Annealing (PITA) to learn diffusion-based samplers. PITA combines two complementary techniques: annealing of the Boltzmann distribution and diffusion smoothing. It enables equilibrium sampling of N-body particle systems, Alanine Dipeptide, and tripeptides in Cartesian coordinates.
arXiv Detail & Related papers (2025-06-19T17:14:22Z) - Diffusion-based supervised learning of generative models for efficient sampling of multimodal distributions [16.155593250605254]
We propose a hybrid generative model for efficient sampling of high-dimensional, multimodal probability distributions for Bayesian inference. Our numerical examples demonstrate that the proposed framework can effectively handle multimodal distributions with varying mode shapes in up to 100 dimensions.
arXiv Detail & Related papers (2025-04-20T21:06:02Z) - Enhancing Gradient-based Discrete Sampling via Parallel Tempering [8.195708231156546]
A gradient-based discrete sampler is susceptible to getting trapped in local minima in high-dimensional, multimodal discrete distributions.
We develop a sampler that combines parallel tempering, also known as replica exchange, with the discrete Langevin proposal.
We show that our algorithm converges non-asymptotically to the target energy and exhibits faster mixing compared to a single chain.
arXiv Detail & Related papers (2025-02-26T15:51:15Z) - Policy Gradients for Optimal Parallel Tempering MCMC [0.276240219662896]
Parallel tempering is a meta-algorithm for Markov Chain Monte Carlo that uses multiple chains to sample from tempered versions of the target distribution.
We present an adaptive temperature selection algorithm that dynamically adjusts temperatures during sampling using a policy gradient approach.
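The parallel tempering mechanics this entry builds on can be sketched generically: a ladder of chains at inverse temperatures $\beta_1 > \cdots > \beta_K$, random-walk moves within each chain, and Metropolis-accepted swaps between adjacent chains. This is a textbook illustration; the function name, the fixed ladder, and the random-walk proposals are assumptions, and the paper's policy-gradient temperature adaptation is not reproduced.

```python
import numpy as np

def parallel_tempering(log_target, betas, n_iter=2000, prop_scale=1.0, seed=0):
    """Minimal replica-exchange sampler for a 1-D target.

    betas should be sorted in decreasing order with betas[0] == 1.0, so that
    chain 0 targets the original distribution while hotter chains (smaller
    beta) see a flattened landscape and cross barriers easily.
    """
    rng = np.random.default_rng(seed)
    n_chains = len(betas)
    x = np.zeros(n_chains)                     # one state per temperature
    cold_samples = []
    for _ in range(n_iter):
        for k in range(n_chains):              # within-chain Metropolis move
            prop = x[k] + prop_scale * rng.standard_normal()
            if np.log(rng.uniform()) < betas[k] * (log_target(prop) - log_target(x[k])):
                x[k] = prop
        k = rng.integers(n_chains - 1)         # attempt one adjacent swap
        log_accept = (betas[k] - betas[k + 1]) * (log_target(x[k + 1]) - log_target(x[k]))
        if np.log(rng.uniform()) < log_accept:
            x[k], x[k + 1] = x[k + 1], x[k]
        cold_samples.append(x[0])              # record only the beta = 1 chain
    return np.array(cold_samples)
```

Swaps let states that crossed a barrier in a hot chain percolate down the ladder; the placement of the temperatures controls how often this happens, which is what the cited paper optimizes.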
arXiv Detail & Related papers (2024-09-03T03:12:45Z) - Repelling-Attracting Hamiltonian Monte Carlo [0.8158530638728501]
We propose a variant of Hamiltonian Monte Carlo, called Repelling-Attracting Hamiltonian Monte Carlo (RAHMC).
RAHMC involves two stages: a mode-repelling stage that encourages the sampler to move away from regions of high probability density, and a mode-attracting stage that helps the sampler find and settle near alternative modes.
arXiv Detail & Related papers (2024-03-07T15:54:55Z) - Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
We propose Iterated Denoising Energy Matching (iDEM), which alternates between (I) sampling regions of high model density with a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z) - Diffusive Gibbs Sampling [40.1197715949575]
We propose Diffusive Gibbs Sampling (DiGS) for effective sampling from distributions characterized by distant and disconnected modes.
DiGS integrates recent developments in diffusion models, leveraging Gaussian convolution to create an auxiliary noisy distribution.
A novel Metropolis-within-Gibbs scheme is proposed to enhance mixing in the denoising sampling step.
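The noising/denoising alternation described here can be read as a two-step Gibbs sweep: draw a Gaussian-convolved auxiliary variable, then refresh the state from its conditional with a short Metropolis run. This is only an illustrative reading of the abstract (the function name, proposal scale, and fixed noise level are assumptions), not the authors' exact algorithm.

```python
import numpy as np

def digs_style_sweep(x, log_target, sigma=3.0, n_mh=20, rng=None):
    """One Diffusive-Gibbs-style sweep.

    Step 1: draw a noisy auxiliary variable y ~ N(x, sigma^2 I); the Gaussian
    convolution flattens the target, so y can land between distant modes.
    Step 2: refresh x from the conditional p(x | y), which is proportional to
    p(x) * N(y; x, sigma^2 I), using a short Metropolis-within-Gibbs run
    initialized at y. Finitely many MH steps still leave the joint invariant.
    """
    rng = np.random.default_rng() if rng is None else rng
    y = x + sigma * rng.standard_normal(x.shape)          # noising step
    log_cond = lambda z: log_target(z) - ((z - y) ** 2).sum() / (2.0 * sigma ** 2)
    z = y.copy()                                          # denoising step
    for _ in range(n_mh):
        prop = z + 0.5 * sigma * rng.standard_normal(z.shape)
        if np.log(rng.uniform()) < log_cond(prop) - log_cond(z):
            z = prop
    return z
```

Because the denoising chain restarts at y rather than at the previous x, the sweep can hop between modes whenever y lands in a basin different from the one x occupied.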
arXiv Detail & Related papers (2024-02-05T13:47:41Z) - Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce Gaussian Mixture Solvers (GMS), a novel class of SDE-based solvers for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
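For reference, a single particle of generic AIS, which this entry builds on, can be sketched as follows: anneal from the prior ($\beta = 0$) to the target ($\beta = 1$) along $\log \pi_\beta = (1-\beta)\log \pi_0 + \beta \log \pi_1$, accumulating the log importance weight and applying a few Metropolis moves at each intermediate distribution. The constant-rate schedule of the cited paper is not reproduced; all names and parameters are illustrative.

```python
import numpy as np

def ais_log_weight(log_prior, sample_prior, log_target, betas,
                   n_mh=5, prop_scale=0.5, rng=None):
    """Run one AIS particle along the annealing schedule `betas` (0 -> 1).

    Returns the final state and its log importance weight; averaging
    exp(log_weight) over many particles estimates the ratio of normalizing
    constants between the target and the prior.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = sample_prior(rng)
    log_w = 0.0
    log_pi = lambda z, b: (1.0 - b) * log_prior(z) + b * log_target(z)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        log_w += log_pi(x, b) - log_pi(x, b_prev)     # weight update
        for _ in range(n_mh):                         # Metropolis moves at pi_b
            prop = x + prop_scale * rng.standard_normal(np.shape(x))
            if np.log(rng.uniform()) < log_pi(prop, b) - log_pi(x, b):
                x = prop
    return x, log_w
```

A quick sanity check: if the unnormalized target is exactly twice the prior density, every particle's weight is deterministically $\log 2$, matching the true ratio of normalizing constants.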
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Differentiating Metropolis-Hastings to Optimize Intractable Densities [51.16801956665228]
We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers.
We apply gradient-based optimization to objectives expressed as expectations over intractable target densities.
arXiv Detail & Related papers (2023-06-13T17:56:02Z) - Quantum chaos and thermalization in the two-mode Dicke model [77.34726150561087]
We discuss the onset of quantum chaos and thermalization in the two-mode Dicke model.
The two-mode Dicke model exhibits normal to superradiant quantum phase transition.
We show that the temporal fluctuations of the expectation value of the collective spin observable around its average are small and decrease with the effective system size.
arXiv Detail & Related papers (2022-07-08T11:16:29Z) - Reconstructing the Universe with Variational self-Boosted Sampling [7.922637707393503]
Traditional algorithms such as Hamiltonian Monte Carlo (HMC) are computationally inefficient because they generate correlated samples.
Here we develop a hybrid scheme called variational self-boosted sampling (VBS) to mitigate the drawbacks of both algorithms.
VBS generates better-quality samples than simple VI approaches and reduces the correlation length in the sampling phase by a factor of 10-50 relative to using HMC alone.
arXiv Detail & Related papers (2022-06-28T21:30:32Z) - Photoinduced prethermal order parameter dynamics in the two-dimensional
large-$N$ Hubbard-Heisenberg model [77.34726150561087]
We study the microscopic dynamics of competing ordered phases in a two-dimensional correlated electron model.
We simulate the light-induced transition between two competing phases.
arXiv Detail & Related papers (2022-05-13T13:13:31Z) - Deterministic Gibbs Sampling via Ordinary Differential Equations [77.42706423573573]
This paper presents a general construction of deterministic measure-preserving dynamics using autonomous ODEs and tools from differential geometry.
We show how Hybrid Monte Carlo and other deterministic samplers follow as special cases of our theory.
arXiv Detail & Related papers (2021-06-18T15:36:09Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Ensemble Slice Sampling: Parallel, black-box and gradient-free inference
for correlated & multimodal distributions [0.0]
Slice Sampling has emerged as a powerful Markov Chain Monte Carlo algorithm that adapts to the characteristics of the target distribution with minimal hand-tuning.
This paper introduces Ensemble Slice Sampling (ESS), a new class of algorithms that bypasses the need for hand-tuning by adaptively setting the initial length scale.
These affine-invariant algorithms are trivial to construct, require no hand-tuning, and can easily be implemented in parallel computing environments.
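For context, the quantity ESS adapts is the initial length scale $w$ of the underlying slice sampler. A minimal univariate version using Neal's stepping-out and shrinkage procedure is sketched below, with $w$ fixed by hand, which is precisely the tuning burden ESS removes; the function name and parameters are illustrative.

```python
import numpy as np

def slice_sample_1d(x0, log_prob, w=1.0, n_samples=1000, rng=None):
    """Univariate slice sampling with stepping-out and shrinkage (Neal, 2003).

    At each iteration a slice height is drawn under the density at the current
    point, an interval of width w is grown until it brackets the slice, and
    proposals are drawn uniformly from the interval, shrinking it on rejection.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    out = []
    for _ in range(n_samples):
        log_y = log_prob(x) + np.log(rng.uniform())    # slice height
        # stepping out: grow [left, right] until both ends fall off the slice
        left = x - w * rng.uniform()
        right = left + w
        while log_prob(left) > log_y:
            left -= w
        while log_prob(right) > log_y:
            right += w
        # shrinkage: sample uniformly, shrink the bracket on rejection
        while True:
            prop = rng.uniform(left, right)
            if log_prob(prop) > log_y:
                x = prop
                break
            if prop < x:
                left = prop
            else:
                right = prop
        out.append(x)
    return np.array(out)
```

A poor choice of w makes the stepping-out and shrinkage loops expensive; ESS's contribution is choosing this scale adaptively, and in parallel, without hand-tuning.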
arXiv Detail & Related papers (2020-02-14T19:00:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.