Sampling from multimodal distributions using tempered Hamiltonian
transitions
- URL: http://arxiv.org/abs/2111.06871v1
- Date: Fri, 12 Nov 2021 18:48:36 GMT
- Title: Sampling from multimodal distributions using tempered Hamiltonian
transitions
- Authors: Joonha Park
- Abstract summary: Hamiltonian Monte Carlo methods are widely used to draw samples from unnormalized target densities.
We develop a Hamiltonian Monte Carlo method where the constructed paths can travel across high potential energy barriers.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hamiltonian Monte Carlo (HMC) methods are widely used to draw samples from
unnormalized target densities due to high efficiency and favorable scalability
with respect to increasing space dimensions. However, HMC struggles when the
target distribution is multimodal, because the maximum increase in the
potential energy function (i.e., the negative log density function) along the
simulated path is bounded by the initial kinetic energy, which is distributed as
one half of a $\chi_d^2$ random variable, where $d$ is the space dimension. In this paper,
we develop a Hamiltonian Monte Carlo method where the constructed paths can
travel across high potential energy barriers. This method does not require the
modes of the target distribution to be known in advance. Our approach enables
frequent jumps between the isolated modes of the target density by continuously
varying the mass of the simulated particle while the Hamiltonian path is
constructed. Thus, this method can be considered as a combination of HMC and
the tempered transitions method. Compared to other tempering methods, our
method has a distinctive advantage in the Gibbs sampler settings, where the
target distribution changes at each step. We develop a practical tuning
strategy for our method and demonstrate that it can construct globally mixing
Markov chains targeting high-dimensional, multimodal distributions, using
mixtures of normals and a sensor network localization problem.
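The core mechanism in the abstract, continuously varying the mass of the simulated particle so the path can cross high potential-energy barriers, can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the symmetric mass schedule, the bimodal target, and all parameter values are assumptions, and the Metropolis acceptance correction needed for a valid MCMC kernel is omitted.

```python
import numpy as np

def neg_log_density(x):
    # Toy bimodal target: equal mixture of unit-variance normals at +/-4.
    return -np.logaddexp(-0.5 * np.sum((x - 4.0) ** 2),
                         -0.5 * np.sum((x + 4.0) ** 2))

def grad_neg_log_density(x, eps=1e-5):
    # Finite-difference gradient; adequate for a low-dimensional sketch.
    g = np.zeros_like(x)
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = eps
        g[i] = (neg_log_density(x + dx) - neg_log_density(x - dx)) / (2 * eps)
    return g

def tempered_leapfrog(x, p, n_steps=200, step=0.05, mass_peak=0.05):
    # Shrink the mass toward the middle of the path and restore it
    # symmetrically, so velocity p / m(t) grows large enough mid-path to
    # carry the particle over potential-energy barriers.
    half = n_steps // 2
    for k in range(n_steps):
        frac = 1.0 - abs(k - half) / half          # schedule: 0 -> 1 -> 0
        m = (1.0 - frac) * 1.0 + frac * mass_peak  # mass: 1 -> mass_peak -> ~1
        p = p - 0.5 * step * grad_neg_log_density(x)
        x = x + step * p / m                       # position update uses p / m(t)
        p = p - 0.5 * step * grad_neg_log_density(x)
    return x, p

rng = np.random.default_rng(0)
x = np.array([4.0])            # start in the right-hand mode
p = rng.standard_normal(1)     # momentum drawn with unit mass
x_new, p_new = tempered_leapfrog(x, p)
print(x_new)
```

With a fixed unit mass, the same initial kinetic energy would rarely suffice to climb the barrier between the modes; lowering the mass mid-path raises the velocity without redrawing the momentum, which is the intuition the paper builds on.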
Related papers
- Repelling-Attracting Hamiltonian Monte Carlo [0.8158530638728501]
We propose a variant of Hamiltonian Monte Carlo, called Repelling-Attracting Hamiltonian Monte Carlo (RAHMC).
RAHMC involves two stages: a mode-repelling stage that encourages the sampler to move away from regions of high probability density, and a mode-attracting stage that helps the sampler find and settle near alternative modes.
arXiv Detail & Related papers (2024-03-07T15:54:55Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM)
iDEM alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$-$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Normalizing flow sampling with Langevin dynamics in the latent space [12.91637880428221]
Normalizing flows (NF) use a continuous generator to map a simple latent distribution (e.g., Gaussian) toward an empirical target distribution associated with a training data set.
Since standard NF implement differentiable maps, they may suffer from pathological behaviors when targeting complex distributions.
This paper proposes a new Markov chain Monte Carlo algorithm to sample from the target distribution in the latent domain before transporting it back to the target domain.
arXiv Detail & Related papers (2023-05-20T09:31:35Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED)
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs)
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- GeoDiff: a Geometric Diffusion Model for Molecular Conformation Generation [102.85440102147267]
We propose a novel generative model named GeoDiff for molecular conformation prediction.
We show that GeoDiff is superior or comparable to existing state-of-the-art approaches.
arXiv Detail & Related papers (2022-03-06T09:47:01Z)
- A blob method for inhomogeneous diffusion with applications to multi-agent control and sampling [0.6562256987706128]
We develop a deterministic particle method for the weighted porous medium equation (WPME) and prove its convergence on bounded time intervals.
Our method has natural applications to multi-agent coverage algorithms and sampling probability measures.
arXiv Detail & Related papers (2022-02-25T19:49:05Z)
- Entropy-based adaptive Hamiltonian Monte Carlo [19.358300726820943]
Hamiltonian Monte Carlo (HMC) is a popular Markov Chain Monte Carlo (MCMC) algorithm to sample from an unnormalized probability distribution.
A leapfrog integrator is commonly used to implement HMC in practice, but its performance can be sensitive to the choice of mass matrix used.
We develop a gradient-based algorithm that allows for the adaptation of the mass matrix by encouraging the leapfrog integrator to have high acceptance rates.
arXiv Detail & Related papers (2021-10-27T17:52:55Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.