Repelling-Attracting Hamiltonian Monte Carlo
- URL: http://arxiv.org/abs/2403.04607v1
- Date: Thu, 7 Mar 2024 15:54:55 GMT
- Title: Repelling-Attracting Hamiltonian Monte Carlo
- Authors: Siddharth Vishwanath and Hyungsuk Tak
- Abstract summary: We propose a variant of Hamiltonian Monte Carlo, called the Repelling-Attracting Hamiltonian Monte Carlo (RAHMC)
RAHMC involves two stages: a mode-repelling stage, which encourages the sampler to move away from regions of high probability density, and a mode-attracting stage, which helps the sampler find and settle near alternative modes.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a variant of Hamiltonian Monte Carlo (HMC), called the
Repelling-Attracting Hamiltonian Monte Carlo (RAHMC), for sampling from
multimodal distributions. The key idea underpinning RAHMC is to depart from
the conservative dynamics of Hamiltonian systems, which form the basis of
traditional HMC, and to adopt instead the dissipative dynamics of conformal
Hamiltonian systems. In particular, RAHMC involves two stages: a mode-repelling
stage, which encourages the sampler to move away from regions of high
probability density, and a mode-attracting stage, which helps the sampler find
and settle near alternative modes. We achieve this by introducing just one
additional tuning parameter -- the coefficient of friction. The proposed method
adapts to the geometry of the target distribution, e.g., modes and density
ridges, and can generate proposals that cross low-probability barriers with
little to no computational overhead in comparison to traditional HMC. Notably,
RAHMC requires no additional information about the target distribution or
memory of previously visited modes. We establish the theoretical basis for
RAHMC, and we discuss repelling-attracting extensions to several variants of
HMC in the literature. Finally, we provide a tuning-free implementation via
dual-averaging, and we demonstrate its effectiveness in sampling from both
multimodal and unimodal distributions in high dimensions.
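The two-stage dynamics described in the abstract can be sketched with a dissipative (conformal) leapfrog integrator. The double-well potential, step size, number of steps, and friction schedule below are illustrative assumptions, not the authors' exact integrator or tuning; the friction coefficient `gamma` plays the role of the single additional tuning parameter the abstract describes (negative during the repelling stage, positive during the attracting stage).

```python
import numpy as np

def grad_U(q):
    # Gradient of the double-well potential U(q) = (q^2 - 1)^2,
    # a simple bimodal target chosen here purely for illustration.
    return 4.0 * q * (q**2 - 1.0)

def conformal_leapfrog(q, p, eps, gamma, n_steps):
    """Integrate the dissipative (conformal) Hamiltonian dynamics
    dq/dt = p,  dp/dt = -grad U(q) - gamma * p.
    gamma < 0 injects energy (mode-repelling stage);
    gamma > 0 removes energy (mode-attracting stage);
    gamma = 0 recovers the conservative leapfrog of standard HMC."""
    damp = np.exp(-gamma * eps / 2.0)  # friction factor per momentum half-step
    for _ in range(n_steps):
        p = damp * p - (eps / 2.0) * grad_U(q)  # momentum half-step, with friction
        q = q + eps * p                          # position full-step
        p = damp * p - (eps / 2.0) * grad_U(q)  # momentum half-step, with friction
    return q, p

# One repelling-attracting proposal: escape the current mode, then settle.
q, p = np.array([-1.0]), np.array([0.1])                           # start near the left mode
q, p = conformal_leapfrog(q, p, eps=0.05, gamma=-0.5, n_steps=40)  # repel
q, p = conformal_leapfrog(q, p, eps=0.05, gamma=+0.5, n_steps=40)  # attract
```

With `gamma = 0` this integrator reduces to the usual leapfrog, which is a convenient sanity check: the Hamiltonian should then be approximately conserved over the trajectory.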
Related papers
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM)
iDEM alternates between (I) sampling regions of high model density from a diffusion-based sampler and (II) using these samples in our matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains $2$--$5\times$ faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z) - Learning Energy-Based Prior Model with Diffusion-Amortized MCMC [89.95629196907082]
The common practice of learning latent space EBMs with non-convergent short-run MCMC for prior and posterior sampling hinders further progress of the model.
We introduce a simple but effective diffusion-based amortization method for long-run MCMC sampling and develop a novel learning algorithm for the latent space EBM based on it.
arXiv Detail & Related papers (2023-10-05T00:23:34Z) - Chebyshev Particles [0.0]
We are the first to consider the posterior distribution of the objective as a mapping of samples in an infinite-dimensional Euclidean space.
We propose a new criterion by maximizing the weighted Riesz polarization quantity, to discretize rectifiable submanifolds via pairwise interaction.
We have achieved high performance from the experiments for parameter inference in a linear state-space model with synthetic data and a non-linear volatility model with real-world data.
arXiv Detail & Related papers (2023-09-10T16:40:30Z) - Modiff: Action-Conditioned 3D Motion Generation with Denoising Diffusion
Probabilistic Models [58.357180353368896]
We propose a conditional paradigm that benefits from the denoising diffusion probabilistic model (DDPM) to tackle the problem of realistic and diverse action-conditioned 3D skeleton-based motion generation.
This is a pioneering attempt to use DDPM to synthesize a variable number of motion sequences conditioned on a categorical action.
arXiv Detail & Related papers (2023-01-10T13:15:42Z) - Enhanced gradient-based MCMC in discrete spaces [2.7158841992922875]
We introduce several discrete Metropolis-Hastings samplers that are conceptually inspired by MALA.
We demonstrate their strong empirical performance across a range of challenging sampling problems in Bayesian inference and energy-based modelling.
arXiv Detail & Related papers (2022-07-29T18:48:49Z) - Sampling from high-dimensional, multimodal distributions using automatically tuned, tempered Hamiltonian Monte Carlo [0.0]
Hamiltonian Monte Carlo (HMC) is widely used for sampling from high-dimensional target distributions with probability density known up to proportionality.
Traditional tempering methods, commonly used to address multimodality, can be difficult to tune, particularly in high dimensions.
We propose a method that combines a tempering strategy with Hamiltonian Monte Carlo, enabling efficient sampling from high-dimensional, strongly multimodal distributions.
arXiv Detail & Related papers (2021-11-12T18:48:36Z) - Hamiltonian Dynamics with Non-Newtonian Momentum for Rapid Sampling [38.367354572578314]
Sampling from an unnormalized probability distribution is a fundamental problem in machine learning.
We propose a fundamentally different approach to this problem via a new Hamiltonian dynamics with a non-Newtonian momentum.
In contrast to MCMC approaches like Hamiltonian Monte Carlo, no stochastic step is required. Instead, the proposed deterministic dynamics in an extended state space exactly sample the target distribution.
arXiv Detail & Related papers (2021-11-03T18:00:07Z) - Entropy-based adaptive Hamiltonian Monte Carlo [19.358300726820943]
Hamiltonian Monte Carlo (HMC) is a popular Markov Chain Monte Carlo (MCMC) algorithm to sample from an unnormalized probability distribution.
A leapfrog integrator is commonly used to implement HMC in practice, but its performance can be sensitive to the choice of mass matrix.
We develop a gradient-based algorithm that allows for the adaptation of the mass matrix by encouraging the leapfrog integrator to have high acceptance rates.
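The leapfrog integrator with a mass matrix mentioned in this entry can be sketched as follows. The anisotropic Gaussian target and the particular matrices are hypothetical illustrations; the cited paper's actual contribution, the gradient-based adaptation of the mass matrix, is not reproduced here.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps, M_inv):
    """Standard leapfrog integrator for HMC with mass matrix M,
    i.e. kinetic energy p^T M^{-1} p / 2."""
    p = p - (eps / 2.0) * grad_U(q)      # initial momentum half-step
    for _ in range(n_steps - 1):
        q = q + eps * (M_inv @ p)        # position full-step
        p = p - eps * grad_U(q)          # momentum full-step
    q = q + eps * (M_inv @ p)            # final position full-step
    p = p - (eps / 2.0) * grad_U(q)      # final momentum half-step
    return q, p

# Ill-conditioned Gaussian target: per-dimension scales differ by 10x.
Sigma_inv = np.diag([1.0, 100.0])
grad_U = lambda q: Sigma_inv @ q
# Choosing M proportional to the target precision equalizes the oscillation
# frequencies across dimensions, which is why the mass matrix matters so much.
q, p = leapfrog(np.array([1.0, 0.1]), np.array([0.5, 5.0]),
                grad_U, eps=0.1, n_steps=20, M_inv=np.diag([1.0, 0.01]))
```

Because leapfrog is symplectic, the total energy U(q) + p^T M^{-1} p / 2 should be approximately conserved along the trajectory when the step size is stable, which is the quantity the acceptance rate depends on.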
arXiv Detail & Related papers (2021-10-27T17:52:55Z) - What Are Bayesian Neural Network Posteriors Really Like? [63.950151520585024]
We show that Hamiltonian Monte Carlo can achieve significant performance gains over standard training and deep ensembles.
We also show that deep ensemble predictive distributions are similarly close to HMC as standard SGLD, and closer than standard variational inference.
arXiv Detail & Related papers (2021-04-29T15:38:46Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)