Randomized Runge-Kutta-Nyström Methods for Unadjusted Hamiltonian and Kinetic Langevin Monte Carlo
- URL: http://arxiv.org/abs/2310.07399v2
- Date: Thu, 03 Oct 2024 14:31:13 GMT
- Title: Randomized Runge-Kutta-Nyström Methods for Unadjusted Hamiltonian and Kinetic Langevin Monte Carlo
- Authors: Nawaf Bou-Rabee, Tore Selland Kleppe
- Abstract summary: We introduce $5/2$- and $7/2$-order $L^2$-accurate randomized Runge-Kutta-Nystr\"{o}m methods for approximating Hamiltonian flows within non-reversible Markov chain Monte Carlo samplers.
The numerical experiments demonstrate the superior efficiency of the proposed unadjusted samplers on a variety of well-behaved, high-dimensional target distributions.
- Abstract: We introduce $5/2$- and $7/2$-order $L^2$-accurate randomized Runge-Kutta-Nystr\"{o}m methods, tailored for approximating Hamiltonian flows within non-reversible Markov chain Monte Carlo samplers, such as unadjusted Hamiltonian Monte Carlo and unadjusted kinetic Langevin Monte Carlo. We establish quantitative $5/2$-order $L^2$-accuracy upper bounds under gradient and Hessian Lipschitz assumptions on the potential energy function. The numerical experiments demonstrate the superior efficiency of the proposed unadjusted samplers on a variety of well-behaved, high-dimensional target distributions.
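The following Python sketch illustrates the sampling setup described in the abstract: an unadjusted (no Metropolis correction) Hamiltonian Monte Carlo loop with full momentum refreshment and a randomized trajectory length, applied to a standard Gaussian target. The integrator below is plain velocity Verlet used only as a stand-in; the paper's $5/2$- and $7/2$-order randomized Runge-Kutta-Nystr\"{o}m updates would take its place. The target, step size, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of an *unadjusted* HMC sampler targeting a density
# proportional to exp(-f(x)).  The Hamiltonian flow is approximated with
# ordinary velocity Verlet and a uniformly randomized number of steps;
# this is an illustrative stand-in, not the paper's 5/2- or 7/2-order
# randomized Runge-Kutta-Nystrom integrators.

def grad_f(x):
    """Gradient of the potential f(x) = ||x||^2 / 2 (standard normal target)."""
    return x

def velocity_verlet(x, v, h, n_steps):
    """Approximate the Hamiltonian flow over time h * n_steps."""
    g = grad_f(x)
    for _ in range(n_steps):
        v = v - 0.5 * h * g   # half momentum kick
        x = x + h * v         # position drift
        g = grad_f(x)
        v = v - 0.5 * h * g   # half momentum kick
    return x, v

def unadjusted_hmc(x0, n_iters, h=0.1, max_steps=20, seed=None):
    """Unadjusted HMC: momentum refreshment plus an approximate Hamiltonian
    flow with a random trajectory length, and no accept/reject step."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for i in range(n_iters):
        v = rng.standard_normal(x.size)                # full momentum refreshment
        n_steps = int(rng.integers(1, max_steps + 1))  # randomized duration
        x, _ = velocity_verlet(x, v, h, n_steps)       # keep the endpoint as-is
        samples[i] = x
    return samples

if __name__ == "__main__":
    draws = unadjusted_hmc(np.zeros(10), n_iters=5000, seed=0)
    # Per-coordinate variance should be close to 1 for the standard normal target.
    print(draws[1000:].var(axis=0).round(2))
```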
Related papers
- Provable Benefit of Annealed Langevin Monte Carlo for Non-log-concave Sampling [28.931489333515618]
We establish an oracle complexity of $\widetilde{O}\left(\frac{d\beta^2\mathcal{A}^2}{\varepsilon^6}\right)$ for the simple annealed Langevin Monte Carlo algorithm.
We show that $\mathcal{A}$ represents the action of a curve of probability measures interpolating the target distribution $\pi$ and a readily sampleable distribution.
arXiv Detail & Related papers (2024-07-24T02:15:48Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Proximal Algorithms for Accelerated Langevin Dynamics [57.08271964961975]
We develop a novel class of MCMC algorithms based on a stochastized Nesterov scheme.
We show superior performance of the proposed method over typical Langevin samplers for different models in statistics and image processing.
arXiv Detail & Related papers (2023-11-24T19:56:01Z) - Unbiased Kinetic Langevin Monte Carlo with Inexact Gradients [0.8749675983608172]
We present an unbiased method for posterior means based on kinetic Langevin dynamics.
Our proposed estimator is unbiased, attains finite variance, and satisfies a central limit theorem.
Our results demonstrate that in large-scale applications, the unbiased algorithm we present can be 2-3 orders of magnitude more efficient than the "gold-standard" randomized Hamiltonian Monte Carlo.
arXiv Detail & Related papers (2023-11-08T21:19:52Z) - Continual Repeated Annealed Flow Transport Monte Carlo [93.98285297760671]
We propose Continual Repeated Annealed Flow Transport Monte Carlo (CRAFT)
It combines a sequential Monte Carlo sampler with variational inference using normalizing flows.
We show that CRAFT can achieve impressively accurate results on a lattice field example.
arXiv Detail & Related papers (2022-01-31T10:58:31Z) - Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in the 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
arXiv Detail & Related papers (2021-09-08T18:00:05Z) - An Introduction to Hamiltonian Monte Carlo Method for Sampling [26.555110725656963]
Hamiltonian Monte Carlo (HMC) is a Hamiltonian dynamics-inspired algorithm for sampling from a Gibbs density $\pi(x) \propto e^{-f(x)}$.
We show that idealized HMC preserves $\pi$ and we establish its convergence when $f$ is strongly convex and smooth.
arXiv Detail & Related papers (2021-08-27T03:28:20Z) - Antithetic Riemannian Manifold And Quantum-Inspired Hamiltonian Monte Carlo [3.686886131767452]
We present new algorithms which are antithetic versions of Hamiltonian Monte Carlo and Quantum-Inspired Hamiltonian Monte Carlo.
Adding antithetic sampling to Hamiltonian Monte Carlo has been previously shown to produce higher effective sample rates compared to vanilla Hamiltonian Monte Carlo.
The analysis is performed on jump diffusion process using real world financial market data, as well as on real world benchmark classification tasks using Bayesian logistic regression.
arXiv Detail & Related papers (2021-07-05T15:03:07Z) - Annealed Flow Transport Monte Carlo [91.20263039913912]
Annealed Flow Transport (AFT) builds upon Annealed Importance Sampling (AIS) and Sequential Monte Carlo (SMC).
AFT relies on normalizing flows (NF), which are learned sequentially to push particles towards the successive targets.
We show that a continuous-time scaling limit of the population version of AFT is given by a Feynman--Kac measure.
arXiv Detail & Related papers (2021-02-15T12:05:56Z) - A New Framework for Variance-Reduced Hamiltonian Monte Carlo [88.84622104944503]
We propose a new framework of variance-reduced Hamiltonian Monte Carlo (HMC) methods for sampling from an $L$-smooth and $m$-strongly log-concave distribution.
We show that HMC methods based on unbiased gradient estimators, including SAGA and SVRG, achieve the highest gradient efficiency with small batch sizes.
Experimental results on both synthetic and real-world benchmark data show that our new framework significantly outperforms the full-gradient and stochastic-gradient HMC approaches.
arXiv Detail & Related papers (2021-02-09T02:44:24Z) - Generative Neural Samplers for the Quantum Heisenberg Chain [0.3655021726150368]
Generative neural samplers offer a complementary approach to Monte Carlo methods for problems in statistical physics and quantum field theory.
This work tests the ability of generative neural samplers to estimate observables for real-world low-dimensional spin systems.
arXiv Detail & Related papers (2020-12-18T14:28:13Z)