Proximal Algorithms for Accelerated Langevin Dynamics
- URL: http://arxiv.org/abs/2311.14829v2
- Date: Tue, 28 Nov 2023 15:27:26 GMT
- Title: Proximal Algorithms for Accelerated Langevin Dynamics
- Authors: Duy H. Thai, Alexander L. Young, David B. Dunson
- Abstract summary: We develop a novel class of MCMC algorithms based on a stochastized Nesterov scheme.
We show superior performance of the proposed method over typical Langevin samplers for different models in statistics and image processing.
- Score: 57.08271964961975
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop a novel class of MCMC algorithms based on a stochastized Nesterov
scheme. With an appropriate addition of noise, the result is a
time-inhomogeneous underdamped Langevin equation, which we prove admits a
specified target distribution as its invariant measure. Convergence rates to
stationarity under Wasserstein-2 distance are established as well.
Metropolis-adjusted and stochastic gradient versions of the proposed Langevin
dynamics are also provided. Experimental illustrations show superior
performance of the proposed method over typical Langevin samplers for different
models in statistics and image processing including better mixing of the
resulting Markov chains.
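The abstract gives no algorithmic detail, but the construction can be illustrated with a minimal Euler-Maruyama sketch of a time-inhomogeneous underdamped Langevin sampler. The standard-Gaussian target, the 3/t damping schedule (borrowed from the ODE limit of Nesterov's method), and all step sizes below are illustrative assumptions, not the authors' scheme.
```python
import numpy as np

def grad_potential(x):
    # Illustrative target: standard Gaussian, U(x) = x^2 / 2, so grad U(x) = x.
    return x

def nesterov_langevin(n_steps=5000, h=0.01, x0=3.0, seed=0):
    """Euler-Maruyama sketch of a time-inhomogeneous underdamped Langevin SDE:
        dx = v dt,   dv = -(gamma(t) v + grad U(x)) dt + sqrt(2 gamma(t)) dW.
    For every fixed gamma this preserves exp(-U(x) - v^2/2), which is why a
    time-varying damping can still target the same measure. gamma(t) = 3/t is
    an assumed Nesterov-style schedule, not the authors' exact choice."""
    rng = np.random.default_rng(seed)
    x, v = x0, 0.0
    samples = np.empty(n_steps)
    for k in range(1, n_steps + 1):
        t = k * h
        gamma = 3.0 / max(t, 1.0)  # decaying friction; the floor keeps early steps stable
        v += -h * (gamma * v + grad_potential(x)) + np.sqrt(2.0 * gamma * h) * rng.standard_normal()
        x += h * v
        samples[k - 1] = x
    return samples

draws = nesterov_langevin()
print(draws[2000:].mean(), draws[2000:].std())  # crude check against the N(0, 1) target
```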
Related papers
- Symmetric Mean-field Langevin Dynamics for Distributional Minimax Problems [78.96969465641024]
We extend mean-field Langevin dynamics to minimax optimization over probability distributions for the first time with symmetric and provably convergent updates.
We also study time and particle discretization regimes and prove a new uniform-in-time propagation of chaos result.
arXiv Detail & Related papers (2023-12-02T13:01:29Z)
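As a rough illustration of Langevin dynamics for minimax problems (not the paper's symmetric mean-field scheme), the sketch below runs two particle populations that respectively descend and ascend a toy payoff with entropic noise; the payoff f(x, y) = xy + x^2/2 - y^2/2 and all tuning constants are made-up examples.
```python
import numpy as np

def minimax_langevin(n_particles=256, n_steps=2000, eta=0.05, tau=0.1, seed=0):
    """Toy descent-ascent Langevin on f(x, y) = x*y + x^2/2 - y^2/2.
    The min player's particles follow -grad_x f, the max player's +grad_y f,
    each with noise of temperature tau. Payoff and step sizes are assumptions,
    not the paper's symmetric mean-field updates."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)  # min player's empirical distribution
    y = rng.standard_normal(n_particles)  # max player's empirical distribution
    for _ in range(n_steps):
        gx = y.mean() + x          # grad_x f at each x particle; mean-field coupling via y
        gy = x.mean() - y          # grad_y f at each y particle
        noise = np.sqrt(2.0 * tau * eta)
        x += -eta * gx + noise * rng.standard_normal(n_particles)
        y += +eta * gy + noise * rng.standard_normal(n_particles)
    return x, y

x, y = minimax_langevin()
print(x.mean(), y.mean())  # both populations should hover near the saddle at 0
```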
- Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for the particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z)
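A hedged sketch of the noise-free idea: particles evolve deterministically along -grad V minus an estimated score of their own empirical density, so no random noise is injected. The kernel-density score estimator and toy Gaussian target below are generic stand-ins, not the paper's regularized Wasserstein proximal formula.
```python
import numpy as np

def kde_score(x, bandwidth=0.3):
    """Score (grad log density) of a Gaussian KDE over the particles x.
    A generic estimator used for illustration only."""
    diffs = x[:, None] - x[None, :]                  # pairwise x_i - x_j
    w = np.exp(-0.5 * (diffs / bandwidth) ** 2)      # Gaussian kernel weights
    return -(w * diffs).sum(axis=1) / (w.sum(axis=1) * bandwidth ** 2)

def deterministic_sampler(n_particles=512, n_steps=500, eta=0.05, seed=0):
    """Noise-free particle evolution dx/dt = -grad V(x) - score(x) targeting
    exp(-V) with the assumed toy potential V(x) = x^2/2. At stationarity the
    estimated score cancels -grad V and the particles stop moving."""
    rng = np.random.default_rng(seed)
    x = 3.0 + rng.standard_normal(n_particles)       # start far from the target
    for _ in range(n_steps):
        x += eta * (-x - kde_score(x))
    return x

x = deterministic_sampler()
print(x.mean(), x.std())  # roughly 0 and 1 for the N(0, 1) target, up to KDE smoothing
```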
- Accelerated Bayesian imaging by relaxed proximal-point Langevin sampling [4.848683707039751]
This paper presents a new accelerated proximal Markov chain Monte Carlo methodology to perform Bayesian inference in imaging inverse problems.
For models that are smooth or regularised by Moreau-Yosida smoothing, the midpoint of the relaxed proximal-point iteration is equivalent to an implicit discretisation of an overdamped Langevin diffusion.
For targets that are $\kappa$-strongly log-concave, the provided non-asymptotic convergence analysis also identifies the optimal time step.
arXiv Detail & Related papers (2023-08-18T10:55:49Z)
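For reference, a minimal Moreau-Yosida unadjusted Langevin (MYULA-style) chain for a composite target is sketched below; the target U(x) = x^2/2 + |x|, the soft-thresholding prox, and the step sizes are assumed examples, and the paper's relaxed proximal-point midpoint scheme is not reproduced.
```python
import numpy as np

def prox_l1(x, lam):
    # Proximal map of lam * |x|: soft-thresholding, available in closed form.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def myula(n_steps=20000, delta=0.01, lam=0.1, seed=0):
    """MYULA-style chain for U(x) = x^2/2 + |x|: the smooth part enters through
    its gradient, the non-smooth part through its prox via the gradient of its
    Moreau envelope, (x - prox(x)) / lam. Step sizes are assumptions."""
    rng = np.random.default_rng(seed)
    x = 0.0
    out = np.empty(n_steps)
    for k in range(n_steps):
        grad_smooth = x                               # gradient of x^2/2
        grad_my = (x - prox_l1(x, lam)) / lam         # gradient of the Moreau envelope of |x|
        x += -delta * (grad_smooth + grad_my) + np.sqrt(2.0 * delta) * rng.standard_normal()
        out[k] = x
    return out

draws = myula()
print(np.mean(np.abs(draws)))  # summary statistic under the composite target
```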
- Differentiating Metropolis-Hastings to Optimize Intractable Densities [51.16801956665228]
We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers.
We apply gradient-based optimization to objectives expressed as expectations over intractable target densities.
arXiv Detail & Related papers (2023-06-13T17:56:02Z)
- Reweighted Interacting Langevin Diffusions: an Accelerated Sampling Method for Optimization [28.25662317591378]
We propose a new technique to accelerate sampling methods for solving difficult optimization problems.
Our method exploits the connection between posterior distribution sampling and optimization with Langevin dynamics.
arXiv Detail & Related papers (2023-01-30T03:48:20Z)
- Score-Guided Intermediate Layer Optimization: Fast Langevin Mixing for Inverse Problems [97.64313409741614]
We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting randomly weighted DNN generators.
We propose to do posterior sampling in the latent space of a pre-trained generative model.
arXiv Detail & Related papers (2022-06-18T03:47:37Z)
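A minimal sketch of posterior sampling in a generator's latent space: with a toy linear "generator" G(z) = Az, a noisy observation y, and a standard-normal latent prior, the unadjusted Langevin update below targets p(z | y). The generator, noise level, and step size are stand-in assumptions, not the paper's setup.
```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2))           # toy linear "generator" G(z) = A @ z
z_true = rng.standard_normal(2)
sigma = 0.1                               # observation noise level (assumed)
y = A @ z_true + sigma * rng.standard_normal(4)

def grad_log_post(z):
    # grad log p(z | y) = grad log p(y | G(z)) + grad log p(z) for this toy model
    return A.T @ (y - A @ z) / sigma**2 - z

eta = 1e-4                                # unadjusted Langevin step size (assumed)
z = np.zeros(2)
samples = []
for _ in range(20000):
    z = z + eta * grad_log_post(z) + np.sqrt(2.0 * eta) * rng.standard_normal(2)
    samples.append(z.copy())
print(np.mean(samples[5000:], axis=0), z_true)  # posterior mean vs ground truth
```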
- A Contour Stochastic Gradient Langevin Dynamics Algorithm for Simulations of Multi-modal Distributions [17.14287157979558]
We propose an adaptively weighted stochastic gradient Langevin dynamics (SGLD) algorithm for learning in big data statistics.
The proposed algorithm is tested on benchmark datasets including CIFAR100.
arXiv Detail & Related papers (2020-10-19T19:20:47Z)
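For context, a plain SGLD step on minibatch gradients is sketched below for a toy Gaussian-mean posterior; the adaptive contour weighting of the proposed algorithm is omitted, and the data, batch size, and step size are assumptions.
```python
import numpy as np

rng = np.random.default_rng(0)
data = 2.0 + rng.standard_normal(10000)     # toy data: N(2, 1) observations

def sgld_mean(n_steps=5000, batch=100, eta=1e-4, seed=1):
    """Plain SGLD for the posterior of a Gaussian mean with an N(0, 1) prior.
    The minibatch gradient is rescaled by N / batch to stay unbiased. (The
    paper's contour-weighted variant adds adaptive reweighting on top.)"""
    rng = np.random.default_rng(seed)
    n = len(data)
    theta = 0.0
    trace = np.empty(n_steps)
    for k in range(n_steps):
        idx = rng.integers(0, n, size=batch)
        grad_loglik = (n / batch) * np.sum(data[idx] - theta)  # unbiased stochastic gradient
        grad_logprior = -theta
        theta += 0.5 * eta * (grad_loglik + grad_logprior) + np.sqrt(eta) * rng.standard_normal()
        trace[k] = theta
    return trace

trace = sgld_mean()
print(trace[1000:].mean())  # should sit near the data mean of about 2
```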
- Hessian-Free High-Resolution Nesterov Acceleration for Sampling [55.498092486970364]
Nesterov's Accelerated Gradient (NAG) for optimization has better performance than its continuous-time limit (noiseless kinetic Langevin) when a finite step size is employed.
This work explores the sampling counterpart of this phenomenon and proposes a diffusion process whose discretizations can yield accelerated gradient-based MCMC methods.
arXiv Detail & Related papers (2020-06-16T15:07:37Z)
- Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping [69.9674326582747]
We propose a new accelerated first-order method called clipped-SSTM for smooth convex optimization with heavy-tailed noise in the gradients.
We prove new complexity bounds that outperform state-of-the-art results in this case.
We derive the first non-trivial high-probability complexity bounds for SGD with clipping without a light-tails assumption on the noise.
arXiv Detail & Related papers (2020-05-21T17:05:27Z)
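The clipping operator at the heart of such methods is easy to state. Below is a hedged sketch of clipped SGD on a toy quadratic with heavy-tailed gradient noise; clipped-SSTM's Nesterov-style momentum is omitted, and the clip level, step size, and noise model are assumptions.
```python
import numpy as np

def clip(g, lam):
    # Scale g down so its norm never exceeds lam: g * min(1, lam / ||g||).
    norm = np.linalg.norm(g)
    return g if norm <= lam else g * (lam / norm)

def clipped_sgd(n_steps=2000, eta=0.05, lam=1.0, seed=0):
    """Clipped SGD on f(x) = ||x||^2 / 2 with heavy-tailed (Student-t, df=2,
    infinite-variance) gradient noise. Without clipping, rare huge noise draws
    can throw the iterate far off; clipping caps each update's magnitude.
    This sketches the clipping idea, not the clipped-SSTM method itself."""
    rng = np.random.default_rng(seed)
    x = np.full(5, 10.0)
    for _ in range(n_steps):
        g = x + rng.standard_t(df=2, size=x.shape)  # stochastic gradient with heavy-tailed noise
        x -= eta * clip(g, lam)
    return x

print(np.linalg.norm(clipped_sgd()))  # iterates settle near the optimum at 0
```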
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.