Bregman Proximal Langevin Monte Carlo via Bregman--Moreau Envelopes
- URL: http://arxiv.org/abs/2207.04387v1
- Date: Sun, 10 Jul 2022 05:33:06 GMT
- Title: Bregman Proximal Langevin Monte Carlo via Bregman--Moreau Envelopes
- Authors: Tim Tsz-Kit Lau, Han Liu
- Abstract summary: We propose efficient Langevin Monte Carlo algorithms for sampling distributions with nonsmooth convex composite potentials.
The proposed algorithms extend existing Langevin Monte Carlo algorithms in two aspects -- the ability to sample nonsmooth distributions with mirror descent-like algorithms, and the use of the more general Bregman--Moreau envelope in place of the Moreau envelope as a smooth approximation of the nonsmooth part of the potential.
- Score: 10.103413548140848
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose efficient Langevin Monte Carlo algorithms for sampling
distributions with nonsmooth convex composite potentials, i.e., the sum of a
continuously differentiable function and a possibly nonsmooth function. We
devise such algorithms leveraging recent advances in convex analysis and
optimization methods involving Bregman divergences, namely the Bregman--Moreau
envelopes and the Bregman proximity operators, and in the Langevin Monte Carlo
algorithms reminiscent of mirror descent. The proposed algorithms extend
existing Langevin Monte Carlo algorithms in two aspects -- the ability to
sample nonsmooth distributions with mirror descent-like algorithms, and the use
of the more general Bregman--Moreau envelope in place of the Moreau envelope as
a smooth approximation of the nonsmooth part of the potential. A particular
case of the proposed scheme is reminiscent of the Bregman proximal gradient
algorithm. The efficiency of the proposed methodology is illustrated with
various sampling tasks at which existing Langevin Monte Carlo methods are known
to perform poorly.
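As a hedged illustration of the smoothing idea in the abstract, the sketch below performs one Langevin-type update after replacing the nonsmooth part of the potential by a Moreau-type envelope. It covers only the Euclidean special case, in which the Bregman--Moreau envelope reduces to the classical Moreau envelope and the Bregman proximity operator to the ordinary proximal operator; the toy potential, function names, step size, and smoothing parameter are illustrative choices, not taken from the paper.

```python
# Hedged sketch (not the authors' code): Langevin sampling from exp(-U) with
# U = f + g, where the nonsmooth part g is replaced by its Moreau envelope.
import numpy as np

def prox_l1(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def grad_moreau_envelope(x, prox_g, lam):
    """Gradient of the Moreau envelope of g with parameter lam:
    (x - prox_{lam g}(x)) / lam."""
    return (x - prox_g(x, lam)) / lam

def smoothed_langevin_step(x, grad_f, prox_g, lam, step, rng):
    """One unadjusted Langevin step targeting exp(-(f + env_lam g)):
    x <- x - step * grad(f + env_lam g)(x) + sqrt(2 * step) * xi."""
    drift = grad_f(x) + grad_moreau_envelope(x, prox_g, lam)
    return x - step * drift + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)

# Toy target (illustrative only): f(x) = ||x||^2 / 2 and g(x) = ||x||_1.
rng = np.random.default_rng(0)
x = np.zeros(5)
for _ in range(10_000):
    x = smoothed_langevin_step(x, grad_f=lambda z: z, prox_g=prox_l1,
                               lam=0.1, step=1e-2, rng=rng)
```

In the general scheme described in the abstract, the ordinary proximal operator would be replaced by a Bregman proximity operator induced by a mirror map, and the update would be carried out in mirror coordinates, in the spirit of mirror descent.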
Related papers
- Non-linear Quantum Monte Carlo [1.237454174824584]
Quantum computing provides a quadratic speedup over classical Monte Carlo methods for mean estimation.
We propose a quantum-inside-quantum Monte Carlo algorithm that achieves such a speedup for a broad class of non-linear estimation problems.
arXiv Detail & Related papers (2025-02-07T17:13:27Z) - Variable Bregman Majorization-Minimization Algorithm and its Application to Dirichlet Maximum Likelihood Estimation [26.384330822086863]
We propose a novel Bregman descent algorithm for minimizing a convex function that is expressed as the sum of a differentiable part and a possibly nonsmooth term.
The approach, referred to as the Variable Bregman Majorization-Minimization (VBMM) algorithm, extends the Bregman Proximal Gradient method.
Numerical experiments confirm that the VBMM algorithm outperforms existing approaches in terms of convergence speed.
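For intuition about the Bregman Proximal Gradient step that VBMM generalizes, here is a minimal sketch (not the VBMM algorithm itself) of one such update with the Shannon-entropy mirror map on the probability simplex, where the step admits a closed-form exponentiated-gradient expression; the quadratic objective and names are hypothetical examples.

```python
import numpy as np

def bregman_proximal_gradient_step(x, grad_f, step):
    """One Bregman proximal gradient step on the probability simplex with the
    Shannon-entropy mirror map phi(x) = sum_i x_i log x_i and g = indicator of
    the simplex. Minimizing <grad_f(x), y> + (1/step) * D_phi(y, x) over the
    simplex yields the exponentiated-gradient update below."""
    y = x * np.exp(-step * grad_f(x))
    return y / y.sum()

# Toy problem (illustrative only): minimize f(x) = ||A x - b||^2 / 2 over the simplex.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
x = np.full(5, 1.0 / 5)
for _ in range(500):
    x = bregman_proximal_gradient_step(x, grad_f, step=1e-2)
```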
arXiv Detail & Related papers (2025-01-13T13:16:12Z) - Moreau Envelope ADMM for Decentralized Weakly Convex Optimization [55.2289666758254]
This paper proposes a proximal variant of the alternating direction method of multipliers (ADMM) for distributed optimization.
The results of our numerical experiments indicate that our method is faster and more robust than widely-used approaches.
arXiv Detail & Related papers (2023-08-31T14:16:30Z) - Convergent Bregman Plug-and-Play Image Restoration for Poisson Inverse
Problems [8.673558396669806]
Plug-and-Play (PnP) methods are efficient iterative algorithms for solving ill-posed image inverse problems.
We propose two PnP algorithms based on the Bregman score denoiser for Poisson inverse problems.
arXiv Detail & Related papers (2023-06-06T07:36:47Z) - Gradient-free optimization of highly smooth functions: improved analysis
and a new algorithm [87.22224691317766]
This work studies problems with zero-order noisy oracle information under the assumption that the objective function is highly smooth.
We consider two kinds of zero-order projected gradient descent algorithms.
arXiv Detail & Related papers (2023-06-03T17:05:13Z) - Non-Log-Concave and Nonsmooth Sampling via Langevin Monte Carlo Algorithms [15.718514510878896]
We study the problem of approximate sampling from non-log-concave distributions, e.g., Gaussian mixtures, which is often challenging even in low dimensions due to their multimodality.
We focus on performing this task via Markov chain Monte Carlo (MCMC) methods derived from discretizations of the overdamped Langevin diffusions, which are commonly known as Langevin Monte Carlo algorithms.
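For reference, the sketch below gives the basic Langevin Monte Carlo update obtained from the Euler--Maruyama discretization of the overdamped Langevin diffusion, applied to a two-component Gaussian mixture target; the mixture parameters, step size, and iteration count are illustrative and not taken from the paper.

```python
import numpy as np

def grad_potential_gaussian_mixture(x, mus, sigma):
    """Gradient of U(x) = -log( sum_k 0.5 * N(x; mu_k, sigma^2) ) in 1D."""
    w = np.array([np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) for mu in mus])
    w = w / w.sum()
    return sum(wk * (x - mu) / sigma ** 2 for wk, mu in zip(w, mus))

def ula_step(x, grad_U, step, rng):
    """Unadjusted Langevin step: Euler--Maruyama discretization of
    dX_t = -grad U(X_t) dt + sqrt(2) dB_t."""
    return x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal()

rng = np.random.default_rng(0)
x, samples = 0.0, []
for _ in range(50_000):
    x = ula_step(x, lambda z: grad_potential_gaussian_mixture(z, mus=(-3.0, 3.0), sigma=1.0),
                 step=1e-2, rng=rng)
    samples.append(x)
```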
arXiv Detail & Related papers (2023-05-25T12:28:26Z) - Fast Computation of Optimal Transport via Entropy-Regularized Extragradient Methods [75.34939761152587]
Efficient computation of the optimal transport distance between two distributions serves as an algorithmic subroutine that empowers various applications.
This paper develops a scalable first-order optimization-based method that computes optimal transport to within $\varepsilon$ additive accuracy.
arXiv Detail & Related papers (2023-01-30T15:46:39Z) - Monte Carlo Tree Descent for Black-Box Optimization [10.698553177585973]
We study how to further integrate sample-based descent for faster optimization.
We design novel ways of expanding Monte Carlo search trees, with new descent methods at vertices.
We show empirically that the proposed algorithms can outperform state-of-the-art methods on many challenging benchmark problems.
arXiv Detail & Related papers (2022-11-01T22:45:10Z) - Langevin Monte Carlo for Contextual Bandits [72.00524614312002]
Langevin Monte Carlo Thompson Sampling (LMC-TS) is proposed to directly sample from the posterior distribution in contextual bandits.
We prove that the proposed algorithm achieves the same sublinear regret bound as the best Thompson sampling algorithms for a special case of contextual bandits.
arXiv Detail & Related papers (2022-06-22T17:58:23Z) - Bayesian imaging using Plug & Play priors: when Langevin meets Tweedie [13.476505672245603]
This paper develops theory, methods, and provably convergent algorithms for performing Bayesian inference with Plug & Play priors.
We introduce two algorithms: 1) PnP-ULA (Plug & Play Unadjusted Langevin Algorithm) for Monte Carlo sampling and MMSE inference; and 2) PnP-SGD (Plug & Play Stochastic Gradient Descent) for MAP inference.
The algorithms are demonstrated on several problems such as image deblurring, inpainting, and denoising, where they are used for point estimation as well as for uncertainty visualisation and quantification.
arXiv Detail & Related papers (2021-03-08T12:46:53Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
We employ the structure of the problem and obtain a convex-concave saddle-point reformulation of this problem.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
arXiv Detail & Related papers (2020-06-11T19:40:38Z) - Connecting the Dots: Numerical Randomized Hamiltonian Monte Carlo with
State-Dependent Event Rates [0.0]
We introduce a robust, easy to use and computationally fast alternative to conventional Markov chain Monte Carlo methods for continuous target distributions.
The proposed algorithm may yield large speedups and improvements in stability relative to relevant benchmarks.
Granted access to a high-quality ODE code, the proposed methodology is both easy to implement and use, even for highly challenging and high-dimensional target distributions.
arXiv Detail & Related papers (2020-05-04T06:23:13Z)