Unadjusted Langevin algorithm for sampling a mixture of weakly smooth
potentials
- URL: http://arxiv.org/abs/2112.09311v1
- Date: Fri, 17 Dec 2021 04:10:09 GMT
- Title: Unadjusted Langevin algorithm for sampling a mixture of weakly smooth
potentials
- Authors: Dao Nguyen
- Abstract summary: We prove convergence guarantees under a Poincar\'{e} inequality or non-strong convexity outside a ball.
We also provide convergence in the $L_{\beta}$-Wasserstein metric for the smoothed potential.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Discretization of continuous-time diffusion processes is a widely recognized
method for sampling. However, requiring the potentials to be smooth (gradient
Lipschitz) is a considerable restriction. This paper studies the problem of
sampling through Euler discretization, where the potential function is assumed
to be a mixture of weakly smooth distributions and satisfies a weak
dissipativity condition. We establish convergence in Kullback-Leibler (KL)
divergence, with the number of iterations needed to reach an
$\epsilon$-neighborhood of the target distribution depending only polynomially
on the dimension. We relax the degenerate convexity-at-infinity conditions of
\citet{erdogdu2020convergence} and prove convergence guarantees under a
Poincar\'{e} inequality or non-strong convexity outside a ball. In addition, we
provide convergence in the $L_{\beta}$-Wasserstein metric for the smoothed
potential.
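The sampler analyzed in the abstract is the plain Euler discretization of the Langevin diffusion. A minimal sketch, using a smooth standard Gaussian target as a stand-in for the mixture of weakly smooth potentials studied in the paper (the function name and step size are illustrative, not the paper's):

```python
import numpy as np

def ula_sample(grad_U, x0, step, n_iters, rng):
    """Unadjusted Langevin algorithm: Euler discretization of the Langevin
    diffusion dX_t = -grad U(X_t) dt + sqrt(2) dW_t.

    Iterates x_{k+1} = x_k - step * grad_U(x_k) + sqrt(2 * step) * xi_k,
    with xi_k ~ N(0, I), and returns the whole trajectory."""
    x = np.asarray(x0, dtype=float)
    out = np.empty((n_iters,) + x.shape)
    for k in range(n_iters):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        out[k] = x
    return out

# Illustration: U(x) = ||x||^2 / 2 (standard Gaussian target), so
# grad_U(x) = x; discard an initial burn-in before using the samples.
traj = ula_sample(lambda x: x, np.zeros(2), step=0.05, n_iters=20000,
                  rng=np.random.default_rng(0))
samples = traj[5000:]
```

Because the chain is unadjusted (no Metropolis correction), its stationary distribution carries a discretization bias that shrinks with the step size, which is why the convergence guarantees above are stated to an $\epsilon$-neighborhood of the target.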
Related papers
- Provable Convergence and Limitations of Geometric Tempering for Langevin Dynamics [8.683011785637824]
Geometric tempering is a popular approach to sampling from challenging multi-modal probability distributions.
In this paper, we theoretically investigate the soundness of this approach when the sampling algorithm is Langevin dynamics.
Our results indicate that geometric tempering may not help, and can even be harmful for convergence.
arXiv Detail & Related papers (2024-10-13T02:24:31Z) - Tamed Langevin sampling under weaker conditions [27.872857402255775]
We investigate the problem of sampling from distributions that are not log-concave and are only weakly dissipative.
We introduce a taming scheme which is tailored to the growth and decay properties of the target distribution.
We provide explicit non-asymptotic guarantees for the proposed sampler in terms of the Kullback-Leibler divergence, total variation, and Wasserstein distance to the target distribution.
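A taming scheme rescales the drift so that a single update stays bounded even when the gradient grows super-linearly. A generic sketch (the paper tailors its taming to the target's growth and decay properties; this divisor is the standard textbook choice, not theirs):

```python
import numpy as np

def tamed_ula_step(x, grad_U, step, rng):
    """One step of a tamed unadjusted Langevin scheme.  Dividing the drift by
    1 + step * ||grad_U(x)|| keeps the deterministic part of the update
    bounded by 1/step in norm, preventing blow-up for super-linear drifts."""
    g = grad_U(x)
    tamed_g = g / (1.0 + step * np.linalg.norm(g))
    return x - step * tamed_g + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)

# Quartic target U(x) = ||x||^4 / 4, whose gradient ||x||^2 * x grows
# super-linearly; plain fixed-step ULA can diverge from far-out starts,
# while the tamed iteration remains stable.
rng = np.random.default_rng(0)
x = np.full(2, 3.0)
for _ in range(2000):
    x = tamed_ula_step(x, lambda z: (z @ z) * z, step=0.01, rng=rng)
```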
arXiv Detail & Related papers (2024-05-27T23:00:40Z) - In-and-Out: Algorithmic Diffusion for Sampling Convex Bodies [7.70133333709347]
We present a new random walk for uniformly sampling high-dimensional convex bodies.
It achieves state-of-the-art runtime complexity with stronger guarantees on the output.
arXiv Detail & Related papers (2024-05-02T16:15:46Z) - Breaking the Heavy-Tailed Noise Barrier in Stochastic Optimization Problems [56.86067111855056]
We consider clipped optimization problems with heavy-tailed noise with structured density.
We show that it is possible to obtain faster convergence rates than $\mathcal{O}(K^{-(\alpha-1)/\alpha})$ when the gradients have finite moments of order $\alpha$.
We prove that the resulting estimates have negligible bias and controllable variance.
arXiv Detail & Related papers (2023-11-07T17:39:17Z) - Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method whose particle evolution is fully deterministic.
arXiv Detail & Related papers (2023-08-28T23:51:33Z) - Convergence of Adam Under Relaxed Assumptions [72.24779199744954]
We show that Adam converges to $\epsilon$-stationary points with $O(\epsilon^{-4})$ gradient complexity under far more realistic conditions.
We also propose a variance-reduced version of Adam with an accelerated gradient complexity of $O(\epsilon^{-3})$.
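For reference, the standard Adam update that these complexity bounds concern can be sketched as follows (a textbook implementation, not the paper's variance-reduced variant; the learning rate here is illustrative):

```python
import numpy as np

def adam_step(x, g, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update with bias-corrected first/second moment estimates.

    t is the 1-based step count; m and v are the running exponential
    averages of the gradient and its elementwise square."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)          # bias correction for the mean
    v_hat = v / (1 - b2 ** t)          # bias correction for the second moment
    return x - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize f(x) = x^2 (gradient 2x); iterates approach the stationary point 0.
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2.0 * x, m, v, t)
```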
arXiv Detail & Related papers (2023-04-27T06:27:37Z) - High-Probability Bounds for Stochastic Optimization and Variational
Inequalities: the Case of Unbounded Variance [59.211456992422136]
We propose algorithms with high-probability convergence results under less restrictive assumptions.
These results justify the usage of the considered methods for solving problems that do not fit standard functional classes in optimization.
arXiv Detail & Related papers (2023-02-02T10:37:23Z) - Concentration analysis of multivariate elliptic diffusion processes [0.0]
We prove concentration inequalities and associated PAC bounds for continuous- and discrete-time additive functionals.
Our analysis relies on an approach via the Poisson equation allowing us to consider a very broad class of subexponentially ergodic processes.
arXiv Detail & Related papers (2022-06-07T14:15:05Z) - Minibatch vs Local SGD with Shuffling: Tight Convergence Bounds and
Beyond [63.59034509960994]
We study shuffling-based variants: minibatch and local Random Reshuffling, which draw gradients without replacement.
For smooth functions satisfying the Polyak-Lojasiewicz condition, we obtain convergence bounds which show that these shuffling-based variants converge faster than their with-replacement counterparts.
We propose an algorithmic modification called synchronized shuffling that leads to convergence rates faster than our lower bounds in near-homogeneous settings.
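The without-replacement sampling that distinguishes Random Reshuffling from plain SGD can be sketched in a few lines (a minimal single-worker illustration, assuming a finite-sum objective; the function name is ours):

```python
import numpy as np

def sgd_random_reshuffling(grads, x0, lr, epochs, rng):
    """SGD with Random Reshuffling: each epoch draws a fresh random
    permutation of the n component gradients and applies each exactly once,
    i.e. gradients are sampled without replacement within an epoch."""
    x = float(x0)
    for _ in range(epochs):
        for i in rng.permutation(len(grads)):
            x -= lr * grads[i](x)
    return x

# Toy finite sum: f(x) = sum_i (x - a_i)^2 / 2, minimized at mean(a_i) = 3.
a = [1.0, 2.0, 6.0]
grads = [lambda x, ai=ai: x - ai for ai in a]
x_star = sgd_random_reshuffling(grads, x0=0.0, lr=0.05, epochs=300,
                                rng=np.random.default_rng(0))
```

With-replacement SGD would instead draw each index i.i.d. uniformly at every step; the bounds summarized above compare these two sampling schemes.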
arXiv Detail & Related papers (2021-10-20T02:25:25Z) - Faster Convergence of Stochastic Gradient Langevin Dynamics for
Non-Log-Concave Sampling [110.88857917726276]
We provide a new convergence analysis of gradient Langevin dynamics (SGLD) for sampling from a class of distributions that can be non-log-concave.
At the core of our approach is a novel conductance analysis of SGLD using an auxiliary time-reversible Markov Chain.
arXiv Detail & Related papers (2020-10-19T15:23:18Z) - Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
We employ the structure of the problem and obtain a convex-concave saddle-point reformulation of this problem.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
arXiv Detail & Related papers (2020-06-11T19:40:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.