Exponential ergodicity of mirror-Langevin diffusions
- URL: http://arxiv.org/abs/2005.09669v2
- Date: Tue, 2 Jun 2020 22:39:02 GMT
- Title: Exponential ergodicity of mirror-Langevin diffusions
- Authors: Sinho Chewi, Thibaut Le Gouic, Chen Lu, Tyler Maunu, Philippe
Rigollet, Austin J. Stromme
- Abstract summary: We propose a class of diffusions called Newton-Langevin diffusions and prove that they converge to stationarity exponentially fast.
We give an application to the problem of sampling from the uniform distribution on a convex body using a strategy inspired by interior-point methods.
- Score: 16.012656579770827
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Motivated by the problem of sampling from ill-conditioned log-concave
distributions, we give a clean non-asymptotic convergence analysis of
mirror-Langevin diffusions as introduced in Zhang et al. (2020). As a special
case of this framework, we propose a class of diffusions called Newton-Langevin
diffusions and prove that they converge to stationarity exponentially fast with
a rate which not only is dimension-free, but also has no dependence on the
target distribution. We give an application of this result to the problem of
sampling from the uniform distribution on a convex body using a strategy
inspired by interior-point methods. Our general approach follows the recent
trend of linking sampling and optimization and highlights the role of the
chi-squared divergence. In particular, it yields new results on the convergence
of the vanilla Langevin diffusion in Wasserstein distance.
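To make the dynamics concrete, here is a minimal Python sketch of an Euler discretization of the mirror-Langevin diffusion in the dual (mirror) space, specialized to the Newton-Langevin case where the mirror map equals the potential V. This is an illustration under stated assumptions, not the paper's code: the Gaussian target, step size h, and all function names are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: an ill-conditioned Gaussian, V(x) = 0.5 * x @ A @ x,
# so that pi(x) ∝ exp(-V(x)) has covariance A^{-1}.
A = np.diag([1.0, 100.0])
grad_V = lambda x: A @ x

# Newton-Langevin special case: the mirror map is phi = V, hence
# grad_phi = grad_V, hess_phi(x) = A, and the gradient of the convex
# conjugate phi* inverts grad_phi: grad_phi_conj(y) = A^{-1} y.
grad_phi = grad_V
hess_phi = lambda x: A
grad_phi_conj = lambda y: np.linalg.solve(A, y)

def newton_langevin(x0, h=0.1, n_steps=1000):
    """Euler steps of the mirror-Langevin diffusion in the dual space:
    y <- grad_phi(x) - h * grad_V(x) + sqrt(2h) * hess_phi(x)^{1/2} xi,
    then map back to the primal space via x = grad_phi_conj(y)."""
    x = x0
    for _ in range(n_steps):
        xi = rng.standard_normal(x.shape[0])
        noise = np.linalg.cholesky(hess_phi(x)) @ xi  # matrix square root of the metric
        y = grad_phi(x) - h * grad_V(x) + np.sqrt(2.0 * h) * noise
        x = grad_phi_conj(y)
    return x

samples = np.stack([newton_langevin(rng.standard_normal(2)) for _ in range(500)])
# Empirical mean/variance should be near 0 and diag(A^{-1}) = (1, 0.01),
# up to discretization bias and Monte Carlo error.
print(samples.mean(axis=0), samples.var(axis=0))
```

The exponential rate advertised in the abstract concerns the continuous-time diffusion; the discretization above is only meant to expose the structure of the update (mirror step, dual-space drift, metric-scaled noise).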
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependencies for general score-mismatched diffusion samplers.
We show that score mismatches result in an asymptotic distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis [56.442307356162864]
We study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework.
We introduce a discrete-time sampling algorithm in the general state space $[S]^d$ that utilizes score estimators at predefined time points.
Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function.
arXiv Detail & Related papers (2024-10-03T09:07:13Z) - Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on a manifold.
arXiv Detail & Related papers (2024-07-25T09:53:12Z) - New algorithms for sampling and diffusion models [0.0]
We introduce a novel sampling method for known distributions and a new algorithm for diffusion generative models with unknown distributions.
Our approach is inspired by the concept of the reverse diffusion process, widely adopted in diffusion generative models.
arXiv Detail & Related papers (2024-06-14T02:30:04Z) - Unraveling the Smoothness Properties of Diffusion Models: A Gaussian Mixture Perspective [18.331374727331077]
We provide a theoretical understanding of the Lipschitz continuity and second-moment properties of the diffusion process.
Our results provide deeper theoretical insights into the dynamics of the diffusion process under common data distributions.
arXiv Detail & Related papers (2024-05-26T03:32:27Z) - An Improved Analysis of Langevin Algorithms with Prior Diffusion for
Non-Log-Concave Sampling [27.882407333690267]
We show that the modified Langevin algorithm with prior diffusion converges dimension-independently for strongly log-concave target distributions.
We also prove that the modified algorithm achieves dimension-independent convergence in KL divergence under different step-size schedules.
arXiv Detail & Related papers (2024-03-10T11:50:34Z) - Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian
Mixture Models [59.331993845831946]
Diffusion models benefit from the instillation of task-specific information into the score function to steer sample generation toward desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z) - Symmetric Mean-field Langevin Dynamics for Distributional Minimax
Problems [78.96969465641024]
We extend mean-field Langevin dynamics to minimax optimization over probability distributions, giving for the first time symmetric and provably convergent updates.
We also study time and particle discretization regimes and prove a new uniform-in-time propagation of chaos result.
arXiv Detail & Related papers (2023-12-02T13:01:29Z) - Independent projections of diffusions: Gradient flows for variational inference and optimal mean field approximations [0.0]
This paper presents a construction, called the *independent projection*, which is optimal for two natural criteria.
First, when the original diffusion is reversible with invariant measure $\rho_*$, the independent projection serves as the Wasserstein gradient flow for the relative entropy $H(\cdot \,|\, \rho_*)$ constrained to the space of product measures.
Second, among all processes with independent coordinates, the independent projection is shown to exhibit the slowest growth rate of path-space entropy relative to the original diffusion.
arXiv Detail & Related papers (2023-09-23T10:33:59Z) - Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis of the approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z) - Efficient constrained sampling via the mirror-Langevin algorithm [9.061408029414455]
We propose a new discretization of the mirror-Langevin diffusion and give a crisp proof of its convergence.
For the task of sampling from a log-concave distribution supported on a compact set, our theoretical results are significantly better than the existing guarantees.
arXiv Detail & Related papers (2020-10-30T11:54:24Z)
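For the constrained setting in the last entry (and the convex-body application in the main abstract), a mirror map that blows up at the boundary keeps every iterate feasible. Below is a minimal sketch assuming a coordinate-wise log-barrier on the box (-1,1)^d and a constant potential (uniform target); the barrier, step size, and function names are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Coordinate-wise log-barrier mirror map on the open box (-1, 1)^d:
#   phi(x) = -sum_i log(1 - x_i^2),
# whose gradient blows up at the boundary, confining the iterates.
grad_phi = lambda x: 2.0 * x / (1.0 - x**2)
hess_phi_diag = lambda x: (2.0 + 2.0 * x**2) / (1.0 - x**2) ** 2
# Gradient of the convex conjugate inverts grad_phi (stable closed form):
grad_phi_conj = lambda y: y / (1.0 + np.sqrt(1.0 + y**2))

def uniform_box_sample(d=2, h=0.05, n_steps=2000):
    """Mirror-Langevin steps for the uniform target on (-1,1)^d: the
    potential is constant, so the drift vanishes and each step is
    y <- grad_phi(x) + sqrt(2h) * hess_phi(x)^{1/2} xi, x <- grad_phi_conj(y)."""
    x = np.zeros(d)
    for _ in range(n_steps):
        xi = rng.standard_normal(d)
        y = grad_phi(x) + np.sqrt(2.0 * h) * np.sqrt(hess_phi_diag(x)) * xi
        x = grad_phi_conj(y)
    return x

pts = np.stack([uniform_box_sample() for _ in range(400)])
print(pts.min(axis=0), pts.max(axis=0))  # all iterates lie strictly inside the box
```

With a nonconstant potential one would subtract h * grad_V(x) in the dual update, exactly as in the Newton-Langevin sketch after the main abstract.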