On Independent Samples Along the Langevin Diffusion and the Unadjusted
Langevin Algorithm
- URL: http://arxiv.org/abs/2402.17067v1
- Date: Mon, 26 Feb 2024 23:05:02 GMT
- Title: On Independent Samples Along the Langevin Diffusion and the Unadjusted
Langevin Algorithm
- Authors: Jiaming Liang, Siddharth Mitra, Andre Wibisono
- Abstract summary: We study the rate at which the initial and current random variables become independent along a Markov chain.
We focus on the Langevin diffusion in continuous time and the Unadjusted Langevin Algorithm (ULA) in discrete time.
- Score: 18.595570786973948
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the rate at which the initial and current random variables become
independent along a Markov chain, focusing on the Langevin diffusion in
continuous time and the Unadjusted Langevin Algorithm (ULA) in discrete time.
We measure the dependence between random variables via their mutual
information. For the Langevin diffusion, we show the mutual information
converges to $0$ exponentially fast when the target is strongly log-concave,
and at a polynomial rate when the target is weakly log-concave. These rates are
analogous to the mixing time of the Langevin diffusion under similar
assumptions. For the ULA, we show the mutual information converges to $0$
exponentially fast when the target is strongly log-concave and smooth. We prove
our results by developing the mutual information analogue of the mixing time
analyses of these Markov chains. We also provide alternative proofs based on
strong data processing inequalities for the Langevin diffusion and the ULA, and
on regularity results for these processes in mutual information.
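As a concrete illustration of the chain under study (not from the paper), the ULA iterates $x_{k+1} = x_k - h \nabla f(x_k) + \sqrt{2h}\,\xi_k$ with $\xi_k \sim N(0, I)$ for a target $\pi \propto e^{-f}$. A minimal sketch, assuming a standard-Gaussian (strongly log-concave) target, showing how the chain forgets its initialization:

```python
import numpy as np

def ula_chain(grad_f, x0, step, n_steps, rng):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * xi_k,  xi_k ~ N(0, I)."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Strongly log-concave target: pi(x) ∝ exp(-||x||^2 / 2), a standard Gaussian,
# so grad_f(x) = x.
rng = np.random.default_rng(0)
samples = np.array([ula_chain(lambda x: x, np.full(2, 5.0), step=0.1, n_steps=200, rng=rng)
                    for _ in range(2000)])
print(samples.mean(axis=0))  # nearly independent of the start x0 = (5, 5): close to (0, 0)
```

After 200 steps the contraction factor $(1-h)^{200}$ is negligible, so the samples carry essentially no information about the initial point, mirroring the mutual information decay the paper quantifies.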
Related papers
- An Improved Analysis of Langevin Algorithms with Prior Diffusion for Non-Log-Concave Sampling [27.882407333690267]
We show that the modified Langevin algorithm with prior diffusion converges dimension-independently for strongly log-concave target distributions.
We also prove that it achieves dimension-independent convergence in KL divergence under different step-size schedules.
arXiv Detail & Related papers (2024-03-10T11:50:34Z)
- Time Series Diffusion in the Frequency Domain [54.60573052311487]
We analyze whether representing time series in the frequency domain is a useful inductive bias for score-based diffusion models.
We show that a dual diffusion process occurs in the frequency domain with an important nuance.
We show how to adapt the denoising score matching approach to implement diffusion models in the frequency domain.
arXiv Detail & Related papers (2024-02-08T18:59:05Z)
- Symmetric Mean-field Langevin Dynamics for Distributional Minimax Problems [78.96969465641024]
We extend mean-field Langevin dynamics to minimax optimization over probability distributions for the first time with symmetric and provably convergent updates.
We also study time and particle discretization regimes and prove a new uniform-in-time propagation of chaos result.
arXiv Detail & Related papers (2023-12-02T13:01:29Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
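For context, standard AIS (a textbook sketch, not the paper's constant-rate variant; all names here are illustrative) anneals from a tractable base $p_0$ to the target $p_1$ along tempered densities $p_\beta \propto p_0^{1-\beta} p_1^{\beta}$, accumulating importance weights:

```python
import numpy as np

def ais(log_p0, log_p1, sample_p0, n_particles, betas, step, rng):
    """Annealed Importance Sampling: move particles from p0 toward p1 along
    p_beta ∝ p0^(1-beta) * p1^beta, with one random-walk Metropolis move
    per temperature, accumulating log importance weights."""
    x = sample_p0(n_particles)
    logw = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * (log_p1(x) - log_p0(x))
        log_tgt = lambda z: (1.0 - b) * log_p0(z) + b * log_p1(z)
        prop = x + step * rng.standard_normal(n_particles)
        accept = np.log(rng.random(n_particles)) < log_tgt(prop) - log_tgt(x)
        x = np.where(accept, prop, x)
    return x, logw

# Base N(0, 1), target N(3, 1): both unnormalized with the same constant.
rng = np.random.default_rng(0)
x, logw = ais(log_p0=lambda z: -0.5 * z**2,
              log_p1=lambda z: -0.5 * (z - 3.0)**2,
              sample_p0=lambda n: rng.standard_normal(n),
              n_particles=2000, betas=np.linspace(0.0, 1.0, 101),
              step=0.5, rng=rng)
w = np.exp(logw - logw.max())
w /= w.sum()
est_mean = np.sum(w * x)
print(est_mean)  # close to 3.0, the target mean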
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Subsampling Error in Stochastic Gradient Langevin Diffusions [0.7900303955225063]
Stochastic Gradient Langevin Dynamics (SGLD) is widely used to approximate Bayesian posterior distributions in statistical learning procedures with large-scale data.
Two errors arise: the first is introduced by the Euler--Maruyama discretisation of a Langevin diffusion process, and the second comes from the data subsampling that enables its use in large-scale data settings.
We show the exponential ergodicity of SGLDiff and that the Wasserstein distance between the posterior and the limiting distribution of SGLDiff is bounded above by a fractional power of the mean waiting time.
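A minimal SGLD sketch (illustrative only; the function names and the toy conjugate-Gaussian model are assumptions, not from the paper) that exhibits both error sources, the Euler--Maruyama discretisation and the minibatch gradient estimate:

```python
import numpy as np

def sgld_step(theta, data, grad_log_prior, grad_log_lik, step, batch_size, rng):
    """One SGLD update: Euler--Maruyama step driven by a minibatch
    estimate of the full-data gradient of the log posterior."""
    idx = rng.choice(len(data), size=batch_size, replace=False)
    scale = len(data) / batch_size  # rescale the subsampled sum to keep it unbiased
    grad = grad_log_prior(theta) + scale * sum(grad_log_lik(theta, data[i]) for i in idx)
    return theta + 0.5 * step * grad + np.sqrt(step) * rng.standard_normal(theta.shape)

# Toy conjugate model: prior N(0, 1), likelihood N(theta, 1);
# the exact posterior mean is sum(data) / (len(data) + 1).
rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=100)
theta, trace = np.zeros(1), []
for _ in range(5000):
    theta = sgld_step(theta, data,
                      grad_log_prior=lambda th: -th,
                      grad_log_lik=lambda th, x: x - th,
                      step=0.005, batch_size=10, rng=rng)
    trace.append(theta[0])
print(np.mean(trace[1000:]), data.sum() / (len(data) + 1.0))  # close to each other
```

With a small step size the discretisation bias is modest, and the minibatch rescaling keeps the gradient estimate unbiased, so the post-burn-in average tracks the exact posterior mean of this conjugate model.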
arXiv Detail & Related papers (2023-05-23T10:03:40Z)
- Resolving the Mixing Time of the Langevin Algorithm to its Stationary Distribution for Log-Concave Sampling [34.66940399825547]
This paper characterizes the mixing time of the Langevin Algorithm to its stationary distribution.
We introduce a technique from the differential privacy literature to the sampling literature.
arXiv Detail & Related papers (2022-10-16T05:11:16Z)
- Diffusion-GAN: Training GANs with Diffusion [135.24433011977874]
Generative adversarial networks (GANs) are challenging to train stably.
We propose Diffusion-GAN, a novel GAN framework that leverages a forward diffusion chain to generate instance noise.
We show that Diffusion-GAN can produce more realistic images with higher stability and data efficiency than state-of-the-art GANs.
arXiv Detail & Related papers (2022-06-05T20:45:01Z)
- Time-inhomogeneous diffusion geometry and topology [69.55228523791897]
Diffusion condensation is a time-inhomogeneous process where each step first computes and then applies a diffusion operator to the data.
We theoretically analyze the convergence and evolution of this process from geometric, spectral, and topological perspectives.
Our work gives theoretical insights into the convergence of diffusion condensation, and shows that it provides a link between topological and geometric data analysis.
arXiv Detail & Related papers (2022-03-28T16:06:17Z)
- Spread of Correlations in Strongly Disordered Lattice Systems with Long-Range Coupling [0.0]
We investigate the spread of correlations carried by an excitation in a 1-dimensional lattice system with high on-site energy disorder.
The increase in correlation between the initially quenched node and a given node exhibits three phases: quadratic in time, linear in time, and saturation.
arXiv Detail & Related papers (2021-06-15T15:47:20Z)
- Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling [110.88857917726276]
We provide a new convergence analysis of stochastic gradient Langevin dynamics (SGLD) for sampling from a class of distributions that can be non-log-concave.
At the core of our approach is a novel conductance analysis of SGLD using an auxiliary time-reversible Markov Chain.
arXiv Detail & Related papers (2020-10-19T15:23:18Z)
- Exponential ergodicity of mirror-Langevin diffusions [16.012656579770827]
We propose a class of diffusions called Newton-Langevin diffusions and prove that they converge to stationarity exponentially fast.
We give an application to the problem of sampling from the uniform distribution on a convex body using a strategy inspired by interior-point methods.
arXiv Detail & Related papers (2020-05-19T18:00:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.