Wasserstein Convergence of Critically Damped Langevin Diffusions
- URL: http://arxiv.org/abs/2511.02419v1
- Date: Tue, 04 Nov 2025 09:49:07 GMT
- Title: Wasserstein Convergence of Critically Damped Langevin Diffusions
- Authors: Stanislas Strasman, Sobihan Surendran, Claire Boyer, Sylvain Le Corff, Vincent Lemaire, Antonio Ocello
- Abstract summary: Critically-damped Langevin Diffusions (CLDs) have been shown to outperform standard diffusion-based samplers numerically. We derive a novel upper bound on the sampling error of CLD-based generative models in the Wasserstein metric.
- Score: 9.063081094420044
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Score-based Generative Models (SGMs) have achieved impressive performance in data generation across a wide range of applications and benefit from strong theoretical guarantees. Recently, methods inspired by statistical mechanics, in particular, Hamiltonian dynamics, have introduced Critically-damped Langevin Diffusions (CLDs), which define diffusion processes on extended spaces by coupling the data with auxiliary variables. These approaches, along with their associated score-matching and sampling procedures, have been shown to outperform standard diffusion-based samplers numerically. In this paper, we analyze a generalized dynamic that extends classical CLDs by introducing an additional hyperparameter controlling the noise applied to the data coordinate, thereby better exploiting the extended space. We further derive a novel upper bound on the sampling error of CLD-based generative models in the Wasserstein metric. This additional hyperparameter influences the smoothness of sample paths, and our discretization error analysis provides practical guidance for its tuning, leading to improved sampling performance.
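To make the generalized dynamic concrete, here is a minimal sketch of an Euler-Maruyama discretization of a CLD-type forward process on the extended space (data, velocity), with a hypothetical hyperparameter `eps_x` controlling the noise injected into the data coordinate. The names `beta`, `mass`, `gamma`, `eps_x` and their defaults are illustrative choices, not the paper's notation.

```python
import numpy as np

def cld_forward_step(x, v, dt, beta=4.0, mass=0.25, gamma=1.0, eps_x=0.0, rng=None):
    """One Euler-Maruyama step of a critically-damped Langevin forward SDE on the
    extended space (x, v). `eps_x` adds noise directly to the data coordinate,
    illustrating the extra hyperparameter discussed in the paper; all names and
    default values are illustrative, not the paper's."""
    rng = np.random.default_rng() if rng is None else rng
    # Hamiltonian-style coupling: the data coordinate moves along the velocity ...
    dx = beta * (v / mass) * dt
    # ... while the velocity feels a restoring force and friction.
    dv = beta * (-x - gamma * v / mass) * dt
    # Noise on the velocity (standard CLD) and, optionally, on the data coordinate.
    x_new = x + dx + np.sqrt(2.0 * eps_x * beta * dt) * rng.standard_normal(x.shape)
    v_new = v + dv + np.sqrt(2.0 * gamma * beta * dt) * rng.standard_normal(v.shape)
    return x_new, v_new

# Usage: diffuse a batch of 1-D data points with auxiliary velocities v ~ N(0, mass * I).
x = np.random.randn(128, 1)
v = np.sqrt(0.25) * np.random.randn(128, 1)
for _ in range(1000):
    x, v = cld_forward_step(x, v, dt=1e-3)
```

Setting `eps_x = 0` recovers a standard CLD-style forward process in which only the velocity is driven by Brownian noise.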
Related papers
- Sharp Convergence Rates for Masked Diffusion Models [53.117058231393834]
We develop a total-variation based analysis for the Euler method that overcomes limitations of prior analyses. Our results relax assumptions on score estimation, improve parameter dependencies, and establish convergence guarantees. Overall, our analysis introduces a direct TV-based error decomposition along the CTMC trajectory and a decoupling-based path-wise analysis for FHS.
arXiv Detail & Related papers (2026-02-26T00:47:51Z) - G$^2$RPO: Granular GRPO for Precise Reward in Flow Models [74.21206048155669]
We propose a novel Granular-GRPO (G$^2$RPO) framework that achieves precise and comprehensive reward assessments of sampling directions. We introduce a Multi-Granularity Advantage Integration module that aggregates advantages computed at multiple diffusion scales. Our G$^2$RPO significantly outperforms existing flow-based GRPO baselines.
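As a rough illustration of group-relative advantages aggregated across several scales, the sketch below uses the standard GRPO group-normalized advantage; the aggregation weights and function names are assumptions, not the paper's actual Multi-Granularity Advantage Integration module.

```python
import numpy as np

def grpo_advantages(rewards):
    """Standard GRPO-style group-normalized advantages: each sample's reward
    relative to the group mean, scaled by the group standard deviation."""
    r = np.asarray(rewards, dtype=float)
    return (r - r.mean()) / (r.std() + 1e-8)

def multi_scale_advantage(rewards_per_scale, weights=None):
    """Hypothetical aggregation: average per-scale advantages, optionally with
    per-scale weights; only a sketch of multi-granularity integration."""
    adv = np.stack([grpo_advantages(r) for r in rewards_per_scale])
    w = np.ones(len(adv)) / len(adv) if weights is None else np.asarray(weights)
    return np.tensordot(w, adv, axes=1)

# Usage: rewards for 4 samples scored at 3 diffusion scales.
print(multi_scale_advantage([[1.0, 0.2, 0.5, 0.9],
                             [0.8, 0.1, 0.6, 0.7],
                             [0.9, 0.3, 0.4, 1.0]]))
```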
arXiv Detail & Related papers (2025-10-02T12:57:12Z) - A Proportional-Integral Controller-Incorporated SGD Algorithm for High Efficient Latent Factor Analysis [1.4719924357068723]
The stochastic gradient descent-based latent factor analysis (SGD-LFA) method can effectively extract deep feature information embedded in high-dimensional and incomplete (HDI) matrices. This paper proposes a PILF model built on a PI-accelerated SGD algorithm that integrates correlated instances and refines learning errors through a proportional-integral (PI) control mechanism.
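A minimal sketch of a PI-style update, assuming the controller acts on the stochastic gradient with a proportional term (the current gradient) and an integral term (accumulated gradients); the gains `kp`, `ki` and the overall structure are illustrative, not the paper's exact PILF algorithm.

```python
import numpy as np

def pi_sgd(grad_fn, theta, lr=0.01, kp=1.0, ki=0.1, steps=1000):
    """Gradient descent where each step combines a proportional term (the current
    gradient) and an integral term (a running sum of past gradients), in the
    spirit of a PI controller; a sketch, not the referenced paper's method."""
    integral = np.zeros_like(theta)
    for _ in range(steps):
        g = grad_fn(theta)          # current "error" signal
        integral += g               # accumulated error
        theta = theta - lr * (kp * g + ki * integral)
    return theta

# Usage: minimize a simple quadratic f(theta) = ||theta - 3||^2.
theta = pi_sgd(lambda t: 2.0 * (t - 3.0), theta=np.zeros(2))
print(theta)  # approaches [3., 3.]
```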
arXiv Detail & Related papers (2025-08-25T02:39:23Z) - Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z) - Insights into Closed-form IPM-GAN Discriminator Guidance for Diffusion Modeling [11.68361062474064]
We propose a theoretical framework to analyze the effect of the GAN discriminator on Langevin-based sampling. We show that the proposed approach can be combined with existing accelerated-diffusion techniques to improve latent-space image generation.
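For context, a generic sketch of discriminator-guided Langevin sampling: the learned score is corrected by the gradient of the discriminator's logit, an estimate of the log density ratio between data and model. This is the standard discriminator-guidance idea, not the cited paper's exact construction; the function and argument names are assumptions.

```python
import torch

def guided_langevin_step(x, score_model, discriminator, t, step_size):
    """One unadjusted Langevin step where the learned score is corrected by the
    gradient of the discriminator's logit (approx. grad log(p_data / p_model));
    a generic discriminator-guidance sketch."""
    x = x.detach().requires_grad_(True)
    logit = discriminator(x, t)                  # approx. log(D / (1 - D))
    grad_logit, = torch.autograd.grad(logit.sum(), x)
    drift = score_model(x, t) + grad_logit       # corrected score
    noise = torch.randn_like(x)
    return (x + step_size * drift + (2.0 * step_size) ** 0.5 * noise).detach()

# Usage with toy stand-ins for the score and discriminator networks.
x = torch.randn(16, 2)
score_model = lambda x, t: -x                        # score of N(0, I)
discriminator = lambda x, t: -(x ** 2).sum(dim=-1)   # toy logit
x = guided_langevin_step(x, score_model, discriminator, t=0.5, step_size=0.01)
```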
arXiv Detail & Related papers (2023-06-02T16:24:07Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
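Concretely, for a Gaussian-smoothed empirical distribution the score has a closed form that is exactly a mean-shift vector toward a softmax-weighted average of the data, which is what underlies this kind of connection. The sketch below is illustrative, not the cited paper's derivation; the step parameterization follows the usual variance-exploding probability-flow ODE.

```python
import numpy as np

def empirical_score(x, data, sigma):
    """Score of the empirical distribution smoothed with N(0, sigma^2 I):
    grad log p_sigma(x) = (weighted_mean(data) - x) / sigma^2, where the weights
    are a softmax of -||x - y_i||^2 / (2 sigma^2), i.e. a mean-shift vector."""
    d2 = ((x[None, :] - data) ** 2).sum(axis=1)
    w = np.exp(-(d2 - d2.min()) / (2.0 * sigma ** 2))
    w /= w.sum()
    return (w @ data - x) / sigma ** 2

def ve_ode_step(x, data, sigma, sigma_next):
    """One Euler step of the variance-exploding probability-flow ODE,
    dx/dsigma = -sigma * grad log p_sigma(x); a generic sketch."""
    return x + (sigma_next - sigma) * (-sigma * empirical_score(x, data, sigma))

# Usage: one denoising step toward a toy two-point dataset.
data = np.array([[0.0, 0.0], [4.0, 4.0]])
x = np.array([5.0, 3.0])
print(ve_ode_step(x, data, sigma=2.0, sigma_next=1.5))
```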
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
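As a rough illustration of sampling constrained to the data support, here is a reflected Euler-Maruyama step on the unit hypercube. The reflect-at-the-boundary trick is generic; the actual Reflected Diffusion Models construction and its generalized score matching loss are more involved.

```python
import numpy as np

def reflect_to_unit_cube(x):
    """Fold coordinates back into [0, 1] by reflection at the boundaries."""
    x = np.mod(x, 2.0)
    return np.where(x > 1.0, 2.0 - x, x)

def reflected_langevin_step(x, score, step_size, rng=None):
    """One Langevin / Euler-Maruyama step followed by reflection, so iterates stay
    in [0, 1]^d; a generic sketch of reflected sampling, not the paper's exact
    reverse process."""
    rng = np.random.default_rng() if rng is None else rng
    x_new = x + step_size * score(x) + np.sqrt(2.0 * step_size) * rng.standard_normal(x.shape)
    return reflect_to_unit_cube(x_new)

# Usage: sample around 0.5 inside the unit square with a toy score.
x = np.full((8, 2), 0.9)
for _ in range(100):
    x = reflected_langevin_step(x, score=lambda z: -(z - 0.5), step_size=0.01)
```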
arXiv Detail & Related papers (2023-04-10T17:54:38Z) - Score-Based Generative Modeling with Critically-Damped Langevin Diffusion [18.82116696636531]
Current score-based generative models (SGMs) rely on a diffusion process that gradually perturbs the data towards a tractable distribution.
We argue that current SGMs employ overly simplistic diffusions, leading to unnecessarily complex denoising processes.
We propose a novel critically-damped Langevin diffusion (CLD) and show that CLD-based SGMs achieve superior performance.
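For reference, and up to notational conventions, the CLD forward process couples the data $x_t$ with a velocity $v_t$ and chooses the friction so that the underlying oscillator is critically damped:

```latex
\begin{aligned}
\mathrm{d}x_t &= M^{-1} v_t \,\beta\, \mathrm{d}t, \\
\mathrm{d}v_t &= -x_t \,\beta\, \mathrm{d}t \;-\; \Gamma M^{-1} v_t \,\beta\, \mathrm{d}t
                \;+\; \sqrt{2\Gamma\beta}\,\mathrm{d}W_t,
\qquad \Gamma^2 = 4M \ \text{(critical damping)}.
\end{aligned}
```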
arXiv Detail & Related papers (2021-12-14T00:01:34Z) - Generative Modeling with Denoising Auto-Encoders and Langevin Sampling [88.83704353627554]
We show that both DAE and DSM provide estimates of the score of the smoothed population density.
We then apply our results to the homotopy method of arXiv:1907.05600 and provide theoretical justification for its empirical success.
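In equations, the denoising score matching objective at noise level $\sigma$ is minimized by the score of the smoothed density $p_\sigma = p \ast \mathcal{N}(0, \sigma^2 I)$, which is what justifies subsequent Langevin sampling with the learned network; this is the standard DSM identity, stated here for context rather than a result specific to the cited paper:

```latex
\min_{\theta}\; \mathbb{E}_{x \sim p,\; \tilde{x} \sim \mathcal{N}(x, \sigma^2 I)}
\left\| s_\theta(\tilde{x}) + \frac{\tilde{x} - x}{\sigma^2} \right\|^2
\quad\Longrightarrow\quad
s_{\theta^\star}(\tilde{x}) = \nabla_{\tilde{x}} \log p_\sigma(\tilde{x}).
```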
arXiv Detail & Related papers (2020-01-31T23:50:03Z)