Error Propagation and Model Collapse in Diffusion Models: A Theoretical Study
- URL: http://arxiv.org/abs/2602.16601v1
- Date: Wed, 18 Feb 2026 16:56:36 GMT
- Title: Error Propagation and Model Collapse in Diffusion Models: A Theoretical Study
- Authors: Nail B. Khelifa, Richard E. Turner, Ramji Venkataramanan
- Abstract summary: Recursively training on synthetic data has been observed to significantly degrade performance in a wide range of tasks. We theoretically analyze this phenomenon in the setting of score-based diffusion models.
- Score: 27.894241484593735
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine learning models are increasingly trained or fine-tuned on synthetic data. Recursively training on such data has been observed to significantly degrade performance in a wide range of tasks, often characterized by a progressive drift away from the target distribution. In this work, we theoretically analyze this phenomenon in the setting of score-based diffusion models. For a realistic pipeline where each training round uses a combination of synthetic data and fresh samples from the target distribution, we obtain upper and lower bounds on the accumulated divergence between the generated and target distributions. This allows us to characterize different regimes of drift, depending on the score estimation error and the proportion of fresh data used in each generation. We also provide empirical results on synthetic data and images to illustrate the theory.
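The pipeline the abstract describes — each training round fits a new model on a mix of fresh target samples and synthetic samples from the previous model — can be illustrated with a toy stand-in. The sketch below is not the paper's diffusion model: it replaces score estimation with a simple Gaussian fit, and the fresh-data fraction `lam` is a hypothetical knob standing in for the paper's proportion of fresh data, but it shows the same mechanism of accumulated divergence across generations.

```python
import numpy as np

# Toy illustration (not the paper's method): each round fits a Gaussian
# "generator" to a mix of fresh samples from the target N(0, 1) and
# synthetic samples drawn from the previous round's model, then records
# the KL divergence between the fitted model and the target.

rng = np.random.default_rng(0)

def kl_gauss(mu, var):
    """KL( N(mu, var) || N(0, 1) )."""
    return 0.5 * (var + mu**2 - 1.0 - np.log(var))

def recursive_training(lam, n=2000, rounds=20):
    """Run `rounds` generations; `lam` is the fraction of fresh data."""
    mu, var = 0.0, 1.0                      # round-0 model = target
    divs = []
    for _ in range(rounds):
        fresh = rng.normal(0.0, 1.0, size=int(lam * n))
        synth = rng.normal(mu, np.sqrt(var), size=n - len(fresh))
        data = np.concatenate([fresh, synth])
        mu, var = data.mean(), data.var()   # "train" the next model
        divs.append(kl_gauss(mu, var))
    return divs

drift_no_fresh = recursive_training(lam=0.0)  # fully self-consuming
drift_mixed = recursive_training(lam=0.5)     # half fresh data each round
```

With `lam=0.5` the fitted parameters contract back toward the target each round, so the divergence stays bounded at the level of the per-round estimation noise; with `lam=0.0` the errors compound freely across generations, mirroring the drift regimes the abstract distinguishes.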
Related papers
- From Collapse to Improvement: Statistical Perspectives on the Evolutionary Dynamics of Iterative Training on Contaminated Sources [2.8647133890966994]
This paper examines the problem of model collapse from a statistical viewpoint. We consider iterative training on samples sourced from a mixture of the true target and synthetic distributions. Provided the mixture weight of the true distribution does not vanish, even if it decays over time, simply training the model in a contamination-agnostic manner can avoid collapse.
arXiv Detail & Related papers (2026-02-11T05:01:46Z) - Diffusion models under low-noise regime [3.729242965449096]
We show that diffusion models are effective denoisers when the corruption level is small. We quantify how training set size, data geometry, and the choice of model objective shape denoising trajectories. This work begins to address gaps in our understanding of generative-model reliability in practical applications.
arXiv Detail & Related papers (2025-06-09T15:07:16Z) - Diffusion Attribution Score: Evaluating Training Data Influence in Diffusion Models [22.39558434131574]
Existing data attribution methods for diffusion models typically quantify the contribution of a training sample. We argue that directly using the diffusion loss cannot represent such a contribution accurately, due to how the diffusion loss is calculated. We propose the Diffusion Attribution Score (DAS), which measures attribution by directly comparing predicted distributions.
arXiv Detail & Related papers (2024-10-24T10:58:17Z) - Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependencies for general score-mismatched diffusion samplers. We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions. This result applies directly to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z) - Constrained Diffusion Models via Dual Training [80.03953599062365]
Diffusion processes are prone to generating samples that reflect biases in a training dataset.
We develop constrained diffusion models by imposing diffusion constraints based on desired distributions.
We show that our constrained diffusion models generate new data from a mixture data distribution that achieves the optimal trade-off between the objective and the constraints.
arXiv Detail & Related papers (2024-08-27T14:25:42Z) - Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z) - Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Training Class-Imbalanced Diffusion Model Via Overlap Optimization [55.96820607533968]
Diffusion models trained on real-world datasets often yield inferior fidelity for tail classes.
Deep generative models, including diffusion models, are biased towards classes with abundant training images.
We propose a method based on contrastive learning to minimize the overlap between distributions of synthetic images for different classes.
arXiv Detail & Related papers (2024-02-16T16:47:21Z) - On the Limitation of Diffusion Models for Synthesizing Training Datasets [5.384630221560811]
This paper investigates the gap between synthetic and real samples by analyzing the synthetic samples reconstructed from real samples through the diffusion and reverse process.
We found that the synthetic datasets degrade classification performance over real datasets even when using state-of-the-art diffusion models.
arXiv Detail & Related papers (2023-11-22T01:42:23Z) - Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.