Resolving Memorization in Empirical Diffusion Model for Manifold Data in High-Dimensional Spaces
- URL: http://arxiv.org/abs/2505.02508v3
- Date: Sat, 02 Aug 2025 09:46:23 GMT
- Title: Resolving Memorization in Empirical Diffusion Model for Manifold Data in High-Dimensional Spaces
- Authors: Yang Lyu, Tan Minh Nguyen, Yuchun Qian, Xin T. Tong
- Abstract summary: When the data distribution consists of n points, empirical diffusion models tend to reproduce existing data points. This work shows that the memorization issue can be solved simply by applying an inertia update at the end of the empirical diffusion simulation. We demonstrate that the distribution of samples from this model approximates the true data distribution on a $C^2$ manifold of dimension $d$, within a Wasserstein-1 distance of order $O(n^{-\frac{2}{d+4}})$.
- Score: 5.716752583983991
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion models are popular tools for generating new data samples, using a forward process that adds noise to data and a reverse process to denoise and produce samples. However, when the data distribution consists of n points, empirical diffusion models tend to reproduce existing data points, a phenomenon known as the memorization effect. Current literature often addresses this with complex machine learning techniques. This work shows that the memorization issue can be solved simply by applying an inertia update at the end of the empirical diffusion simulation. Our inertial diffusion model requires only the empirical score function and no additional training. We demonstrate that the distribution of samples from this model approximates the true data distribution on a $C^2$ manifold of dimension $d$, within a Wasserstein-1 distance of order $O(n^{-\frac{2}{d+4}})$. This bound significantly shrinks the Wasserstein distance between the population and empirical distributions, confirming that the inertial diffusion model produces new and diverse samples. Remarkably, this estimate is independent of the ambient space dimension, as no further training is needed. Our analysis shows that the inertial diffusion samples resemble Gaussian kernel density estimations on the manifold, revealing a novel connection between diffusion models and manifold learning.
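The recipe in the abstract is concrete enough to sketch without any training: for $n$ data points, the forward-noised empirical distribution is a Gaussian mixture, so its score is available in closed form. Below is a minimal NumPy sketch under stated assumptions: an Ornstein-Uhlenbeck forward process, probability-flow reverse integration, and a final inertia step that simply coasts along the last reverse-time velocity. The function names and the exact form of the inertia update are guesses at the spirit of the abstract, not the paper's actual algorithm.

```python
import numpy as np

def empirical_score(x, data, t):
    """Closed-form score of the OU-noised empirical distribution
    (1/n) * sum_i N(e^{-t} x_i, (1 - e^{-2t}) I): a Gaussian-mixture score."""
    a = np.exp(-t)                           # mean shrinkage e^{-t}
    var = 1.0 - np.exp(-2.0 * t)             # noise variance 1 - e^{-2t}
    diffs = a * data - x                     # shape (n, D)
    logits = -np.sum(diffs ** 2, axis=1) / (2.0 * var)
    w = np.exp(logits - logits.max())        # numerically stable softmax
    w /= w.sum()
    return (w[:, None] * diffs).sum(axis=0) / var

def sample_inertial(data, T=5.0, tau=1e-2, n_steps=500, seed=None):
    """Reverse probability-flow ODE with the empirical score, stopped at
    time tau, then one inertia step. Without the final line the sampler
    collapses onto the n training points (the memorization effect)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(data.shape[1])   # start near stationary N(0, I)
    dt = (T - tau) / (n_steps - 1)
    v = np.zeros_like(x)
    for t in np.linspace(T, tau, n_steps):
        v = x + empirical_score(x, data, t)  # reverse-time velocity
        x = x + dt * v                       # Euler step, t decreasing
    return x + tau * v                       # inertia update (assumed form)
```

On a toy manifold dataset (say, $n$ points on a circle embedded in $\mathbb{R}^D$), dropping the last line reproduces training points, while keeping it spreads samples along the manifold, consistent with the abstract's Gaussian-KDE picture.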
Related papers
- Multimodal Atmospheric Super-Resolution With Deep Generative Models [0.0]
Score-based diffusion modeling is a generative machine learning algorithm that can be used to sample from complex distributions. In this article, we apply such a concept to the super-resolution of a high-dimensional dynamical system, given the real-time availability of low-resolution and experimentally observed sparse sensor measurements.
arXiv Detail & Related papers (2025-06-28T06:47:09Z) - Progressive Inference-Time Annealing of Diffusion Models for Sampling from Boltzmann Densities [85.83359661628575]
We propose Progressive Inference-Time Annealing (PITA) to learn diffusion-based samplers. PITA combines two complementary techniques: annealing of the Boltzmann distribution and diffusion smoothing. It enables equilibrium sampling of N-body particle systems, Alanine Dipeptide, and tripeptides in Cartesian coordinates.
arXiv Detail & Related papers (2025-06-19T17:14:22Z) - Continuous Diffusion Model for Language Modeling [57.396578974401734]
Existing continuous diffusion models for discrete data have limited performance compared to discrete approaches. We propose a continuous diffusion model for language modeling that incorporates the geometry of the underlying categorical distribution.
arXiv Detail & Related papers (2025-02-17T08:54:29Z) - Non-Normal Diffusion Models [3.5534933448684134]
Diffusion models generate samples by incrementally reversing a process that turns data into noise. We show that when the step size goes to zero, the reversed process is invariant to the distribution of these increments. We demonstrate the effectiveness of these models on density estimation and generative modeling tasks on standard image datasets.
arXiv Detail & Related papers (2024-12-10T21:31:12Z) - Convergence of Diffusion Models Under the Manifold Hypothesis in High-Dimensions [6.9408143976091745]
Denoising Diffusion Probabilistic Models (DDPM) are powerful state-of-the-art methods used to generate synthetic data from high-dimensional data distributions. We study DDPMs under the manifold hypothesis and prove that they achieve rates independent of the ambient dimension in terms of score learning. In terms of sampling complexity, we obtain rates independent of the ambient dimension w.r.t. the Kullback-Leibler divergence, and $O(\sqrt{D})$ w.r.t. the Wasserstein distance.
arXiv Detail & Related papers (2024-09-27T14:57:18Z) - Constrained Diffusion Models via Dual Training [80.03953599062365]
Diffusion processes are prone to generating samples that reflect biases in a training dataset.
We develop constrained diffusion models by imposing diffusion constraints based on desired distributions.
We show that our constrained diffusion models generate new data from a mixture data distribution that achieves the optimal trade-off between the objective and constraints.
arXiv Detail & Related papers (2024-08-27T14:25:42Z) - Particle Denoising Diffusion Sampler [32.310922004771776]
Particle Denoising Diffusion Sampler (PDDS) provides consistent estimates under mild assumptions.
We demonstrate PDDS on multimodal and high-dimensional sampling tasks.
arXiv Detail & Related papers (2024-02-09T11:01:35Z) - Lecture Notes in Probabilistic Diffusion Models [0.5361320134021585]
Diffusion models are loosely modelled on non-equilibrium thermodynamics.
The diffusion model learns the data manifold to which the original and thus the reconstructed data samples belong.
Unlike variational autoencoders and flow models, diffusion models have latent variables with the same dimensionality as the original data (see the forward-noising sketch after this list).
arXiv Detail & Related papers (2023-12-16T09:36:54Z) - Renormalizing Diffusion Models [0.7252027234425334]
We use diffusion models to learn inverse renormalization group flows of statistical and quantum field theories.
Our work provides an interpretation of multiscale diffusion models, and gives physically inspired suggestions for diffusion models which should have novel properties.
arXiv Detail & Related papers (2023-08-23T18:02:31Z) - On Error Propagation of Diffusion Models [77.91480554418048]
We develop a theoretical framework to mathematically formulate error propagation in the architecture of DMs.
We apply the cumulative error as a regularization term to reduce error propagation.
Our proposed regularization reduces error propagation, significantly improves vanilla DMs, and outperforms previous baselines.
arXiv Detail & Related papers (2023-08-09T15:31:17Z) - Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative Models [49.81937966106691]
We develop a suite of non-asymptotic theory towards understanding the data generation process of diffusion models.
In contrast to prior works, our theory is developed based on an elementary yet versatile non-asymptotic approach.
arXiv Detail & Related papers (2023-06-15T16:30:08Z) - Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis on approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z) - Denoising Diffusion Samplers [41.796349001299156]
Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains.
We explore a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants.
While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling.
arXiv Detail & Related papers (2023-02-27T14:37:16Z) - Where to Diffuse, How to Diffuse, and How to Get Back: Automated Learning for Multivariate Diffusions [22.04182099405728]
Diffusion-based generative models (DBGMs) perturb data to a target noise distribution and reverse this inference diffusion process to generate samples.
We show how to maximize a lower-bound on the likelihood for any number of auxiliary variables.
We then demonstrate how to parameterize the diffusion for a specified target noise distribution.
arXiv Detail & Related papers (2023-02-14T18:57:04Z) - Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z) - How Much is Enough? A Study on Diffusion Times in Score-based Generative Models [76.76860707897413]
Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution.
We show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process.
arXiv Detail & Related papers (2022-06-10T15:09:46Z)
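As a companion to the Lecture Notes entry above: the claim that diffusion latents share the data's dimensionality is visible in the standard DDPM forward-noising formula $q(x_t \mid x_0) = \mathcal{N}(\sqrt{\bar\alpha_t}\, x_0, (1-\bar\alpha_t) I)$. A minimal sketch of that textbook formula (not code from any paper listed here):

```python
import numpy as np

def forward_noise(x0, alpha_bar_t, seed=None):
    """Sample x_t ~ N(sqrt(alpha_bar_t) * x0, (1 - alpha_bar_t) * I).
    x_t has exactly the same shape as x0: diffusion latents live in the
    data space, unlike the compressed latents of a VAE."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar_t) * x0 + np.sqrt(1.0 - alpha_bar_t) * eps
```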