Spectral Analysis of Diffusion Models with Application to Schedule Design
- URL: http://arxiv.org/abs/2502.00180v2
- Date: Sat, 31 May 2025 06:33:54 GMT
- Title: Spectral Analysis of Diffusion Models with Application to Schedule Design
- Authors: Roi Benita, Michael Elad, Joseph Keshet
- Abstract summary: Diffusion models (DMs) have emerged as powerful tools for modeling complex data distributions. We offer a novel analysis of the DM's inference process, introducing a comprehensive frequency response perspective. We demonstrate how the proposed analysis can be leveraged to design a noise schedule that aligns effectively with the characteristics of the data.
- Score: 23.105365495914644
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Diffusion models (DMs) have emerged as powerful tools for modeling complex data distributions and generating realistic new samples. Over the years, advanced architectures and sampling methods have been developed to make these models practically usable. However, certain synthesis process decisions still rely on heuristics without a solid theoretical foundation. In our work, we offer a novel analysis of the DM's inference process, introducing a comprehensive frequency response perspective. Specifically, by relying on a Gaussianity assumption, we present the inference process as a closed-form spectral transfer function, capturing how the generated signal evolves in response to the initial noise. We demonstrate how the proposed analysis can be leveraged to design a noise schedule that aligns effectively with the characteristics of the data. The spectral perspective also provides insights into the underlying dynamics and sheds light on the relationship between spectral properties and noise schedule structure. Our results lead to scheduling curves that depend on the spectral content of the data, offering a theoretical justification for some of the heuristic choices made by practitioners.
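The abstract does not spell out the transfer function itself, but the idea admits a compact illustration. Below is a minimal NumPy sketch, not the paper's derivation: assuming a zero-mean Gaussian prior with a per-frequency power spectrum `lam` and a variance-exploding schedule `sigmas` (both hypothetical placeholders), the MMSE denoiser becomes a per-frequency Wiener filter, and a deterministic DDIM-style sampler reduces to a product of scalar gains per frequency.

```python
import numpy as np

# Hypothetical per-frequency prior variances (a 1/f^2-like power spectrum
# over 64 frequency bins) -- a stand-in for the data's spectral content.
lam = 1.0 / np.arange(1, 65) ** 2

# Hypothetical variance-exploding noise schedule, sigma_max down to sigma_min.
sigmas = np.geomspace(80.0, 0.01, num=50)

# For a zero-mean Gaussian prior, the MMSE denoiser is a per-frequency
# Wiener filter: E[x0 | x_t] = lam / (lam + sigma_t^2) * x_t, so each
# deterministic DDIM-style step x_prev = x0_hat + (s_prev / s_t) * (x_t - x0_hat)
# multiplies every frequency by a scalar gain.
gain = np.ones_like(lam)
for s_t, s_prev in zip(sigmas[:-1], sigmas[1:]):
    a = lam / (lam + s_t**2)                 # Wiener shrinkage at level s_t
    gain *= a + (s_prev / s_t) * (1.0 - a)   # per-frequency step gain

# Closed-form transfer from the initial noise x_T = sigma_max * eps to the sample.
transfer = sigmas[0] * gain
print(transfer[:5], transfer[-5:])  # low frequencies pass with larger amplitude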
Related papers
- Spectral Regularization for Diffusion Models [14.919876123456747]
We propose a loss-level spectral regularization framework that augments standard diffusion training with differentiable Fourier- and wavelet-domain losses. Our approach is compatible with DDPM, DDIM, and EDM formulations and introduces negligible computational overhead.
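The abstract does not give the exact loss; as one plausible instantiation (function name and weighting `alpha` are hypothetical), the PyTorch sketch below adds a differentiable Fourier-magnitude penalty on top of a standard pixel-space MSE:

```python
import torch

def diffusion_loss_with_spectral_term(pred, target, alpha=0.1):
    """Hypothetical sketch: standard pixel-space MSE plus a differentiable
    Fourier-magnitude penalty, so the model is also graded on how well it
    reproduces the target's frequency content. `alpha` weights the two terms.
    `pred` and `target` are image batches of shape (B, C, H, W)."""
    mse = torch.mean((pred - target) ** 2)                   # usual DDPM-style term
    mag_pred = torch.fft.rfft2(pred, norm="ortho").abs()     # magnitude spectra
    mag_target = torch.fft.rfft2(target, norm="ortho").abs()
    spectral = torch.mean((torch.log1p(mag_pred) - torch.log1p(mag_target)) ** 2)
    return mse + alpha * spectral
```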
arXiv Detail & Related papers (2026-03-02T22:39:02Z) - Analyzing and Guiding Zero-Shot Posterior Sampling in Diffusion Models [28.599984631773093]
We propose a rigorous analysis of such approximate posterior samplers, relying on a Gaussianity assumption on the prior. We show that both the ideal posterior sampler and diffusion-based reconstruction algorithms can be expressed in closed form.
arXiv Detail & Related papers (2026-02-07T21:44:52Z) - SIGMA: Scalable Spectral Insights for LLM Collapse [51.863164847253366]
We introduce SIGMA (Spectral Inequalities for Gram Matrix Analysis), a unified framework for model collapse. By deriving deterministic bounds on the Gram matrix's spectrum, SIGMA provides a mathematically grounded metric to track the contraction of the representation space. We demonstrate that SIGMA effectively captures the transition towards collapsed states, offering theoretical insights into the mechanics of collapse.
arXiv Detail & Related papers (2026-01-06T19:47:11Z) - Consistent World Models via Foresight Diffusion [56.45012929930605]
We argue that a key bottleneck in learning consistent diffusion-based world models lies in suboptimal predictive ability. We propose Foresight Diffusion (ForeDiff), a diffusion-based world modeling framework that enhances consistency by decoupling condition understanding from target denoising.
arXiv Detail & Related papers (2025-05-22T10:01:59Z) - Can Diffusion Models Disentangle? A Theoretical Perspective [52.360881354319986]
This paper presents a novel theoretical framework for understanding how diffusion models can learn disentangled representations. We establish identifiability conditions for general disentangled latent variable models, analyze training dynamics, and derive sample complexity bounds for disentangled latent subspace models.
arXiv Detail & Related papers (2025-03-31T20:46:18Z) - Spatial Reasoning with Denoising Models [49.83744014336816]
We introduce a framework to perform reasoning over sets of continuous variables via denoising generative models.
We demonstrate, for the first time, that the order of generation can be successfully predicted by the denoising network itself.
arXiv Detail & Related papers (2025-02-28T14:08:30Z) - Generalized Diffusion Model with Adjusted Offset Noise [1.7767466724342067]
We propose a generalized diffusion model that naturally incorporates additional noise within a rigorous probabilistic framework. We derive a loss function based on the evidence lower bound, establishing its theoretical equivalence to offset noise with certain adjustments. Experiments on synthetic datasets demonstrate that our model effectively addresses brightness-related challenges and outperforms conventional methods in high-dimensional scenarios.
arXiv Detail & Related papers (2024-12-04T08:57:03Z) - Unified Convergence Analysis for Score-Based Diffusion Models with Deterministic Samplers [49.1574468325115]
We introduce a unified convergence analysis framework for deterministic samplers.
Our framework achieves an iteration complexity of $\tilde{O}(d^2/\epsilon)$.
We also provide a detailed analysis of Denoising Diffusion Implicit Models (DDIM)-type samplers.
arXiv Detail & Related papers (2024-10-18T07:37:36Z) - Diffusion Spectral Representation for Reinforcement Learning [17.701625371409644]
We propose to leverage the flexibility of diffusion models for reinforcement learning from a representation learning perspective.
By exploiting the connection between diffusion models and energy-based models, we develop Diffusion Spectral Representation (Diff-SR).
We show how Diff-SR facilitates efficient policy optimization and practical algorithms while explicitly bypassing the difficulty and inference cost of sampling from the diffusion model.
arXiv Detail & Related papers (2024-06-23T14:24:14Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Robust Estimation of Causal Heteroscedastic Noise Models [7.568978862189266]
Student's $t$-distribution is known for its robustness in accounting for sampling variability with smaller sample sizes and extreme values without significantly altering the overall distribution shape.
Our empirical evaluations demonstrate that our estimators are more robust and achieve better overall performance across synthetic and real benchmarks.
arXiv Detail & Related papers (2023-12-15T02:26:35Z) - Compositional Abilities Emerge Multiplicatively: Exploring Diffusion Models on a Synthetic Task [20.749514363389878]
We study compositional generalization in conditional diffusion models in a synthetic setting.
We find that the order in which the ability to generate samples emerges is governed by the structure of the underlying data-generating process.
Our study lays a foundation for understanding capabilities and compositionality in generative models from a data-centric perspective.
arXiv Detail & Related papers (2023-10-13T18:00:59Z) - U-Turn Diffusion [0.4527270266697462]
We focus on how ground-truth (GT) sample information is encoded in the Score Function (SF). We propose U-Turn diffusion, an augmentation of a pre-trained diffusion model. Experiments on the class-conditioned SF of the ImageNet dataset reveal a critical Memorization Time $T_m$.
arXiv Detail & Related papers (2023-08-14T19:21:28Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
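The denoiser-to-mean-shift connection is easy to see in the Gaussian-kernel case: for an empirical data distribution smoothed by noise of level $\sigma$, the MMSE denoiser $E[x_0 \mid x]$ is a softmax-weighted average of the data points, which is exactly one Gaussian mean-shift step. A minimal NumPy sketch of this identity on toy data (not the paper's experiments):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 2))        # toy 2-D dataset (hypothetical)

def posterior_mean(x, sigma):
    """For an empirical distribution smoothed by Gaussian noise of level sigma,
    E[x0 | x] is a softmax-weighted average of the data points -- i.e., one
    Gaussian-kernel mean-shift step toward a nearby mode."""
    d2 = np.sum((data - x) ** 2, axis=1)
    w = np.exp(-(d2 - d2.min()) / (2 * sigma**2))   # shift by min for stability
    return (w / w.sum()) @ data

x = 5.0 * rng.normal(size=2)                 # start far from the data
for sigma in np.geomspace(5.0, 0.05, 20):    # anneal the kernel bandwidth
    x = posterior_mean(x, sigma)             # repeated mean-shift = mode-seeking
print(x)  # x drifts toward a high-density region of `data`
```

The actual ODE sampler only moves part of the way toward $E[x_0 \mid x]$ at each step, but the mode-seeking character of the update is the same.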
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Generative structured normalizing flow Gaussian processes applied to spectroscopic data [4.0773490083614075]
In the physical sciences, limited training data may not adequately characterize future observed data.
It is critical that models adequately indicate uncertainty, particularly when they may be asked to extrapolate.
We demonstrate the methodology on laser-induced breakdown spectroscopy data from the ChemCam instrument onboard the Mars rover Curiosity.
arXiv Detail & Related papers (2022-12-14T23:57:46Z) - Trustworthiness of Laser-Induced Breakdown Spectroscopy Predictions via Simulation-based Synthetic Data Augmentation and Multitask Learning [4.633997895806144]
We consider quantitative analyses of spectral data using laser-induced breakdown spectroscopy.
We address the small size of training data available, and the validation of the predictions during inference on unknown data.
arXiv Detail & Related papers (2022-10-07T18:00:09Z) - Spectral Decomposition Representation for Reinforcement Learning [100.0424588013549]
We propose an alternative spectral method, Spectral Decomposition Representation (SPEDER), that extracts a state-action abstraction from the dynamics without inducing spurious dependence on the data collection policy.
A theoretical analysis establishes the sample efficiency of the proposed algorithm in both the online and offline settings.
An experimental investigation demonstrates superior performance over current state-of-the-art algorithms across several benchmarks.
arXiv Detail & Related papers (2022-08-19T19:01:30Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of stochasticity in its success is still unclear.
We show that heavy-tailed behavior commonly arises in the parameters as a consequence of multiplicative noise due to variance.
A detailed analysis describes how key factors, including step size and data, shape this behavior, with consistent results observed on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)