The Entropic Signature of Class Speciation in Diffusion Models
- URL: http://arxiv.org/abs/2602.09651v1
- Date: Tue, 10 Feb 2026 10:56:46 GMT
- Title: The Entropic Signature of Class Speciation in Diffusion Models
- Authors: Florian Handke, Dejan Stančević, Felix Koulischer, Thomas Demeester, Luca Ambrogioni
- Abstract summary: We show that tracking the class-conditional entropy of a latent semantic variable given a noisy state provides a reliable signature of transition regimes. We validate our method on EDM2-XS and Stable Diffusion 1.5, where class-conditional entropy consistently isolates the noise regimes critical for semantic structure formation.
- Score: 12.212582407125089
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion models do not recover semantic structure uniformly over time. Instead, samples transition from semantic ambiguity to class commitment within a narrow regime. Recent theoretical work attributes this transition to dynamical instabilities along class-separating directions, but practical methods to detect and exploit these windows in trained models are still limited. We show that tracking the class-conditional entropy of a latent semantic variable given the noisy state provides a reliable signature of these transition regimes. By restricting the entropy to semantic partitions, the entropy can furthermore resolve semantic decisions at different levels of abstraction. We analyze this behavior in high-dimensional Gaussian mixture models and show that the entropy rate concentrates on the same logarithmic time scale as the speciation symmetry-breaking instability previously identified in variance-preserving diffusion. We validate our method on EDM2-XS and Stable Diffusion 1.5, where class-conditional entropy consistently isolates the noise regimes critical for semantic structure formation. Finally, we use our framework to quantify how guidance redistributes semantic information over time. Together, these results connect information-theoretic and statistical physics perspectives on diffusion and provide a principled basis for time-localized control.
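The Gaussian mixture analysis in the abstract can be illustrated with a small numerical sketch. The snippet below is not the paper's code; it is a minimal Monte Carlo estimate of the class-conditional entropy H(c | x_t) for a hypothetical symmetric two-class Gaussian mixture under a variance-preserving forward process, where the function name, class means, and dimensionality are all illustrative assumptions. With identity class covariance, the noisy state at signal level a = sqrt(alpha_bar_t) is x_t ~ N(±a·mu, I), so the class posterior is a simple softmax over squared distances.

```python
import numpy as np

def class_conditional_entropy(mu, a, n_samples=20000, rng=None):
    """Monte Carlo estimate of H(c | x_t) in nats for a symmetric
    two-class Gaussian mixture under a variance-preserving forward
    process. Classes have means +mu and -mu with identity covariance,
    so at signal level a = sqrt(alpha_bar_t) the noisy state is
    x_t ~ N(+-a*mu, I): the covariance stays the identity.
    """
    rng = np.random.default_rng(rng)
    d = mu.shape[0]
    c = rng.integers(0, 2, n_samples)              # class labels, p = 1/2 each
    means = np.where(c[:, None] == 0, mu, -mu)     # per-sample mean +-mu
    x = a * means + rng.standard_normal((n_samples, d))
    # Log-likelihoods under each class (identity covariance => squared distance)
    ll0 = -0.5 * np.sum((x - a * mu) ** 2, axis=1)
    ll1 = -0.5 * np.sum((x + a * mu) ** 2, axis=1)
    m = np.maximum(ll0, ll1)                       # stabilize the softmax
    p0 = np.exp(ll0 - m) / (np.exp(ll0 - m) + np.exp(ll1 - m))
    p = np.clip(np.stack([p0, 1 - p0], axis=1), 1e-12, 1.0)
    return float(np.mean(-np.sum(p * np.log(p), axis=1)))

mu = np.full(16, 0.5)  # illustrative class separation in 16 dimensions
for a in [0.01, 0.1, 0.5, 1.0]:
    print(f"a = {a:4.2f}  H(c|x_t) ~ {class_conditional_entropy(mu, a, rng=0):.3f}")
```

At high noise (a near 0) the entropy sits at ln 2, the entropy of an undecided binary class choice, and it collapses toward 0 as the signal level grows; the narrow regime where it drops fastest is the kind of transition window the paper's signature is designed to isolate.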
Related papers
- Latent-Variable Learning of SPDEs via Wiener Chaos [2.0901018134712297]
We study the problem of learning the law of linear stochastic partial differential equations (SPDEs) with additive Gaussian forcing from observations. Our approach combines a spectral Galerkin projection with a truncated Wiener chaos expansion, yielding a separation between evolution and forcing domains. This reduces the infinite-dimensional SPDE to a finite system of parametrized ordinary differential equations governing latent temporal dynamics.
arXiv Detail & Related papers (2026-02-12T10:19:43Z) - Breaking the Bottlenecks: Scalable Diffusion Models for 3D Molecular Generation [0.0]
Diffusion models have emerged as a powerful class of generative models for molecular design. Their use remains constrained by long sampling trajectories, variance in the reverse process, and limited structural awareness in denoising dynamics. The Directly Denoising Diffusion Model mitigates these inefficiencies by replacing reverse MCMC updates with a deterministic denoising step.
arXiv Detail & Related papers (2026-01-13T20:09:44Z) - Foundations of Diffusion Models in General State Spaces: A Self-Contained Introduction [54.95522167029998]
This article is a self-contained primer on diffusion over general state spaces. We develop the discrete-time view (forward noising via Markov kernels and learned reverse dynamics) alongside its continuous-time limits. A common variational treatment yields the ELBO that underpins standard training losses.
arXiv Detail & Related papers (2025-12-04T18:55:36Z) - Diffusion Models: A Mathematical Introduction [3.8673630752805437]
We present a self-contained derivation of diffusion-based generative models. We construct denoising diffusion probabilistic models from first principles. Readers can both follow the theory and implement the corresponding algorithms in practice.
arXiv Detail & Related papers (2025-11-13T16:20:52Z) - A Free Probabilistic Framework for Denoising Diffusion Models: Entropy, Transport, and Reverse Processes [22.56299060022639]
This paper builds on Voiculescu's theory of free entropy and free Fisher information. We formulate diffusion and reverse processes governed by operator-valued dynamics. The resulting dynamics admit a gradient-flow structure in the noncommutative Wasserstein space.
arXiv Detail & Related papers (2025-10-26T18:03:54Z) - Continuously Augmented Discrete Diffusion model for Categorical Generative Modeling [87.34677262370924]
Standard discrete diffusion models treat all unobserved states identically by mapping them to an absorbing [MASK] token. This creates an 'information void' where semantic information that could be inferred from unmasked tokens is lost between denoising steps. We introduce Continuously Augmented Discrete Diffusion, a framework that augments the discrete state space with a paired diffusion in a continuous latent space.
arXiv Detail & Related papers (2025-10-01T18:00:56Z) - Measuring Semantic Information Production in Generative Diffusion Models [4.133545163830844]
It is well known that semantic and structural features emerge at different times during the reverse dynamics of diffusion. We introduce a general information-theoretic approach to measure when these class-semantic "decisions" are made during the generative process.
arXiv Detail & Related papers (2025-06-12T07:35:29Z) - Dynamical Diffusion: Learning Temporal Dynamics with Diffusion Models [71.63194926457119]
We introduce Dynamical Diffusion (DyDiff), a theoretically sound framework that incorporates temporally aware forward and reverse processes. Experiments across scientific spatiotemporal forecasting, video prediction, and time series forecasting demonstrate that Dynamical Diffusion consistently improves performance in temporal predictive tasks.
arXiv Detail & Related papers (2025-03-02T16:10:32Z) - Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instillation of task-specific information into the score function to steer the sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Learning Conditional Invariance through Cycle Consistency [60.85059977904014]
We propose a novel approach to identify meaningful and independent factors of variation in a dataset.
Our method involves two separate latent subspaces for the target property and the remaining input information.
We demonstrate on synthetic and molecular data that our approach identifies more meaningful factors which lead to sparser and more interpretable models.
arXiv Detail & Related papers (2021-11-25T17:33:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.