Structure Preserving Diffusion Models
- URL: http://arxiv.org/abs/2402.19369v1
- Date: Thu, 29 Feb 2024 17:16:20 GMT
- Title: Structure Preserving Diffusion Models
- Authors: Haoye Lu, Spencer Szabados, Yaoliang Yu
- Abstract summary: We introduce a family of diffusion processes for learning distributions that possess additional structure, such as group symmetries.
We exemplify these results by developing a collection of symmetry-equivariant diffusion models capable of learning distributions that are inherently symmetric.
We show how the proposed models can be used to achieve theoretically guaranteed equivariant image noise reduction without prior knowledge of the image orientation.
- Score: 21.774891092908945
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion models have become the leading distribution-learning method in
recent years. Herein, we introduce structure-preserving diffusion processes, a
family of diffusion processes for learning distributions that possess
additional structure, such as group symmetries, by developing theoretical
conditions under which the diffusion transition steps preserve said symmetry.
These conditions also enable equivariant data-sampling trajectories. We
exemplify the results by developing a collection of symmetry-equivariant
diffusion models capable of learning distributions that are inherently
symmetric. Empirical studies, over both synthetic and real-world datasets,
validate that the developed models adhere to the proposed theory and achieve
improved performance over existing methods in terms of sample quality. We also
show how the proposed models can be used to achieve theoretically guaranteed
equivariant image noise reduction without prior knowledge of the image
orientation.
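To make the symmetry-preservation idea concrete, here is a minimal sketch in PyTorch, assuming the symmetry group is the four 90-degree rotations (C4) and that `score_net` is a hypothetical pretrained score network; it illustrates the general group-averaging principle rather than the paper's exact construction.
```python
import torch

def symmetrized_score(score_net, x, t):
    """Average a score estimate over the C4 rotation group.

    If the target distribution is invariant under 90-degree rotations,
    averaging g^{-1} . score_net(g . x, t) over all g in the group yields
    a rotation-equivariant score estimate, which is the kind of property
    a structure-preserving transition step must satisfy.
    """
    total = torch.zeros_like(x)
    for k in range(4):  # the four elements of C4
        x_rot = torch.rot90(x, k, dims=(-2, -1))            # g . x
        s = score_net(x_rot, t)                             # score in the rotated frame
        total = total + torch.rot90(s, -k, dims=(-2, -1))   # map back: g^{-1} . s
    return total / 4.0
```
By construction, rotating the input by 90 degrees rotates the output identically, so sampling trajectories driven by this score are equivariant.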
Related papers
- Continuous Diffusion Model for Language Modeling [57.396578974401734]
Existing continuous diffusion models for discrete data have limited performance compared to discrete approaches.
We propose a continuous diffusion model for language modeling that incorporates the geometry of the underlying categorical distribution.
arXiv Detail & Related papers (2025-02-17T08:54:29Z)
- Symmetry-Preserving Diffusion Models via Target Symmetrization [43.83899968118655]
We propose a novel approach that enforces equivariance through a symmetrized loss function.
Our method uses Monte Carlo sampling to estimate the group average, incurring minimal computational overhead.
Experiments show improved sample quality compared to existing methods.
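A rough sketch of how such a Monte Carlo estimate might look (assuming the C4 rotation group and hypothetical `score_net` and `target_fn` callables; the paper's exact target symmetrization may differ):
```python
import torch

def mc_symmetrized_target(target_fn, xt, num_mc=2):
    """Monte Carlo estimate of a group-averaged regression target.

    The exact symmetrized target is (1/|G|) * sum_g g^{-1} . target_fn(g . x);
    drawing a few random group elements gives an unbiased estimate at a
    fraction of the cost when the group is large.
    """
    est = torch.zeros_like(xt)
    for _ in range(num_mc):
        k = int(torch.randint(0, 4, (1,)))                     # random g in C4
        x_g = torch.rot90(xt, k, dims=(-2, -1))                # g . x
        est = est + torch.rot90(target_fn(x_g), -k, dims=(-2, -1))  # g^{-1} . target
    return est / num_mc

def symmetrized_loss(score_net, target_fn, xt, t):
    """Regress the model onto the symmetrized target to encourage equivariance."""
    target = mc_symmetrized_target(target_fn, xt).detach()
    return ((score_net(xt, t) - target) ** 2).mean()
```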
arXiv Detail & Related papers (2025-02-14T03:26:57Z)
- Diffusion Models Learn Low-Dimensional Distributions via Subspace Clustering [15.326641037243006]
Diffusion models can effectively learn the image distribution and generate new samples.
We provide theoretical insights into this phenomenon by leveraging key empirical observations.
We show that the minimal number of samples required to learn the underlying distribution scales linearly with the intrinsic dimension.
arXiv Detail & Related papers (2024-09-04T04:14:02Z)
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
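For reference, the classifier-free guidance rule the paper analyzes combines conditional and unconditional score estimates; a sketch, with an assumed `score_net(x, t, cond)` signature where `cond=None` denotes the dropped condition:
```python
import torch

def cfg_score(score_net, x, t, cond, w=3.0):
    """Standard classifier-free guidance score combination.

    A single network trained with random condition dropout provides both
    estimates; the guided score extrapolates between them:
        s_w(x, t) = (1 + w) * s(x, t | cond) - w * s(x, t).
    Larger w strengthens adherence to the condition at the cost of diversity.
    """
    s_cond = score_net(x, t, cond)      # conditional score estimate
    s_uncond = score_net(x, t, None)    # unconditional estimate (condition dropped)
    return (1.0 + w) * s_cond - w * s_uncond
```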
arXiv Detail & Related papers (2024-03-18T17:08:24Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instilling task-specific information into the score function to steer sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
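As a worked example of this setting, the score of an isotropic Gaussian mixture has a closed form (a standard computation, not a result specific to this paper):
```latex
% For p(x) = \sum_k \pi_k \,\mathcal{N}(x;\mu_k,\sigma^2 I), the score is a
% responsibility-weighted pull toward the component means:
\[
  \nabla_x \log p(x) = \sum_k w_k(x)\,\frac{\mu_k - x}{\sigma^2},
  \qquad
  w_k(x) = \frac{\pi_k\,\mathcal{N}(x;\mu_k,\sigma^2 I)}
                {\sum_j \pi_j\,\mathcal{N}(x;\mu_j,\sigma^2 I)}.
\]
% Guidance perturbs this score, effectively reweighting the
% responsibilities w_k(x) toward the desired component.
```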
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
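A toy version of such a self-consuming loop, using kernel density estimation as in the paper's analysis (illustrative only; the constants and mixing fraction are arbitrary):
```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=1000)   # ground-truth data
train = real

for gen in range(5):
    kde = gaussian_kde(train)            # fit the current "model"
    synthetic = kde.resample(1000)[0]    # sample the next training set from it
    # Mixed-data training: retain a fraction of real data each round.
    # With mix = 0.0 the loop is fully self-consuming and the variance
    # inflates every generation, since each KDE adds its kernel bandwidth.
    mix = 0.5
    n_real = int(mix * len(real))
    train = np.concatenate([real[:n_real], synthetic[: len(real) - n_real]])
    print(f"generation {gen}: std = {train.std():.3f}")
```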
arXiv Detail & Related papers (2024-02-19T02:08:09Z)
- Convergence Analysis of Discrete Diffusion Model: Exact Implementation through Uniformization [17.535229185525353]
We introduce an algorithm leveraging the uniformization of continuous-time Markov chains, implementing transitions at random time points.
Our results align with state-of-the-art achievements for diffusion models in $\mathbb{R}^d$ and further underscore the advantages of discrete diffusion models in comparison to the $\mathbb{R}^d$ setting.
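The classical uniformization construction behind this algorithm can be sketched in a few lines (toy two-state chain; the paper's contribution is the convergence analysis, not this simulation routine itself):
```python
import numpy as np

def simulate_ctmc_uniformization(Q, x0, T, rng):
    """Exact simulation of a continuous-time Markov chain via uniformization.

    Pick lambda >= max_i |Q_ii| and set P = I + Q/lambda, a valid transition
    matrix. Applying P at the epochs of a Poisson(lambda * T) process
    reproduces the CTMC law exactly, with no time-discretization error.
    """
    lam = np.max(-np.diag(Q))                    # uniformization rate
    P = np.eye(Q.shape[0]) + Q / lam             # uniformized transition matrix
    x = x0
    for _ in range(rng.poisson(lam * T)):        # random number of jump epochs
        x = rng.choice(Q.shape[0], p=P[x])
    return x

rng = np.random.default_rng(0)
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])                      # generator: rows sum to zero
print(simulate_ctmc_uniformization(Q, x0=0, T=5.0, rng=rng))
```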
arXiv Detail & Related papers (2024-02-12T22:26:52Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Infinite-Dimensional Diffusion Models [4.342241136871849]
We formulate diffusion-based generative models in infinite dimensions and apply them to the generative modeling of functions.
We show that our formulations are well posed in the infinite-dimensional setting and provide dimension-independent distance bounds from the sample to the target measure.
We also develop guidelines for the design of infinite-dimensional diffusion models.
arXiv Detail & Related papers (2023-02-20T18:00:38Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image deraining tasks.
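A minimal sketch of the renoise-then-denoise idea, with a hypothetical `uncond_denoiser`; the paper's exact update rule may differ:
```python
import torch

def refine_with_prior(x_pred, uncond_denoiser, sigma=0.1):
    """Pull a conditional model's prediction toward the data manifold.

    Lightly renoising the prediction and then denoising it with a
    pretrained unconditional model uses the unconditional prior as a
    restoration step, suppressing off-manifold artifacts.
    """
    noisy = x_pred + sigma * torch.randn_like(x_pred)   # renoise slightly
    return uncond_denoiser(noisy, sigma)                # denoise with the prior
```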
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- An optimal control perspective on diffusion-based generative modeling [9.806130366152194]
We establish a connection between optimal control and generative models based on stochastic differential equations (SDEs).
In particular, we derive a Hamilton-Jacobi-Bellman equation that governs the evolution of the log-densities of the underlying SDE marginals.
We develop a novel diffusion-based method for sampling from unnormalized densities.
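The log-density evolution referred to here follows from the Fokker-Planck equation via the substitution V = log p; a sketch of the computation (sign and scaling conventions may differ from the paper's):
```latex
% For dX_t = f(X_t, t)\,dt + \sigma\,dW_t, the marginal density p_t obeys
% \partial_t p = -\nabla\!\cdot(f p) + \tfrac{\sigma^2}{2}\Delta p.
% Substituting V = \log p_t gives
\[
  \partial_t V = -\nabla \cdot f - f \cdot \nabla V
    + \frac{\sigma^2}{2}\left(\Delta V + \lVert \nabla V \rVert^2\right),
\]
% where the quadratic term \lVert \nabla V \rVert^2 is the HJB-type
% nonlinearity that connects sampling to stochastic optimal control.
```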
arXiv Detail & Related papers (2022-11-02T17:59:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.