Hierarchically branched diffusion models leverage dataset structure for
class-conditional generation
- URL: http://arxiv.org/abs/2212.10777v4
- Date: Thu, 1 Feb 2024 23:34:04 GMT
- Title: Hierarchically branched diffusion models leverage dataset structure for
class-conditional generation
- Authors: Alex M. Tseng, Max Shen, Tommaso Biancalani, Gabriele Scalia
- Abstract summary: Branched diffusion models rely on the same diffusion process as traditional models, but learn reverse diffusion separately for each branch of a hierarchy.
We extensively evaluate branched diffusion models on several benchmark and large real-world scientific datasets spanning many data modalities.
- Score: 0.6800113478497425
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Class-labeled datasets, particularly those common in scientific domains, are
rife with internal structure, yet current class-conditional diffusion models
ignore these relationships and implicitly diffuse on all classes in a flat
fashion. To leverage this structure, we propose hierarchically branched
diffusion models as a novel framework for class-conditional generation.
Branched diffusion models rely on the same diffusion process as traditional
models, but learn reverse diffusion separately for each branch of a hierarchy.
We highlight several advantages of branched diffusion models over the current
state-of-the-art methods for class-conditional diffusion, including extension
to novel classes in a continual-learning setting, a more sophisticated form of
analogy-based conditional generation (i.e., transmutation), and novel
interpretability of the generation process. We extensively evaluate branched
diffusion models on several benchmark and large real-world scientific datasets
spanning many data modalities.
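The abstract's central idea — one shared forward diffusion process, with reverse diffusion learned separately per branch of a class hierarchy — can be illustrated with a toy routing sketch. This is a minimal, hypothetical illustration only: the `Branch` structure, the time intervals, and the `route_model` helper are assumptions for exposition, not the paper's actual interface.

```python
# Toy sketch of hierarchically branched reverse diffusion (illustrative only;
# Branch and route_model are hypothetical names, not the paper's API).
from dataclasses import dataclass

@dataclass
class Branch:
    """One branch of the class hierarchy, owning reverse diffusion on [t_lo, t_hi)."""
    name: str
    t_lo: float          # branch applies for diffusion times t_lo <= t < t_hi
    t_hi: float
    classes: frozenset   # classes whose reverse trajectory passes through it

def route_model(branches, label, t):
    """Pick the branch whose reverse model denoises class `label` at time t.

    Near t = 1 (pure noise) all classes share a single model; as t -> 0 the
    hierarchy splits and each class is handled by an increasingly specific one.
    """
    for b in branches:
        if label in b.classes and b.t_lo <= t < b.t_hi:
            return b
    raise ValueError(f"no branch covers class {label!r} at t={t}")

# A two-level hierarchy over four classes: {A, B} vs. {C, D} split at t = 0.6,
# then each class gets its own leaf branch below t = 0.3.
branches = [
    Branch("root",   0.6, 1.0, frozenset("ABCD")),
    Branch("left",   0.3, 0.6, frozenset("AB")),
    Branch("right",  0.3, 0.6, frozenset("CD")),
    Branch("leaf-A", 0.0, 0.3, frozenset("A")),
    Branch("leaf-B", 0.0, 0.3, frozenset("B")),
    Branch("leaf-C", 0.0, 0.3, frozenset("C")),
    Branch("leaf-D", 0.0, 0.3, frozenset("D")),
]

# Reverse trajectory for class "C": shared model first, then branch-specific ones.
path = [route_model(branches, "C", t).name for t in (0.9, 0.5, 0.1)]
print(path)  # ['root', 'right', 'leaf-C']
```

Under this framing, the continual-learning advantage claimed in the abstract corresponds to adding new leaf branches for a novel class while leaving the shared upstream branches untouched.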
Related papers
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite their empirical success, the theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z)
- A Phase Transition in Diffusion Models Reveals the Hierarchical Nature of Data [55.748186000425996]
Recent advancements show that diffusion models can generate high-quality images.
We study this phenomenon in a hierarchical generative model of data.
Our analysis characterises the relationship between time and scale in diffusion models.
arXiv Detail & Related papers (2024-02-26T19:52:33Z)
- Training Class-Imbalanced Diffusion Model Via Overlap Optimization [55.96820607533968]
Diffusion models trained on real-world datasets often yield inferior fidelity for tail classes.
Deep generative models, including diffusion models, are biased towards classes with abundant training images.
We propose a method based on contrastive learning to minimize the overlap between distributions of synthetic images for different classes.
arXiv Detail & Related papers (2024-02-16T16:47:21Z)
- Navigating the Structured What-If Spaces: Counterfactual Generation via Structured Diffusion [20.20945739504847]
We introduce the Structured Counterfactual Diffuser (SCD), the first plug-and-play framework leveraging diffusion for generating counterfactual explanations in structured data.
Our experiments show that our counterfactuals not only exhibit high plausibility compared to the existing state of the art but also show significantly better proximity and diversity.
arXiv Detail & Related papers (2023-12-21T07:05:21Z)
- Guided Diffusion from Self-Supervised Diffusion Features [49.78673164423208]
Guidance serves as a key concept in diffusion models, yet its effectiveness is often limited by the need for extra data annotation or pretraining.
We propose a framework to extract guidance from, and specifically for, diffusion models.
arXiv Detail & Related papers (2023-12-14T11:19:11Z)
- The Emergence of Reproducibility and Generalizability in Diffusion Models [10.188731323681575]
Given the same starting noise input and a deterministic sampler, different diffusion models often yield remarkably similar outputs.
We show that diffusion models are learning distinct distributions affected by the training data size.
This valuable property generalizes to many variants of diffusion models, including those for conditional use, solving inverse problems, and model fine-tuning.
arXiv Detail & Related papers (2023-10-08T19:02:46Z)
- Renormalizing Diffusion Models [0.7252027234425334]
We use diffusion models to learn inverse renormalization group flows of statistical and quantum field theories.
Our work provides an interpretation of multiscale diffusion models and gives physically inspired suggestions for diffusion models which should have novel properties.
arXiv Detail & Related papers (2023-08-23T18:02:31Z)
- A Survey of Diffusion Models in Natural Language Processing [11.233768932957771]
Diffusion models capture the diffusion of information or signals across a network or manifold.
This paper discusses the different formulations of diffusion models used in NLP, their strengths and limitations, and their applications.
arXiv Detail & Related papers (2023-05-24T03:25:32Z)
- Diffusion Models in Vision: A Survey [80.82832715884597]
A diffusion model is a deep generative model based on two stages: a forward diffusion stage and a reverse diffusion stage.
Diffusion models are widely appreciated for the quality and diversity of the generated samples, despite their known computational burdens.
arXiv Detail & Related papers (2022-09-10T22:00:30Z)
- A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.