Renormalizing Diffusion Models
- URL: http://arxiv.org/abs/2308.12355v2
- Date: Tue, 5 Sep 2023 20:50:26 GMT
- Title: Renormalizing Diffusion Models
- Authors: Jordan Cotler, Semon Rezchikov
- Abstract summary: We use diffusion models to learn inverse renormalization group flows of statistical and quantum field theories.
Our work provides an interpretation of multiscale diffusion models, and gives physically-inspired suggestions for diffusion models which should have novel properties.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We explain how to use diffusion models to learn inverse renormalization group
flows of statistical and quantum field theories. Diffusion models are a class
of machine learning models which have been used to generate samples from
complex distributions, such as the distribution of natural images. These models
achieve sample generation by learning the inverse process to a diffusion
process which adds noise to the data until the distribution of the data is pure
noise. Nonperturbative renormalization group schemes in physics can naturally
be written as diffusion processes in the space of fields. We combine these
observations in a concrete framework for building ML-based models for studying
field theories, in which the models learn the inverse process to an
explicitly-specified renormalization group scheme. We detail how these models
define a class of adaptive bridge (or parallel tempering) samplers for lattice
field theory. Because renormalization group schemes have a physical meaning, we
provide explicit prescriptions for how to compare results derived from models
associated to several different renormalization group schemes of interest. We
also explain how to use diffusion models in a variational method to find ground
states of quantum systems. We apply some of our methods to numerically find RG
flows of interacting statistical field theories. From the perspective of
machine learning, our work provides an interpretation of multiscale diffusion
models, and gives physically-inspired suggestions for diffusion models which
should have novel properties.
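To make the framework concrete, here is a minimal sketch, in PyTorch, of the kind of model the abstract describes: a noise-prediction network trained to invert an explicitly specified smoothing-plus-noise step on a lattice scalar field. The free-field target, the `rg_step` smoothing kernel, and all training details are illustrative assumptions, not the authors' implementation.

```python
# Minimal, illustrative sketch (not the authors' code): train a network to
# invert an explicitly specified coarse-graining + noising scheme on a
# 1D lattice scalar field. The Gaussian free field stands in for the theory.
import torch
import torch.nn as nn

N, T = 32, 50  # lattice sites, forward (RG-like) steps

def sample_free_field(batch, m=1.0):
    """Draw samples of a 1D Gaussian free field with mass m on N sites."""
    k = 2 * torch.pi * torch.arange(N // 2 + 1) / N
    power = 1.0 / (4 * torch.sin(k / 2) ** 2 + m ** 2)  # lattice propagator
    noise = torch.randn(batch, N // 2 + 1, dtype=torch.cfloat)
    return torch.fft.irfft(noise * power.sqrt(), n=N) * N ** 0.5

def rg_step(x):
    """One explicit coarse-graining step: local smoothing that damps UV modes."""
    return 0.5 * x + 0.25 * (torch.roll(x, 1, -1) + torch.roll(x, -1, -1))

def forward_process(x0, t, sigma=0.1):
    """Apply t smoothing steps, then add Gaussian noise of matching scale."""
    x = x0
    for _ in range(t):
        x = rg_step(x)
    eps = torch.randn_like(x)
    return x + sigma * t ** 0.5 * eps, eps

model = nn.Sequential(nn.Linear(N + 1, 128), nn.SiLU(), nn.Linear(128, N))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):  # toy training loop
    x0 = sample_free_field(64)
    t = torch.randint(1, T + 1, (1,)).item()
    xt, eps = forward_process(x0, t)
    t_feat = torch.full((xt.shape[0], 1), t / T)
    pred = model(torch.cat([xt, t_feat], dim=-1))  # predict the added noise
    loss = ((pred - eps) ** 2).mean()              # denoising objective
    opt.zero_grad(); loss.backward(); opt.step()
```

Sampling would then run the learned inverse step by step, from heavily coarse-grained noise back toward the fine-grained theory.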
Related papers
- Constrained Diffusion Models via Dual Training [80.03953599062365]
Diffusion processes are prone to generating samples that reflect biases in a training dataset.
We develop constrained diffusion models by imposing diffusion constraints based on desired distributions.
We show that our constrained diffusion models generate new data from a mixture data distribution that achieves the optimal trade-off between the objective and the constraints.
arXiv Detail & Related papers (2024-08-27T14:25:42Z)
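A generic primal-dual (Lagrangian) loop of the kind the dual-training summary above suggests, with toy stand-in losses; the actual objective, constraints, and update rules in the paper may differ.

```python
# Generic primal-dual sketch: descend on the model parameters, ascend on the
# dual variable enforcing the constraint. The losses are toy placeholders.
import torch

theta = torch.randn(10, requires_grad=True)   # model parameters (toy)
lam = torch.zeros(1)                          # dual variable (multiplier)
opt = torch.optim.Adam([theta], lr=1e-2)

def diffusion_loss(p):      # placeholder for the usual denoising objective
    return (p ** 2).mean()

def constraint(p):          # placeholder: some statistic kept below a level
    return p.abs().mean() - 0.5

for step in range(500):
    f, g = diffusion_loss(theta), constraint(theta)
    lagrangian = f + lam.detach() * g                 # primal step on theta
    opt.zero_grad(); lagrangian.backward(); opt.step()
    lam = (lam + 0.05 * g.detach()).clamp(min=0.0)    # dual ascent on lambda
```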
- An Overview of Diffusion Models: Applications, Guided Generation, Statistical Rates and Optimization [59.63880337156392]
Diffusion models have achieved tremendous success in computer vision, audio, reinforcement learning, and computational biology.
Despite the significant empirical success, the theory of diffusion models is very limited.
This paper provides a well-rounded theoretical exposition to stimulate forward-looking theories and methods for diffusion models.
arXiv Detail & Related papers (2024-04-11T14:07:25Z)
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, the theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instilling task-specific information into the score function to steer sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
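For reference, the standard classifier-free guidance combination analyzed in work like the entry above fits in a few lines; `denoiser` here is a hypothetical noise-prediction network with an optional conditioning input.

```python
# Classifier-free guidance: blend conditional and unconditional noise
# predictions with a guidance weight w.
import torch

def guided_eps(denoiser, x_t, t, cond, w: float):
    """Return the guided noise estimate at step t."""
    eps_uncond = denoiser(x_t, t, cond=None)
    eps_cond = denoiser(x_t, t, cond=cond)
    # w = 0 recovers the unconditional model; w = 1 the conditional one;
    # w > 1 extrapolates toward the condition (stronger guidance).
    return eps_uncond + w * (eps_cond - eps_uncond)
```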
- Lecture Notes in Probabilistic Diffusion Models [0.5361320134021585]
Diffusion models are loosely modelled on non-equilibrium thermodynamics.
The diffusion model learns the data manifold to which the original and thus the reconstructed data samples belong.
Unlike variational autoencoders and flow models, diffusion models have latent variables with the same dimensionality as the original data.
arXiv Detail & Related papers (2023-12-16T09:36:54Z)
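A small illustration of the dimensionality point above, assuming the standard DDPM forward marginal; the schedule values are common defaults, not taken from the lecture notes.

```python
# DDPM forward marginal: x_t = sqrt(alpha_bar_t) x_0 + sqrt(1 - alpha_bar_t) eps.
# Note the latent x_t has exactly the same shape as the data x_0.
import torch

betas = torch.linspace(1e-4, 0.02, 1000)          # common linear schedule
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

def q_sample(x0, t):
    eps = torch.randn_like(x0)                    # noise, same shape as x0
    return alpha_bar[t].sqrt() * x0 + (1 - alpha_bar[t]).sqrt() * eps

x0 = torch.randn(8, 3, 32, 32)                    # a batch of "images"
x_t = q_sample(x0, t=500)
assert x_t.shape == x0.shape                      # latent matches data dims
```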
- The Emergence of Reproducibility and Generalizability in Diffusion Models [10.188731323681575]
Given the same starting noise input and a deterministic sampler, different diffusion models often yield remarkably similar outputs.
We show that diffusion models learn distinct distributions depending on the training data size.
This valuable property generalizes to many variants of diffusion models, including those for conditional use, solving inverse problems, and model fine-tuning.
arXiv Detail & Related papers (2023-10-08T19:02:46Z)
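A sketch of how the observation above could be checked, assuming two trained denoisers and a deterministic sampler such as DDIM with eta = 0; all names here are hypothetical stand-ins.

```python
# Run two independently trained models from the *same* starting noise with a
# deterministic sampler and compare outputs; a small difference indicates the
# reproducibility phenomenon described above.
import torch

def compare_models(model_a, model_b, ddim_sample, shape, seed=0):
    torch.manual_seed(seed)
    z = torch.randn(shape)                 # shared starting noise
    out_a = ddim_sample(model_a, z)        # deterministic: no fresh noise inside
    out_b = ddim_sample(model_b, z)
    return (out_a - out_b).abs().mean()    # small value = similar outputs
```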
- Infinite-Dimensional Diffusion Models [4.342241136871849]
We formulate diffusion-based generative models in infinite dimensions and apply them to the generative modeling of functions.
We show that our formulations are well posed in the infinite-dimensional setting and provide dimension-independent distance bounds from the sample to the target measure.
We also develop guidelines for the design of infinite-dimensional diffusion models.
arXiv Detail & Related papers (2023-02-20T18:00:38Z)
- Information-Theoretic Diffusion [18.356162596599436]
Denoising diffusion models have spurred significant gains in density modeling and image generation.
We introduce a new mathematical foundation for diffusion models inspired by classic results in information theory.
arXiv Detail & Related papers (2023-02-07T23:03:07Z)
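The summary above does not name the classic results it builds on; one canonical candidate (my assumption) is the Guo-Shamai-Verdu I-MMSE relation, which links mutual information across a Gaussian channel to the optimal denoising error:

```latex
% I-MMSE relation: for Y = \sqrt{\gamma} X + N with N standard Gaussian noise
% independent of X,
\frac{\mathrm{d}}{\mathrm{d}\gamma}\, I\bigl(X;\sqrt{\gamma}\,X + N\bigr)
  = \frac{1}{2}\,\operatorname{mmse}(\gamma),
\qquad
\operatorname{mmse}(\gamma)
  = \mathbb{E}\Bigl[\bigl\| X - \mathbb{E}[X \mid \sqrt{\gamma}\,X + N] \bigr\|^{2}\Bigr].
```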
- Unifying Diffusion Models' Latent Space, with Applications to CycleDiffusion and Guidance [95.12230117950232]
We show that a common latent space emerges from two diffusion models trained independently on related domains.
Applying CycleDiffusion to text-to-image diffusion models, we show that large-scale text-to-image diffusion models can be used as zero-shot image-to-image editors.
arXiv Detail & Related papers (2022-10-11T15:53:52Z)
- Diffusion Models in Vision: A Survey [80.82832715884597]
A diffusion model is a deep generative model that is based on two stages, a forward diffusion stage and a reverse diffusion stage.
Diffusion models are widely appreciated for the quality and diversity of the generated samples, despite their known computational burdens.
arXiv Detail & Related papers (2022-09-10T22:00:30Z)
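A minimal sketch of the two stages named in the survey summary above: the reverse stage below applies the standard DDPM ancestral update with a learned noise predictor `eps_model` (hypothetical); the forward stage is the noising marginal sketched earlier.

```python
# Reverse (generation) stage of a DDPM: start from pure noise and apply the
# ancestral update x_{t-1} = (x_t - beta_t/sqrt(1-alpha_bar_t) * eps) / sqrt(alpha_t)
# plus fresh noise with variance beta_t (a common choice of sigma_t).
import torch

betas = torch.linspace(1e-4, 0.02, 1000)
alphas = 1.0 - betas
alpha_bar = torch.cumprod(alphas, dim=0)

@torch.no_grad()
def reverse_sample(eps_model, shape):
    x = torch.randn(shape)                               # start from pure noise
    for t in reversed(range(len(betas))):
        eps = eps_model(x, t)
        mean = (x - betas[t] / (1 - alpha_bar[t]).sqrt() * eps) / alphas[t].sqrt()
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = mean + betas[t].sqrt() * noise               # one reverse step
    return x
```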