Score-Based Generative Modeling through Stochastic Differential
Equations
- URL: http://arxiv.org/abs/2011.13456v2
- Date: Wed, 10 Feb 2021 18:17:04 GMT
- Title: Score-Based Generative Modeling through Stochastic Differential
Equations
- Authors: Yang Song, Jascha Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar,
Stefano Ermon and Ben Poole
- Abstract summary: We present a stochastic differential equation (SDE) that transforms a complex data distribution into a known prior distribution by injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
- Score: 114.39209003111723
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Creating noise from data is easy; creating data from noise is generative
modeling. We present a stochastic differential equation (SDE) that smoothly
transforms a complex data distribution to a known prior distribution by slowly
injecting noise, and a corresponding reverse-time SDE that transforms the prior
distribution back into the data distribution by slowly removing the noise.
Crucially, the reverse-time SDE depends only on the time-dependent gradient
field (a.k.a., the score) of the perturbed data distribution. By leveraging advances
in score-based generative modeling, we can accurately estimate these scores
with neural networks, and use numerical SDE solvers to generate samples. We
show that this framework encapsulates previous approaches in score-based
generative modeling and diffusion probabilistic modeling, allowing for new
sampling procedures and new modeling capabilities. In particular, we introduce
a predictor-corrector framework to correct errors in the evolution of the
discretized reverse-time SDE. We also derive an equivalent neural ODE that
samples from the same distribution as the SDE, but additionally enables exact
likelihood computation and improved sampling efficiency. In addition, we
provide a new way to solve inverse problems with score-based models, as
demonstrated with experiments on class-conditional generation, image
inpainting, and colorization. Combined with multiple architectural
improvements, we achieve record-breaking performance for unconditional image
generation on CIFAR-10 with an Inception score of 9.89 and FID of 2.20, a
competitive likelihood of 2.99 bits/dim, and demonstrate high fidelity
generation of 1024 x 1024 images for the first time from a score-based
generative model.
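
For reference, the SDE pair the abstract describes can be written compactly. The forward SDE noises data toward the prior; Anderson's reverse-time SDE runs it backward and, as claimed above, depends on the data distribution only through the score:

```latex
% Forward (noising) SDE: data -> prior
\mathrm{d}\mathbf{x} = \mathbf{f}(\mathbf{x}, t)\,\mathrm{d}t + g(t)\,\mathrm{d}\mathbf{w}

% Reverse-time (denoising) SDE: prior -> data, run with time flowing backward
\mathrm{d}\mathbf{x} = \big[\mathbf{f}(\mathbf{x}, t) - g(t)^2 \nabla_{\mathbf{x}} \log p_t(\mathbf{x})\big]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{\mathbf{w}}

% Probability-flow ODE: deterministic process with the same marginals p_t
\mathrm{d}\mathbf{x} = \big[\mathbf{f}(\mathbf{x}, t) - \tfrac{1}{2} g(t)^2 \nabla_{\mathbf{x}} \log p_t(\mathbf{x})\big]\,\mathrm{d}t
```

Replacing the true score with a learned network s_theta(x, t) in either expression yields the generative model.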
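Below is a minimal sketch of the predictor-corrector sampler for the variance-exploding (VE) instantiation of the reverse SDE, assuming a trained score network is available as `score_fn`; the geometric schedule and the hyperparameters (`sigma_min`, `sigma_max`, `snr`) are illustrative defaults, not the paper's exact configuration.

```python
import numpy as np

def pc_sampler(score_fn, shape, sigma_min=0.01, sigma_max=50.0,
               n_steps=1000, snr=0.16, rng=None):
    """Predictor-corrector sampling for the VE SDE, whose perturbation
    kernel is N(x_0, sigma(t)^2 I) with sigma(t) geometric in t.

    score_fn(x, sigma) should approximate grad_x log p_sigma(x);
    here it is a placeholder for a trained score network.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Geometric noise schedule decreasing from sigma_max to sigma_min.
    sigmas = sigma_max * (sigma_min / sigma_max) ** np.linspace(0.0, 1.0, n_steps)
    # Start from the prior, approximately N(0, sigma_max^2 I).
    x = sigma_max * rng.standard_normal(shape)

    for i in range(n_steps - 1):
        sigma, sigma_next = sigmas[i], sigmas[i + 1]

        # --- Predictor: Euler-Maruyama step of the reverse-time VE SDE ---
        # Over one step, g(t)^2 * |dt| integrates to sigma^2 - sigma_next^2.
        g2_dt = sigma ** 2 - sigma_next ** 2
        x = x + g2_dt * score_fn(x, sigma)
        x = x + np.sqrt(g2_dt) * rng.standard_normal(shape)

        # --- Corrector: one Langevin MCMC step at the new noise level ---
        grad = score_fn(x, sigma_next)
        noise = rng.standard_normal(shape)
        # Step size chosen so the move has a target signal-to-noise ratio.
        eps = 2.0 * (snr * np.linalg.norm(noise) /
                     (np.linalg.norm(grad) + 1e-12)) ** 2
        x = x + eps * grad + np.sqrt(2.0 * eps) * noise

    return x
```

With a trained score network, something like `samples = pc_sampler(model, (16, 3, 32, 32))` would draw a batch of CIFAR-10-sized images; each iteration takes one Euler-Maruyama predictor step and one Langevin corrector step, the latter correcting discretization error as the abstract describes.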
Related papers
- Score-based Generative Models with Adaptive Momentum [40.84399531998246]
We propose an adaptive momentum sampling method to accelerate the transformation process.
We show that our method can produce more faithful images/graphs in fewer sampling steps, with a 2 to 5 times speedup.
arXiv Detail & Related papers (2024-05-22T15:20:27Z)
- Noise in the reverse process improves the approximation capabilities of diffusion models [27.65800389807353]
In score-based generative models (SGMs), the state of the art in generative modeling, stochastic reverse processes are known to perform better than their deterministic counterparts.
This paper delves into the heart of this phenomenon, comparing neural ordinary differential equations (ODEs) and neural stochastic differential equations (SDEs) as reverse processes.
We analyze the ability of neural SDEs to approximate trajectories of the Fokker-Planck equation, revealing the advantages of stochasticity. (A minimal sketch of the deterministic probability-flow sampler, for contrast with the stochastic one above, appears after this list.)
arXiv Detail & Related papers (2023-12-13T02:39:10Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Score-based diffusion models for accelerated MRI [35.3148116010546]
We introduce a way to sample data from a conditional distribution given the measurements, such that the model can be readily used for solving inverse problems in imaging.
Our model requires only magnitude images for training, yet it can reconstruct complex-valued data and even extends to parallel imaging.
arXiv Detail & Related papers (2021-10-08T08:42:03Z)
- A Variational Perspective on Diffusion-Based Generative Models and Score Matching [8.93483643820767]
We derive a variational framework for likelihood estimation for continuous-time generative diffusion.
We show that minimizing the score-matching loss is equivalent to maximizing a lower bound on the likelihood of the plug-in reverse SDE. (The score-matching objective in question is written out after this list.)
arXiv Detail & Related papers (2021-06-05T05:50:36Z)
- Denoising Diffusion Probabilistic Models [91.94962645056896]
We present high quality image synthesis results using diffusion probabilistic models.
Our best results are obtained by training on a weighted variational bound designed according to a novel connection between diffusion probabilistic models and denoising score matching with Langevin dynamics.
arXiv Detail & Related papers (2020-06-19T17:24:44Z)
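
As referenced in the "Noise in the reverse process..." entry above, here is a minimal sketch of the deterministic alternative: Euler integration of the probability-flow ODE under the same VE schedule. `score_fn` and the schedule are the same illustrative assumptions as in the predictor-corrector sketch earlier.

```python
import numpy as np

def probability_flow_ode_sampler(score_fn, shape, sigma_min=0.01,
                                 sigma_max=50.0, n_steps=1000, rng=None):
    """Deterministic sampling via the probability-flow ODE of the VE SDE:
    dx = -0.5 * g(t)^2 * score(x, sigma(t)) dt.
    Shares its marginals p_t with the reverse SDE, but injects no noise
    after the initial prior draw."""
    rng = np.random.default_rng() if rng is None else rng
    sigmas = sigma_max * (sigma_min / sigma_max) ** np.linspace(0.0, 1.0, n_steps)
    x = sigma_max * rng.standard_normal(shape)  # the only source of randomness

    for i in range(n_steps - 1):
        g2_dt = sigmas[i] ** 2 - sigmas[i + 1] ** 2   # g(t)^2 * |dt| per step
        x = x + 0.5 * g2_dt * score_fn(x, sigmas[i])  # note the factor 1/2
    return x
```

The only randomness here is the prior draw; that determinism is what enables exact likelihood computation via the instantaneous change-of-variables formula, while the papers above study how the extra noise in the SDE sampler helps correct accumulated score-approximation error.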
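And, as referenced in the "A Variational Perspective..." entry, this is the denoising score-matching objective that both that paper and DDPM connect to the variational bound; lambda(t) is a positive weighting and p_0t(x_t | x_0) is the Gaussian perturbation kernel of the forward SDE:

```latex
\mathcal{L}(\theta) = \mathbb{E}_{t}\, \mathbb{E}_{\mathbf{x}_0 \sim p_{\mathrm{data}}}\,
\mathbb{E}_{\mathbf{x}_t \sim p_{0t}(\mathbf{x}_t \mid \mathbf{x}_0)}
\Big[ \lambda(t)\, \big\| s_\theta(\mathbf{x}_t, t)
      - \nabla_{\mathbf{x}_t} \log p_{0t}(\mathbf{x}_t \mid \mathbf{x}_0) \big\|_2^2 \Big]
```

For Gaussian kernels the target score is available in closed form, e.g. (x_0 - x_t) / sigma(t)^2 for the VE SDE, which is what makes the loss tractable; choosing the weighting lambda(t) appropriately recovers the weighted variational bound used to train DDPMs.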
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.