Fast Diffusion Probabilistic Model Sampling through the lens of Backward
Error Analysis
- URL: http://arxiv.org/abs/2304.11446v1
- Date: Sat, 22 Apr 2023 16:58:47 GMT
- Title: Fast Diffusion Probabilistic Model Sampling through the lens of Backward
Error Analysis
- Authors: Yansong Gao, Zhihong Pan, Xin Zhou, Le Kang, Pratik Chaudhari
- Abstract summary: Denoising diffusion probabilistic models (DDPMs) are a class of powerful generative models.
DDPMs generally need hundreds or thousands of sequential function evaluations (steps) of neural networks to generate a sample.
This paper aims to develop a fast sampling method for DDPMs requiring far fewer steps while retaining high sample quality.
- Score: 26.907301901503835
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Denoising diffusion probabilistic models (DDPMs) are a class of powerful
generative models. The past few years have witnessed the great success of DDPMs
in generating high-fidelity samples. A significant limitation of the DDPMs is
the slow sampling procedure. DDPMs generally need hundreds or thousands of
sequential function evaluations (steps) of neural networks to generate a
sample. This paper aims to develop a fast sampling method for DDPMs that
requires far fewer steps while retaining high sample quality. The inference process of
DDPMs approximates solving the corresponding diffusion ordinary differential
equations (diffusion ODEs) in the continuous limit. This work analyzes how the
backward error affects the diffusion ODEs and the sample quality in DDPMs. We
propose fast sampling through the \textbf{Restricting Backward Error schedule
(RBE schedule)} based on dynamically moderating the long-time backward error.
Our method accelerates DDPMs without any further training. Our experiments show
that sampling with an RBE schedule generates high-quality samples within only 8
to 20 function evaluations on various benchmark datasets. We achieved 12.01 FID
in 8 function evaluations on ImageNet $128\times128$, and a $20\times$
speedup compared with previous baseline samplers.
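
To make the idea concrete, the sketch below builds a step-size schedule by greedily taking the largest step whose estimated backward error stays under a tolerance. This is a hypothetical toy, not the paper's actual RBE schedule: `error_proxy` and the halving search are assumptions standing in for the paper's backward-error estimate.

```python
# Hypothetical sketch of a backward-error-limited timestep schedule for a
# diffusion-ODE sampler. The paper's actual RBE schedule is not reproduced
# here; `error_proxy` is an assumed toy stand-in for a per-step backward-error
# estimate, and the greedy halving search is purely illustrative.

def build_rbe_like_schedule(t_max=1.0, t_min=1e-3, tol=0.5, error_proxy=None):
    """Greedily take the largest step from t toward t_min whose estimated
    backward error stays below `tol`; returns a decreasing list of times."""
    if error_proxy is None:
        # Toy proxy: error grows with the step size h relative to where the
        # step lands, loosely mimicking the stiffness of diffusion ODEs near t=0.
        error_proxy = lambda t, h: h / (t - h + 1e-8)
    schedule, t = [t_max], t_max
    while t > t_min:
        h = t - t_min                        # first try jumping all the way
        while error_proxy(t, h) > tol and h > 1e-6:
            h *= 0.5                         # shrink until the error is admissible
        t = max(t - h, t_min)
        schedule.append(t)
    return schedule

if __name__ == "__main__":
    ts = build_rbe_like_schedule()
    print(f"{len(ts) - 1} steps:", [round(t, 4) for t in ts])
```

With this toy proxy the schedule takes large steps at large t and progressively smaller ones near t = 0, which is the qualitative behavior the abstract describes (moderating the long-time backward error); the real method derives the error estimate from the diffusion ODE rather than a heuristic.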
Related papers
- UDPM: Upsampling Diffusion Probabilistic Models [33.51145642279836]
Denoising Diffusion Probabilistic Models (DDPM) have recently gained significant attention.
DDPMs generate high-quality samples from complex data distributions by defining an inverse process.
Compared with generative adversarial networks (GANs), the latent space of diffusion models is less interpretable.
In this work, we propose to generalize the denoising diffusion process into an Upsampling Diffusion Probabilistic Model (UDPM).
arXiv Detail & Related papers (2023-05-25T17:25:14Z) - Alleviating Exposure Bias in Diffusion Models through Sampling with Shifted Time Steps [23.144083737873263]
Diffusion Probabilistic Models (DPM) have shown remarkable efficacy in the synthesis of high-quality images.
Previous work has attempted to mitigate exposure bias by perturbing inputs during training.
We propose a novel sampling method that addresses it without retraining the model.
arXiv Detail & Related papers (2023-05-24T21:39:27Z) - DPM-Solver++: Fast Solver for Guided Sampling of Diffusion Probabilistic
Models [45.612477740555406]
We propose DPM-Solver++, a high-order solver for guided sampling of DPMs.
We show that DPM-Solver++ can generate high-quality samples within only 15 to 20 steps for guided sampling by pixel-space and latent-space DPMs.
arXiv Detail & Related papers (2022-11-02T13:14:30Z) - DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling
in Around 10 Steps [45.612477740555406]
Diffusion probabilistic models (DPMs) are emerging powerful generative models.
DPM-Solver is suitable for both discrete-time and continuous-time DPMs without any further training.
We achieve 4.70 FID in 10 function evaluations and 2.87 FID in 20 function evaluations on the CIFAR10 dataset.
arXiv Detail & Related papers (2022-06-02T08:43:16Z) - Accelerating Diffusion Models via Early Stop of the Diffusion Process [114.48426684994179]
Denoising Diffusion Probabilistic Models (DDPMs) have achieved impressive performance on various generation tasks.
In practice, DDPMs often need hundreds or even thousands of denoising steps to obtain a high-quality sample.
We propose a principled acceleration strategy, referred to as Early-Stopped DDPM (ES-DDPM), for DDPMs.
arXiv Detail & Related papers (2022-05-25T06:40:09Z) - Pseudo Numerical Methods for Diffusion Models on Manifolds [77.40343577960712]
Denoising Diffusion Probabilistic Models (DDPMs) can generate high-quality samples such as image and audio samples.
DDPMs require hundreds to thousands of iterations to produce final samples.
We propose pseudo numerical methods for diffusion models (PNDMs).
PNDMs can generate higher-quality synthetic images with only 50 steps, compared with 1000-step DDIMs (a 20x speedup).
arXiv Detail & Related papers (2022-02-20T10:37:52Z) - Learning to Efficiently Sample from Diffusion Probabilistic Models [49.58748345998702]
Denoising Diffusion Probabilistic Models (DDPMs) can yield high-fidelity samples and competitive log-likelihoods across a range of domains.
We introduce an exact dynamic programming algorithm that finds the optimal discrete time schedules for any pre-trained DDPM.
arXiv Detail & Related papers (2021-06-07T17:15:07Z) - Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), a class of iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs (see the update-step sketch after this list).
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
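
For reference, the DDIM sampler in the last entry admits a compact deterministic update. The sketch below follows the standard eta = 0 DDIM formulation; `eps_model` is a stand-in for a trained noise-prediction network, and the coarse 10-step schedule is arbitrary.

```python
# Minimal sketch of one deterministic (eta = 0) DDIM update step, the kind of
# implicit sampler referenced above. `eps_model` is a stand-in for a trained
# noise predictor; everything else follows standard DDPM/DDIM noise-schedule
# conventions (alpha_bar is the cumulative product of 1 - beta_t).
import numpy as np

def ddim_step(x_t, t, t_prev, alpha_bar, eps_model):
    """Map x_t at timestep t to x at the earlier timestep t_prev, noise-free."""
    eps = eps_model(x_t, t)
    a_t, a_prev = alpha_bar[t], alpha_bar[t_prev]
    # Predict the clean sample x_0 implied by the current noise estimate.
    x0_pred = (x_t - np.sqrt(1.0 - a_t) * eps) / np.sqrt(a_t)
    # Deterministic DDIM update: re-noise x0_pred down to timestep t_prev.
    return np.sqrt(a_prev) * x0_pred + np.sqrt(1.0 - a_prev) * eps

if __name__ == "__main__":
    T = 1000
    betas = np.linspace(1e-4, 0.02, T)          # standard linear beta schedule
    alpha_bar = np.cumprod(1.0 - betas)
    eps_model = lambda x, t: np.zeros_like(x)   # dummy predictor for the demo
    x = np.random.randn(4, 4)                   # start from pure Gaussian noise
    ts = list(range(T - 1, 0, -100)) + [0]      # a coarse ~10-step schedule
    for t, t_prev in zip(ts[:-1], ts[1:]):
        x = ddim_step(x, t, t_prev, alpha_bar, eps_model)
    print("sampled:", x.shape)
```

Because the update is deterministic, the only degree of freedom at inference time is the choice of timesteps in `ts`; schedule-selection methods such as the RBE schedule above operate precisely on that choice.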