Pseudo Numerical Methods for Diffusion Models on Manifolds
- URL: http://arxiv.org/abs/2202.09778v1
- Date: Sun, 20 Feb 2022 10:37:52 GMT
- Title: Pseudo Numerical Methods for Diffusion Models on Manifolds
- Authors: Luping Liu, Yi Ren, Zhijie Lin, Zhou Zhao
- Abstract summary: Denoising Diffusion Probabilistic Models (DDPMs) can generate high-quality samples such as images and audio.
DDPMs require hundreds to thousands of iterations to produce final samples.
We propose pseudo numerical methods for diffusion models (PNDMs).
PNDMs can generate higher-quality synthetic images with only 50 steps, compared with 1000-step DDIMs (a 20x speedup).
- Score: 77.40343577960712
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Denoising Diffusion Probabilistic Models (DDPMs) can generate high-quality
samples such as image and audio samples. However, DDPMs require hundreds to
thousands of iterations to produce final samples. Several prior works have
successfully accelerated DDPMs through adjusting the variance schedule (e.g.,
Improved Denoising Diffusion Probabilistic Models) or the denoising equation
(e.g., Denoising Diffusion Implicit Models (DDIMs)). However, these
acceleration methods cannot maintain sample quality at high speedup rates and
can even introduce new noise, which limits their practicability. To
accelerate the inference process while keeping the sample quality, we provide a
fresh perspective that DDPMs should be treated as solving differential
equations on manifolds. Under such a perspective, we propose pseudo numerical
methods for diffusion models (PNDMs). Specifically, we figure out how to solve
differential equations on manifolds and show that DDIMs are simple cases of
pseudo numerical methods. We change several classical numerical methods to
corresponding pseudo numerical methods and find that the pseudo linear
multi-step method is the best in most situations. According to our experiments,
by directly using pre-trained models on Cifar10, CelebA and LSUN, PNDMs can
generate higher quality synthetic images with only 50 steps compared with
1000-step DDIMs (20x speedup), significantly outperform DDIMs with 250 steps
(by around 0.4 in FID) and have good generalization on different variance
schedules. Our implementation is available at
https://github.com/luping-liu/PNDM.
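The linked repository holds the authors' implementation; as a rough, unofficial illustration of the pseudo linear multi-step method, the sketch below pairs a DDIM-style transfer between noise levels with a fourth-order Adams-Bashforth combination of recent noise predictions. The names `eps_model`, `alpha_bar`, and `pndm_sample`, and the simplified warm-up, are illustrative assumptions, not the paper's API.

```python
import numpy as np

def transfer(x, eps, a_t, a_next):
    # DDIM-style deterministic transfer between noise levels a_t -> a_next,
    # where a_* denotes alpha-bar (cumulative product of 1 - beta).
    x0_pred = (x - np.sqrt(1.0 - a_t) * eps) / np.sqrt(a_t)
    return np.sqrt(a_next) * x0_pred + np.sqrt(1.0 - a_next) * eps

def pndm_sample(eps_model, alpha_bar, x_T, num_steps):
    """Sketch of a pseudo linear multi-step (PLMS) sampler.

    eps_model(x, t) is a pretrained noise predictor; alpha_bar is the
    forward-process schedule indexed by timestep. The paper warms up the
    first steps with a pseudo Runge-Kutta method; plain one-step
    transfers keep this sketch short.
    """
    T = len(alpha_bar) - 1
    ts = np.linspace(T, 0, num_steps + 1).round().astype(int)
    x, eps_hist = x_T, []
    for t, t_next in zip(ts[:-1], ts[1:]):
        eps = eps_model(x, t)
        eps_hist.append(eps)
        if len(eps_hist) < 4:
            eps_prime = eps  # warm-up: fall back to a DDIM-style step
        else:
            # 4th-order Adams-Bashforth combination of noise estimates.
            e1, e2, e3, e4 = eps_hist[-1], eps_hist[-2], eps_hist[-3], eps_hist[-4]
            eps_prime = (55 * e1 - 59 * e2 + 37 * e3 - 9 * e4) / 24.0
        x = transfer(x, eps_prime, alpha_bar[t], alpha_bar[t_next])
    return x
```

With a real noise predictor, calling `pndm_sample(eps_model, alpha_bar, x_T, 50)` mirrors the 50-step setting reported in the abstract.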
Related papers
- Parallel Sampling of Diffusion Models [76.3124029406809]
Diffusion models are powerful generative models but suffer from slow sampling.
We present ParaDiGMS, a novel method to accelerate the sampling of pretrained diffusion models by denoising multiple steps in parallel (see the sketch after this entry).
arXiv Detail & Related papers (2023-05-25T17:59:42Z)
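ParaDiGMS itself adds sliding windows and a data-dependent stopping rule; the kernel of the idea, a parallel-in-time Picard iteration over the sampling trajectory, can be sketched as below. The drift function `f`, the time grid `ts`, and the fixed sweep count are placeholder assumptions.

```python
import numpy as np

def picard_sample(f, x0, ts, sweeps):
    """Parallel-in-time Picard iteration for the ODE x'(t) = f(x, t).

    Each sweep refreshes the whole trajectory at once, so all evaluations
    of f within a sweep are independent and can run in parallel; that is
    the source of ParaDiGMS-style speedups on parallel hardware.
    """
    n = len(ts)
    xs = np.tile(x0, (n, 1))   # initial guess: constant trajectory
    dts = np.diff(ts)          # signed step sizes between grid points
    for _ in range(sweeps):
        drift = np.stack([f(xs[i], ts[i]) for i in range(n)])  # parallelizable
        # x_j = x_0 + sum_{i < j} f(x_i, t_i) * dt_i  (left Riemann sum)
        incr = np.cumsum(drift[:-1] * dts[:, None], axis=0)
        xs = np.vstack([x0, x0 + incr])
    return xs[-1]
```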
- UDPM: Upsampling Diffusion Probabilistic Models [33.51145642279836]
Denoising Diffusion Probabilistic Models (DDPM) have recently gained significant attention.
DDPMs generate high-quality samples from complex data distributions by defining an inverse process.
Compared with generative adversarial networks (GANs), diffusion models have a less interpretable latent space.
In this work, we propose to generalize the denoising diffusion process into an Upsampling Diffusion Probabilistic Model (UDPM)
arXiv Detail & Related papers (2023-05-25T17:25:14Z)
- Alleviating Exposure Bias in Diffusion Models through Sampling with Shifted Time Steps [23.144083737873263]
Diffusion Probabilistic Models (DPMs) have shown remarkable efficacy in the synthesis of high-quality images but suffer from exposure bias.
Previous work has attempted to mitigate this issue by perturbing inputs during training.
We propose a novel sampling method, based on shifted time steps, that alleviates exposure bias without retraining the model.
arXiv Detail & Related papers (2023-05-24T21:39:27Z)
- Fast Diffusion Probabilistic Model Sampling through the lens of Backward Error Analysis [26.907301901503835]
Denoising diffusion probabilistic models (DDPMs) are a class of powerful generative models.
DDPMs generally need hundreds or thousands of sequential function evaluations (steps) of neural networks to generate a sample.
This paper aims to develop a fast sampling method for DDPMs requiring much fewer steps while retaining high sample quality.
arXiv Detail & Related papers (2023-04-22T16:58:47Z)
- On Calibrating Diffusion Probabilistic Models [78.75538484265292]
Diffusion probabilistic models (DPMs) have achieved promising results in diverse generative tasks.
We propose a simple way for calibrating an arbitrary pretrained DPM, with which the score matching loss can be reduced and the lower bounds of model likelihood can be increased.
Our calibration method is performed only once and the resulting models can be used repeatedly for sampling.
arXiv Detail & Related papers (2023-02-21T14:14:40Z)
- Accelerating Diffusion Models via Early Stop of the Diffusion Process [114.48426684994179]
Denoising Diffusion Probabilistic Models (DDPMs) have achieved impressive performance on various generation tasks.
In practice, DDPMs often need hundreds or even thousands of denoising steps to obtain a high-quality sample.
We propose a principled acceleration strategy, referred to as Early-Stopped DDPM (ES-DDPM), for DDPMs.
arXiv Detail & Related papers (2022-05-25T06:40:09Z)
- Learning to Efficiently Sample from Diffusion Probabilistic Models [49.58748345998702]
Denoising Diffusion Probabilistic Models (DDPMs) can yield high-fidelity samples and competitive log-likelihoods across a range of domains.
We introduce an exact dynamic programming algorithm that finds the optimal discrete time schedules for any pre-trained DDPM (see the sketch after this entry).
arXiv Detail & Related papers (2021-06-07T17:15:07Z)
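As a hedged sketch of the dynamic-programming idea (not the paper's exact algorithm), assume a precomputed matrix `cost[s, t]` giving the penalty of denoising directly from timestep s down to t; in the paper such costs come from per-transition ELBO terms of the pretrained DDPM.

```python
import numpy as np

def best_schedule(cost, T, K):
    """Pick K timesteps between T and 0 minimizing total transition cost.

    cost[s, t] is the (precomputed) penalty of denoising directly from
    step s to step t < s; dp[k, t] is the best cost of reaching t from T
    in exactly k transitions. Assumes K <= T. O(K * T^2) time.
    """
    INF = float("inf")
    dp = np.full((K + 1, T + 1), INF)
    parent = np.zeros((K + 1, T + 1), dtype=int)
    dp[0, T] = 0.0
    for k in range(1, K + 1):
        for t in range(T):                 # destination step
            for s in range(t + 1, T + 1):  # source step
                c = dp[k - 1, s] + cost[s, t]
                if c < dp[k, t]:
                    dp[k, t], parent[k, t] = c, s
    # Backtrack the optimal path that ends at step 0 after K transitions.
    path, t = [0], 0
    for k in range(K, 0, -1):
        t = parent[k, t]
        path.append(t)
    return path[::-1]  # schedule from T down to 0
```

For K = 50 and T = 1000, this selects a 50-step schedule tailored to the model rather than the uniform stride used by vanilla DDIM sampling.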
- Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), a class of iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs.
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
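For reference, the deterministic ($\eta = 0$) DDIM update that several of the samplers above build on takes the standard form below, with $\bar\alpha_t$ the cumulative product of $1 - \beta_t$ from the forward process:

```latex
% Deterministic DDIM step from t to t-1 (eta = 0):
x_{t-1} \;=\; \sqrt{\bar\alpha_{t-1}}\,
  \underbrace{\frac{x_t - \sqrt{1-\bar\alpha_t}\;\epsilon_\theta(x_t, t)}
                   {\sqrt{\bar\alpha_t}}}_{\text{predicted } x_0}
  \;+\; \sqrt{1-\bar\alpha_{t-1}}\;\epsilon_\theta(x_t, t)
```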