Accelerating Diffusion Models via Early Stop of the Diffusion Process
- URL: http://arxiv.org/abs/2205.12524v1
- Date: Wed, 25 May 2022 06:40:09 GMT
- Title: Accelerating Diffusion Models via Early Stop of the Diffusion Process
- Authors: Zhaoyang Lyu, Xudong Xu, Ceyuan Yang, Dahua Lin, Bo Dai
- Abstract summary: Denoising Diffusion Probabilistic Models (DDPMs) have achieved impressive performance on various generation tasks.
In practice, DDPMs often need hundreds or even thousands of denoising steps to obtain a high-quality sample.
We propose a principled acceleration strategy, referred to as Early-Stopped DDPM (ES-DDPM), for DDPMs.
- Score: 114.48426684994179
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Denoising Diffusion Probabilistic Models (DDPMs) have achieved impressive
performance on various generation tasks. By modeling the reverse process of
gradually diffusing the data distribution into a Gaussian distribution,
generating a sample in DDPMs can be regarded as iteratively denoising a
randomly sampled Gaussian noise. However, in practice DDPMs often need
hundreds or even thousands of denoising steps to obtain a high-quality
sample from the Gaussian noise, leading to extremely low inference
efficiency. In this work, we
propose a principled acceleration strategy, referred to as Early-Stopped DDPM
(ES-DDPM), for DDPMs. The key idea is to stop the diffusion process early,
so that only the first few diffusion steps are considered and the reverse
denoising process starts from a non-Gaussian distribution. By further
adopting a powerful pre-trained generative model, such as a GAN or a VAE,
in ES-DDPM, sampling from
the target non-Gaussian distribution can be efficiently achieved by diffusing
samples obtained from the pre-trained generative model. In this way, the number
of required denoising steps is significantly reduced. At the same time,
the sample quality of ES-DDPM also improves substantially, outperforming
both the vanilla DDPM and the adopted pre-trained generative model. In
extensive experiments on CIFAR-10, CelebA, ImageNet, LSUN-Bedroom and
LSUN-Cat, ES-DDPM achieves a promising acceleration effect and performance
improvements over representative baseline methods. Moreover, ES-DDPM
demonstrates several attractive properties: it is orthogonal to existing
acceleration methods, and it simultaneously enables both global semantic
and local pixel-level control in image generation.
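To make the key idea concrete, below is a minimal sketch of ES-DDPM sampling in PyTorch. The helper names (`generator`, `denoise_step`, `alpha_bar`, `t_stop`) are illustrative assumptions, not the authors' released code: a pre-trained GAN/VAE produces a rough sample, the closed-form forward marginal diffuses it to step t_stop, and only the last t_stop reverse steps of the DDPM are run.

```python
import torch

@torch.no_grad()
def es_ddpm_sample(generator, denoise_step, alpha_bar, t_stop, z_dim, n=16):
    """Minimal ES-DDPM sampling sketch (helper names are hypothetical).

    generator:    pre-trained GAN/VAE mapping latents to images
    denoise_step: one learned reverse step p_theta(x_{t-1} | x_t)
    alpha_bar:    cumulative products of the DDPM noise schedule
    t_stop:       early-stop step, much smaller than the full T
    """
    # 1) Draw a rough sample from the pre-trained generative model.
    z = torch.randn(n, z_dim)
    x0 = generator(z)  # shape (n, C, H, W)

    # 2) Diffuse it directly to step t_stop with the closed-form marginal
    #    q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I).
    abar = alpha_bar[t_stop]
    x = abar.sqrt() * x0 + (1.0 - abar).sqrt() * torch.randn_like(x0)

    # 3) Run only the last t_stop reverse denoising steps instead of all T.
    for t in reversed(range(t_stop)):
        x = denoise_step(x, t)
    return x
```

Because t_stop can be far smaller than the full T, the denoising loop shrinks accordingly, while the forward diffusion in step 2 smooths out artifacts in the GAN/VAE sample.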
Related papers
- Conditional GAN for Enhancing Diffusion Models in Efficient and Authentic Global Gesture Generation from Audios [10.57695963534794]
Methods based on VAEs suffer from local jitter and global instability.
We introduce a conditional GAN to capture audio control signals and implicitly match the multimodal denoising distribution between the diffusion and denoising steps.
arXiv Detail & Related papers (2024-10-27T07:25:11Z)
- EM Distillation for One-step Diffusion Models [65.57766773137068]
We propose a maximum likelihood-based approach that distills a diffusion model to a one-step generator model with minimal loss of quality.
We develop a reparametrized sampling scheme and a noise cancellation technique that together stabilize the distillation process.
arXiv Detail & Related papers (2024-05-27T05:55:22Z)
- Boosting Diffusion Models with Moving Average Sampling in Frequency Domain [101.43824674873508]
Diffusion models rely on the current sample to denoise the next one, possibly resulting in denoising instability.
In this paper, we reinterpret the iterative denoising process as model optimization and leverage a moving average mechanism to ensemble all the prior samples.
We name the complete approach "Moving Average Sampling in Frequency domain (MASF)".
arXiv Detail & Related papers (2024-03-26T16:57:55Z)
- DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport [26.713392774427653]
DPM-OT is a unified learning framework for fast DPMs with a direct expressway represented by an optimal transport (OT) map.
It can generate high-quality samples within around 10 function evaluations.
Experiments validate the effectiveness and advantages of DPM-OT in terms of speed and quality.
arXiv Detail & Related papers (2023-07-21T02:28:54Z)
- AdjointDPM: Adjoint Sensitivity Method for Gradient Backpropagation of Diffusion Probabilistic Models [103.41269503488546]
Existing customization methods require access to multiple reference examples to align pre-trained diffusion probabilistic models with user-provided concepts.
This paper aims to address the challenge of DPM customization when the only available supervision is a differentiable metric defined on the generated contents.
We propose AdjointDPM, a novel method that first generates new samples from diffusion models by solving the corresponding probability-flow ODEs.
It then uses the adjoint sensitivity method to backpropagate the gradients of the loss to the models' parameters.
arXiv Detail & Related papers (2023-07-20T09:06:21Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- UDPM: Upsampling Diffusion Probabilistic Models [33.51145642279836]
Denoising Diffusion Probabilistic Models (DDPM) have recently gained significant attention.
DDPMs generate high-quality samples from complex data distributions by defining an inverse process.
Compared with generative adversarial networks (GANs), the latent space of diffusion models is less interpretable.
In this work, we propose to generalize the denoising diffusion process into an Upsampling Diffusion Probabilistic Model (UDPM).
arXiv Detail & Related papers (2023-05-25T17:25:14Z)
- Fast Diffusion Probabilistic Model Sampling through the lens of Backward Error Analysis [26.907301901503835]
Denoising diffusion probabilistic models (DDPMs) are a class of powerful generative models.
DDPMs generally need hundreds or thousands of sequential function evaluations (steps) of neural networks to generate a sample.
This paper aims to develop a fast sampling method for DDPMs requiring much fewer steps while retaining high sample quality.
arXiv Detail & Related papers (2023-04-22T16:58:47Z)
- Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), a class of iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs.
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
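For reference, here is a minimal sketch of the deterministic DDIM update (the eta = 0 case) summarized in the last entry above, assuming an epsilon-prediction network `eps_model` and a cumulative noise schedule `alpha_bar`; the function and argument names are illustrative placeholders, not the paper's code.

```python
import torch

@torch.no_grad()
def ddim_sample(eps_model, alpha_bar, step_indices, shape):
    """Deterministic DDIM sampling sketch (eta = 0; names are hypothetical).

    step_indices: decreasing sub-sequence of diffusion steps,
                  e.g. 50 indices out of the full 1000.
    """
    x = torch.randn(shape)  # start from pure Gaussian noise
    for t, t_prev in zip(step_indices, list(step_indices[1:]) + [-1]):
        abar_t = alpha_bar[t]
        abar_prev = alpha_bar[t_prev] if t_prev >= 0 else torch.tensor(1.0)
        eps = eps_model(x, t)
        # Predict x_0 from the current noisy sample, then deterministically
        # map the prediction to the lower noise level t_prev.
        x0_pred = (x - (1.0 - abar_t).sqrt() * eps) / abar_t.sqrt()
        x = abar_prev.sqrt() * x0_pred + (1.0 - abar_prev).sqrt() * eps
    return x
```

Skipping most intermediate steps this way is what yields the reported $10\times$ to $50\times$ wall-clock speedup over ancestral DDPM sampling.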