Bilateral Denoising Diffusion Models
- URL: http://arxiv.org/abs/2108.11514v1
- Date: Thu, 26 Aug 2021 13:23:41 GMT
- Title: Bilateral Denoising Diffusion Models
- Authors: Max W. Y. Lam, Jun Wang, Rongjie Huang, Dan Su, Dong Yu
- Abstract summary: Denoising diffusion probabilistic models (DDPMs) have emerged as competitive generative models.
We propose novel bilateral denoising diffusion models (BDDMs), which take significantly fewer steps to generate high-quality samples.
- Score: 34.507876199641665
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Denoising diffusion probabilistic models (DDPMs) have emerged as competitive
generative models yet pose challenges for efficient sampling. In this paper,
we propose novel bilateral denoising diffusion models (BDDMs), which take
significantly fewer steps to generate high-quality samples. From a bilateral
modeling objective, BDDMs parameterize the forward and reverse processes with a
score network and a scheduling network, respectively. We show that a new lower
bound tighter than the standard evidence lower bound can be derived as a
surrogate objective for training the two networks. In particular, BDDMs are
efficient, simple to train, and capable of further improving any pre-trained
DDPM by optimizing the inference noise schedules. Our experiments demonstrate
that BDDMs can generate high-fidelity samples with as few as 3 sampling steps,
and with only 16 sampling steps (a 62x speedup) produce samples of comparable
or even higher quality than DDPMs using 1000 steps.
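To make the bilateral setup concrete, here is a minimal sketch of few-step sampling in which a scheduling network proposes the inference noise schedule that a pretrained score network then follows. The `score_net` and `schedule_net` stubs, the update rule, and all constants are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

# Hypothetical stand-ins for the two trained networks (assumptions, not
# the paper's architectures): score_net predicts the noise in x_t, and
# schedule_net maps the current sample and noise scale to the next one.
def score_net(x, alpha_bar):
    return np.zeros_like(x)              # stub epsilon-prediction network

def schedule_net(x, beta):
    return 0.5 * beta                    # stub: shrink the noise scale

def bddm_style_sample(shape, n_steps=3, beta_init=0.9, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)       # start from pure Gaussian noise

    # 1) Let the scheduling network propose a short inference schedule.
    betas = [beta_init]
    for _ in range(n_steps - 1):
        betas.append(schedule_net(x, betas[-1]))
    betas = np.array(betas[::-1])        # forward order: small -> large
    alpha_bars = np.cumprod(1.0 - betas)

    # 2) Run a DDPM-style reverse pass over the predicted schedule.
    for t in reversed(range(n_steps)):
        eps = score_net(x, alpha_bars[t])
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) \
            / np.sqrt(1.0 - betas[t])
        if t > 0:                        # no fresh noise at the final step
            x += np.sqrt(betas[t]) * rng.standard_normal(shape)
    return x

sample = bddm_style_sample((16, 16))     # 3-step sampling, as in the paper
```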
Related papers
- Directly Denoising Diffusion Models [6.109141407163027]
We present Directly Denoising Diffusion Model (DDDM), a simple and generic approach for generating realistic images with few-step sampling.
Our model achieves FID scores of 2.57 and 2.33 on CIFAR-10 in one-step and two-step sampling respectively, surpassing those obtained from GANs and distillation-based models.
For ImageNet 64x64, our approach stands as a competitive contender against leading models.
arXiv Detail & Related papers (2024-05-22T11:20:32Z)
- Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood [64.95663299945171]
Training energy-based models (EBMs) on high-dimensional data can be both challenging and time-consuming.
There exists a noticeable gap in sample quality between EBMs and other generative frameworks like GANs and diffusion models.
We propose cooperative diffusion recovery likelihood (CDRL), an effective approach to tractably learn and sample from a series of EBMs.
arXiv Detail & Related papers (2023-09-10T22:05:24Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- Parallel Sampling of Diffusion Models [76.3124029406809]
Diffusion models are powerful generative models but suffer from slow sampling.
We present ParaDiGMS, a novel method to accelerate the sampling of pretrained diffusion models by denoising multiple steps in parallel.
arXiv Detail & Related papers (2023-05-25T17:59:42Z)
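For the ParaDiGMS entry above, the summary names the mechanism (denoising several steps in parallel) without detail; one plausible reading is a Picard-style fixed-point sweep over the whole trajectory, sketched below with a hypothetical `denoise_step` stub. This is an assumption-laden illustration, not the paper's algorithm, which among other refinements works over a sliding window of timesteps.

```python
import numpy as np

def denoise_step(x, t):
    """Hypothetical one-step denoiser computing x_{t-1} from x_t (stub)."""
    return 0.9 * x

def parallel_sample(x_T, n_steps=50, n_sweeps=8):
    # xs[k] approximates the state after k denoising steps; every sweep
    # recomputes all transitions from the previous sweep's trajectory, so
    # the inner loop's updates are independent and could run as one GPU batch.
    xs = [x_T.copy() for _ in range(n_steps + 1)]  # initial guess: constant
    for _ in range(n_sweeps):                      # sequential refinement
        new_xs = [xs[0]]
        for k in range(n_steps):                   # parallelizable in k
            new_xs.append(denoise_step(xs[k], n_steps - k))
        xs = new_xs
    return xs[-1]

out = parallel_sample(np.random.randn(4, 4))
```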
- Fast Inference in Denoising Diffusion Models via MMD Finetuning [23.779985842891705]
We present MMD-DDM, a novel method for fast sampling of diffusion models.
Our approach is based on the idea of using the Maximum Mean Discrepancy (MMD) to finetune the learned distribution with a given budget of timesteps.
Our findings show that the proposed method is able to produce high-quality samples in a fraction of the time required by widely-used diffusion models.
arXiv Detail & Related papers (2023-01-19T09:48:07Z)
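The MMD itself is standard; below is the usual biased RBF-kernel estimator of squared MMD between a generated and a real batch. The kernel and bandwidth choices here are assumptions, not MMD-DDM's exact setup, which backpropagates this kind of loss through a sampler restricted to a fixed small budget of timesteps.

```python
import numpy as np

def rbf_kernel(a, b, bandwidth=1.0):
    """Gaussian RBF kernel matrix between the rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(gen, real, bandwidth=1.0):
    """Biased estimate of squared MMD between generated and real batches."""
    k_gg = rbf_kernel(gen, gen, bandwidth).mean()
    k_rr = rbf_kernel(real, real, bandwidth).mean()
    k_gr = rbf_kernel(gen, real, bandwidth).mean()
    return k_gg + k_rr - 2.0 * k_gr

# Finetuning would backpropagate this loss through a few-step sampler;
# here we only evaluate it on toy batches.
gen = np.random.randn(64, 8)
real = np.random.randn(64, 8) + 0.5
print(mmd2(gen, real))
```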
- Accelerating Diffusion Models via Early Stop of the Diffusion Process [114.48426684994179]
Denoising Diffusion Probabilistic Models (DDPMs) have achieved impressive performance on various generation tasks.
In practice, DDPMs often need hundreds or even thousands of denoising steps to obtain a high-quality sample.
We propose a principled acceleration strategy, referred to as Early-Stopped DDPM (ES-DDPM), for DDPMs.
arXiv Detail & Related papers (2022-05-25T06:40:09Z)
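The ES-DDPM entry above names the strategy but not the mechanism. Per the full paper, the chain is truncated at an intermediate step and a separate generative model (e.g., a VAE or GAN) samples that intermediate distribution directly; the sketch below illustrates this reading with hypothetical `implicit_generator` and `denoise_step` stubs.

```python
import numpy as np

def implicit_generator(shape, rng):
    """Hypothetical pretrained generator (e.g. a VAE or GAN) sampling the
    intermediate distribution x_{T'} directly (stub)."""
    return rng.standard_normal(shape)

def denoise_step(x, t):
    return 0.95 * x                      # stub reverse-diffusion step

def es_ddpm_sample(shape, t_stop=100, seed=0):
    """Run only t_stop reverse steps, starting from x_{T'} instead of
    pure Gaussian noise at x_T (T' << T)."""
    rng = np.random.default_rng(seed)
    x = implicit_generator(shape, rng)   # jump straight to step T'
    for t in range(t_stop, 0, -1):
        x = denoise_step(x, t)
    return x

out = es_ddpm_sample((8, 8))
```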
- BDDM: Bilateral Denoising Diffusion Models for Fast and High-Quality Speech Synthesis [45.58131296169655]
Diffusion probabilistic models (DPMs) and their extensions have emerged as competitive generative models yet face challenges in efficient sampling.
We propose a new bilateral denoising diffusion model that parameterizes both the forward and reverse processes with a schedule network and a score network.
We show that the new surrogate objective yields a lower bound on the log marginal likelihood tighter than the conventional surrogate.
arXiv Detail & Related papers (2022-03-25T08:53:12Z)
- Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), a class of iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs.
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
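The deterministic update (eta = 0) behind DDIM's step-skipping is standard; the sketch below spells it out for an epsilon-prediction model. The `eps_net` stub and the particular subsampled schedule are arbitrary assumptions for illustration.

```python
import numpy as np

def eps_net(x, t):
    return np.zeros_like(x)              # stub noise-prediction network

def ddim_sample(shape, alpha_bars, steps, seed=0):
    """Deterministic DDIM sampling (eta = 0) over a subsampled schedule."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(shape)
    for t, t_prev in zip(steps[:-1], steps[1:]):
        eps = eps_net(x, t)
        ab_t, ab_prev = alpha_bars[t], alpha_bars[t_prev]
        x0_hat = (x - np.sqrt(1.0 - ab_t) * eps) / np.sqrt(ab_t)
        # jump directly from t to t_prev; no fresh noise is injected,
        # which is what makes large step skips possible
        x = np.sqrt(ab_prev) * x0_hat + np.sqrt(1.0 - ab_prev) * eps
    return x

# e.g. a 1000-step training schedule subsampled to ~20 inference steps
alpha_bars = np.cumprod(1.0 - np.linspace(1e-4, 0.02, 1000))
steps = list(range(999, 0, -50)) + [0]
img = ddim_sample((8, 8), alpha_bars, steps)
```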
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.