Semi-Implicit Denoising Diffusion Models (SIDDMs)
- URL: http://arxiv.org/abs/2306.12511v3
- Date: Tue, 10 Oct 2023 20:27:25 GMT
- Title: Semi-Implicit Denoising Diffusion Models (SIDDMs)
- Authors: Yanwu Xu, Mingming Gong, Shaoan Xie, Wei Wei, Matthias Grundmann,
Kayhan Batmanghelich, Tingbo Hou
- Abstract summary: Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
- Score: 50.30163684539586
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the proliferation of generative models, achieving fast sampling
during inference without compromising sample diversity and quality remains
challenging. Existing models such as Denoising Diffusion Probabilistic Models
(DDPM) deliver high-quality, diverse samples but are slowed by an inherently
high number of iterative steps. Denoising Diffusion GANs (DDGAN) attempted to
circumvent this limitation by integrating a GAN model to take larger jumps in
the diffusion process. However, DDGAN encountered
scalability limitations when applied to large datasets. To address these
limitations, we introduce a novel approach that tackles the problem by matching
implicit and explicit factors. More specifically, our approach involves
utilizing an implicit model to match the marginal distributions of noisy data
and the explicit conditional distribution of the forward diffusion. This
combination allows us to effectively match the joint denoising distributions.
Unlike DDPM but similar to DDGAN, we do not enforce a parametric distribution
for the reverse step, enabling us to take large steps during inference. Similar
to DDPM but unlike DDGAN, we take advantage of the exact form of the
diffusion process. We demonstrate that our proposed method obtains comparable
generative performance to diffusion-based models and vastly superior results to
models with a small number of sampling steps.
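The "exact form of the diffusion process" the abstract refers to is the closed-form forward marginal $q(x_t \mid x_0) = \mathcal{N}(\sqrt{\bar\alpha_t}\,x_0, (1-\bar\alpha_t)I)$, which lets one jump directly to any noise level. A minimal sketch (the linear beta schedule is an illustrative choice, not taken from this paper):

```python
import numpy as np

def forward_diffuse(x0, t, alphas_cumprod, rng=None):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(abar_t) * x0, (1 - abar_t) * I).

    The closed form allows jumping straight to noise level t without
    simulating the intermediate diffusion steps.
    """
    rng = np.random.default_rng() if rng is None else rng
    abar_t = alphas_cumprod[t]
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(abar_t) * x0 + np.sqrt(1.0 - abar_t) * noise

# Illustrative linear beta schedule (a common choice in the DDPM literature)
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas_cumprod = np.cumprod(1.0 - betas)

x0 = np.ones((4, 4))  # toy "image"
xt = forward_diffuse(x0, t=500, alphas_cumprod=alphas_cumprod)
```

As t grows, abar_t shrinks toward zero, so x_t approaches pure Gaussian noise regardless of x_0.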
Related papers
- Non-asymptotic Convergence of Discrete-time Diffusion Models: New Approach and Improved Rate [49.97755400231656]
We establish convergence guarantees for substantially larger classes of distributions under DT diffusion processes.
We then specialize our results to a number of interesting classes of distributions with explicit parameter dependencies.
We propose a novel accelerated sampler and show that it improves the convergence rates of the corresponding regular sampler by orders of magnitude with respect to all system parameters.
arXiv Detail & Related papers (2024-02-21T16:11:47Z) - Fast Sampling via Discrete Non-Markov Diffusion Models [49.598085130313514]
We propose a discrete non-Markov diffusion model, which admits an accelerated reverse sampling for discrete data generation.
Our method significantly reduces the number of function evaluations (i.e., calls to the neural network), making the sampling process much faster.
arXiv Detail & Related papers (2023-12-14T18:14:11Z) - Discrete Diffusion Modeling by Estimating the Ratios of the Data Distribution [67.9215891673174]
We propose score entropy as a novel loss that naturally extends score matching to discrete spaces.
We test our Score Entropy Discrete Diffusion models on standard language modeling tasks.
arXiv Detail & Related papers (2023-10-25T17:59:12Z) - Soft Mixture Denoising: Beyond the Expressive Bottleneck of Diffusion Models [76.46246743508651]
We show that current diffusion models actually have an expressive bottleneck in backward denoising.
We introduce soft mixture denoising (SMD), an expressive and efficient model for backward denoising.
arXiv Detail & Related papers (2023-09-25T12:03:32Z) - UDPM: Upsampling Diffusion Probabilistic Models [33.51145642279836]
Denoising Diffusion Probabilistic Models (DDPM) have recently gained significant attention.
DDPMs generate high-quality samples from complex data distributions by defining an inverse process.
Compared with generative adversarial networks (GANs), the latent space of diffusion models is less interpretable.
In this work, we propose to generalize the denoising diffusion process into an Upsampling Diffusion Probabilistic Model (UDPM)
arXiv Detail & Related papers (2023-05-25T17:25:14Z) - Denoising Diffusion Samplers [41.796349001299156]
Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains.
We explore a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants.
While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling.
arXiv Detail & Related papers (2023-02-27T14:37:16Z) - Fast Inference in Denoising Diffusion Models via MMD Finetuning [23.779985842891705]
We present MMD-DDM, a novel method for fast sampling of diffusion models.
Our approach is based on the idea of using the Maximum Mean Discrepancy (MMD) to finetune the learned distribution with a given budget of timesteps.
Our findings show that the proposed method is able to produce high-quality samples in a fraction of the time required by widely-used diffusion models.
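The Maximum Mean Discrepancy used by MMD-DDM is a standard kernel-based distance between two sample sets. A minimal numpy estimator with an RBF kernel (the kernel and bandwidth are illustrative choices, not details from the paper):

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Gaussian RBF kernel matrix between sample sets of shape (n, d) and (m, d).
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples x and y."""
    return (rbf_kernel(x, x, sigma).mean()
            - 2.0 * rbf_kernel(x, y, sigma).mean()
            + rbf_kernel(y, y, sigma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.standard_normal((256, 2)), rng.standard_normal((256, 2)))
shifted = mmd2(rng.standard_normal((256, 2)),
               rng.standard_normal((256, 2)) + 3.0)
```

Mismatched distributions yield a larger MMD than matched ones, which is what makes it usable as a finetuning objective for a generator's output distribution.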
arXiv Detail & Related papers (2023-01-19T09:48:07Z) - Tackling the Generative Learning Trilemma with Denoising Diffusion GANs [20.969702008187838]
Deep generative models often struggle with simultaneously addressing high sample quality, mode coverage, and fast sampling.
We call the challenge the generative learning trilemma, as the existing models often trade some of them for others.
We introduce denoising diffusion generative adversarial networks (denoising diffusion GANs) that model each denoising step using a multimodal conditional GAN.
arXiv Detail & Related papers (2021-12-15T00:09:38Z) - Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs) for iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs.
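The DDIM speedup comes from a deterministic (eta = 0) reverse update that first reconstructs a clean-data estimate from the predicted noise and then re-noises it to the next level. A minimal sketch of one such step, with the network's noise prediction passed in as an input so no particular model is assumed:

```python
import numpy as np

def ddim_step(xt, eps_pred, abar_t, abar_prev):
    """One deterministic DDIM update (eta = 0).

    eps_pred is the denoising network's noise estimate at step t;
    abar_t and abar_prev are cumulative alpha products at the current
    and the (possibly much earlier) target step, allowing large jumps.
    """
    # Predicted clean sample implied by the noise estimate.
    x0_pred = (xt - np.sqrt(1.0 - abar_t) * eps_pred) / np.sqrt(abar_t)
    # Deterministically re-noise to the target level.
    return np.sqrt(abar_prev) * x0_pred + np.sqrt(1.0 - abar_prev) * eps_pred
```

Because abar_prev need not be the immediately preceding step, the sampler can skip many timesteps at once, which is the source of the wall-clock savings.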
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.