Itô-Taylor Sampling Scheme for Denoising Diffusion Probabilistic Models using Ideal Derivatives
- URL: http://arxiv.org/abs/2112.13339v1
- Date: Sun, 26 Dec 2021 09:38:11 GMT
- Title: Itô-Taylor Sampling Scheme for Denoising Diffusion Probabilistic Models using Ideal Derivatives
- Authors: Hideyuki Tachibana, Mocho Go, Muneyoshi Inahara, Yotaro Katayama,
Yotaro Watanabe
- Abstract summary: This paper proposes a new DDPM sampler based on a second-order numerical scheme for stochastic differential equations (SDEs).
It is experimentally observed that the proposed sampler could synthesize plausible images and audio signals in a relatively small number of refinement steps.
- Score: 5.302303646066551
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Denoising Diffusion Probabilistic Models (DDPMs) have recently been attracting
attention as a new challenger to popular deep generative models such as GANs and
VAEs. However, DDPMs have the disadvantage that they often require a huge number
of refinement steps during synthesis. To address this problem, this paper
proposes a new DDPM sampler based on a second-order numerical scheme for
stochastic differential equations (SDEs), whereas the conventional sampler is
based on a first-order scheme. In general, it is not easy to compute the
derivatives that are required in higher-order numerical schemes. However, in the
case of DDPM, this difficulty is alleviated by a trick that the authors call
"ideal derivative substitution". The newly derived higher-order sampler was
applied to both image and speech generation tasks, and it was experimentally
observed that the proposed sampler could synthesize plausible images and audio
signals in a relatively small number of refinement steps.
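To make the contrast concrete, the following is a minimal sketch of the conventional first-order (ancestral) DDPM refinement step that the paper takes as its baseline; the proposed method replaces this update with a second-order Itô-Taylor step whose derivative terms are handled by the "ideal derivative substitution" trick. The names `eps_model`, `betas`, and `ddpm_reverse_step` are illustrative assumptions, not the authors' code.

```python
# Sketch of the conventional first-order DDPM reverse (ancestral sampling) step.
# Assumptions: `eps_model(x, t)` is a trained noise predictor, `betas` is the
# 1-D noise schedule array, and `rng` is a numpy Generator.
import numpy as np

def ddpm_reverse_step(x_t, t, eps_model, betas, rng):
    """One first-order DDPM refinement step: x_t -> x_{t-1}."""
    alphas = 1.0 - betas
    alpha_bar_t = np.prod(alphas[: t + 1])            # cumulative product up to step t
    eps = eps_model(x_t, t)                           # predicted noise
    mean = (x_t - betas[t] / np.sqrt(1.0 - alpha_bar_t) * eps) / np.sqrt(alphas[t])
    if t == 0:
        return mean                                   # no noise injected at the final step
    z = rng.standard_normal(x_t.shape)
    return mean + np.sqrt(betas[t]) * z               # sigma_t^2 = beta_t variant
```

The paper's second-order sampler augments such an update with Itô-Taylor correction terms; the point of "ideal derivative substitution" is that the derivative terms appearing in those corrections, which are normally hard to compute, become tractable.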
Related papers
- Sparse Inducing Points in Deep Gaussian Processes: Enhancing Modeling with Denoising Diffusion Variational Inference [6.37512592611305]
In deep Gaussian processes (DGPs), a set of sparse integration locations called inducing points is selected to approximate the posterior distribution of the model.
Traditional variational inference approaches to posterior approximation often lead to significant bias.
We propose an alternative method called Denoising Diffusion Variational Inference (DDVI) that uses a denoising diffusion stochastic differential equation (SDE) to generate posterior samples of inducing variables.
arXiv Detail & Related papers (2024-07-24T06:39:58Z)
- Score-based Generative Models with Adaptive Momentum [40.84399531998246]
We propose an adaptive momentum sampling method to accelerate the transforming process.
We show that our method can produce more faithful images/graphs with a small number of sampling steps, achieving a 2 to 5 times speedup.
arXiv Detail & Related papers (2024-05-22T15:20:27Z)
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers for diffusion models, called Gaussian Mixture Solvers (GMS).
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- AdjointDPM: Adjoint Sensitivity Method for Gradient Backpropagation of Diffusion Probabilistic Models [103.41269503488546]
Existing customization methods require access to multiple reference examples to align pre-trained diffusion probabilistic models with user-provided concepts.
This paper aims to address the challenge of DPM customization when the only available supervision is a differentiable metric defined on the generated contents.
We propose a novel method AdjointDPM, which first generates new samples from diffusion models by solving the corresponding probability-flow ODEs.
It then uses the adjoint sensitivity method to backpropagate the gradients of the loss to the models' parameters.
arXiv Detail & Related papers (2023-07-20T09:06:21Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by an inherently high number of iterative steps.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- Denoising Diffusion Samplers [41.796349001299156]
Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains.
We explore a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants.
While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling.
arXiv Detail & Related papers (2023-02-27T14:37:16Z)
- Pseudo Numerical Methods for Diffusion Models on Manifolds [77.40343577960712]
Denoising Diffusion Probabilistic Models (DDPMs) can generate high-quality samples such as image and audio samples.
DDPMs require hundreds to thousands of iterations to produce final samples.
We propose pseudo numerical methods for diffusion models (PNDMs).
PNDMs can generate higher-quality synthetic images with only 50 steps, compared with 1000-step DDIMs (a 20x speedup).
arXiv Detail & Related papers (2022-02-20T10:37:52Z)
- Estimating High Order Gradients of the Data Distribution by Denoising [81.24581325617552]
First order derivative of a data density can be estimated efficiently by denoising score matching.
We propose a method to directly estimate high order derivatives (scores) of a data density from samples.
arXiv Detail & Related papers (2021-11-08T18:59:23Z)
- Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), a class of iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs (a deterministic DDIM update is sketched after this list).
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
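Several of the papers above (notably DDIM and PNDM) accelerate sampling by replacing the stochastic refinement step with a deterministic update that can skip timesteps. As a point of reference, here is a minimal sketch of the deterministic DDIM update (the eta = 0 case); `eps_model` and `alpha_bars` are illustrative assumptions, not code from any of the listed papers.

```python
# Sketch of the deterministic DDIM update, which allows jumping from timestep t
# directly to an earlier timestep t_prev. Assumptions: `eps_model(x, t)` is a
# trained noise predictor and `alpha_bars` holds the cumulative products of (1 - beta).
import numpy as np

def ddim_step(x_t, t, t_prev, eps_model, alpha_bars):
    """Deterministic DDIM update x_t -> x_{t_prev} (eta = 0)."""
    a_t, a_prev = alpha_bars[t], alpha_bars[t_prev]
    eps = eps_model(x_t, t)                                      # predicted noise
    x0_pred = (x_t - np.sqrt(1.0 - a_t) * eps) / np.sqrt(a_t)    # predicted clean sample
    return np.sqrt(a_prev) * x0_pred + np.sqrt(1.0 - a_prev) * eps
```

Because this update injects no fresh noise, sampling can run on a short subsequence of timesteps (e.g. 50 out of 1000), which is where the reported wall-clock speedups of this family of samplers come from.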