Accelerating Guided Diffusion Sampling with Splitting Numerical Methods
- URL: http://arxiv.org/abs/2301.11558v1
- Date: Fri, 27 Jan 2023 06:48:29 GMT
- Title: Accelerating Guided Diffusion Sampling with Splitting Numerical Methods
- Authors: Suttisak Wizadwongsa, Supasorn Suwajanakorn
- Abstract summary: Recent techniques can accelerate unguided sampling by applying high-order numerical methods to the sampling process.
This paper explores the culprit of this problem and provides a solution based on operator splitting methods.
Our proposed method can re-utilize the high-order methods for guided sampling and can generate images with the same quality as a 250-step DDIM baseline.
- Score: 8.689906452450938
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Guided diffusion is a technique for conditioning the output of a diffusion
model at sampling time without retraining the network for each specific task.
One drawback of diffusion models, however, is their slow sampling process.
Recent techniques can accelerate unguided sampling by applying high-order
numerical methods to the sampling process when viewed as differential
equations. In contrast, we discover that the same techniques do not work
for guided sampling, whose acceleration has so far been little explored. This
paper explores the culprit of this problem and provides a solution based on
operator splitting methods, motivated by our key finding that classical
high-order numerical methods are unsuitable for the conditional function. Our
proposed method can re-utilize the high-order methods for guided sampling and
can generate images with the same quality as a 250-step DDIM baseline using
32-58% less sampling time on ImageNet256. We also demonstrate usage on a wide
variety of conditional generation tasks, such as text-to-image generation,
colorization, inpainting, and super-resolution.
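The key mechanism described in the abstract, integrating the well-behaved diffusion term with a high-order method while keeping the guidance term at first order, can be illustrated with operator splitting on a toy ODE. The following is a minimal Lie-Trotter splitting sketch, not the paper's implementation: `diffusion_drift` and `guidance_grad` are hypothetical stand-ins for the learned denoiser drift and the conditional gradient.

```python
import numpy as np

# Hypothetical stand-ins: in guided diffusion, diffusion_drift would come from
# the learned denoiser and guidance_grad from the gradient of the conditional
# function (e.g., a classifier).
def diffusion_drift(x, t):
    return -x / (1.0 + t)          # smooth term: safe for high-order solvers

def guidance_grad(x, t):
    return np.sign(1.0 - x)        # non-smooth term: kept at first order

def heun_step(f, x, t, dt):
    """One second-order (Heun) step, standing in for a high-order method."""
    k1 = f(x, t)
    k2 = f(x + dt * k1, t + dt)
    return x + 0.5 * dt * (k1 + k2)

def lie_trotter_guided_step(x, t, dt, scale=1.0):
    """Split dx/dt = diffusion + scale * guidance into two sub-steps: a
    high-order step on the diffusion term, then a plain Euler step on the
    guidance term, which is ill-suited to high-order methods."""
    x = heun_step(diffusion_drift, x, t, dt)      # sub-problem 1
    x = x + dt * scale * guidance_grad(x, t)      # sub-problem 2 (Euler)
    return x

x = np.array([3.0])
t, dt = 0.0, 0.05
for _ in range(40):
    x = lie_trotter_guided_step(x, t, dt)
    t += dt
print(x)  # trajectory driven toward the guided target region around 1.0
```

Strang splitting (a half guidance step, a full diffusion step, then another half guidance step) is the natural second-order variant of the same idea.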
Related papers
- Fast constrained sampling in pre-trained diffusion models [77.21486516041391]
Diffusion models have dominated the field of large generative image models.
We propose an algorithm for fast constrained sampling in large pre-trained diffusion models.
arXiv Detail & Related papers (2024-10-24T14:52:38Z)
- A Simple Early Exiting Framework for Accelerated Sampling in Diffusion Models [14.859580045688487]
A practical bottleneck of diffusion models is their sampling speed.
We propose a novel framework capable of adaptively allocating compute required for the score estimation.
We show that our method could significantly improve the sampling throughput of the diffusion models without compromising image quality.
arXiv Detail & Related papers (2024-08-12T05:33:45Z)
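The entry above describes adaptively allocating compute for score estimation. A generic early-exiting pattern, not the paper's specific architecture (whose exit criterion is presumably learned), looks like the hypothetical sketch below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: six residual blocks of a score network, plus a cheap
# gate that measures how much each block still changes the running estimate.
blocks = [(lambda h, W=rng.normal(size=(8, 8)) / 8.0: np.tanh(h @ W))
          for _ in range(6)]

def score_with_early_exit(h, tol=0.3):
    """Apply blocks sequentially, exiting once a block's contribution is small
    relative to the current estimate; the remaining blocks are skipped."""
    used = 0
    for block in blocks:
        delta = block(h)
        h = h + delta
        used += 1
        if np.linalg.norm(delta) < tol * np.linalg.norm(h):  # exit gate
            break
    return h, used

score, n_used = score_with_early_exit(rng.normal(size=8))
print(f"evaluated {n_used}/{len(blocks)} blocks")
```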
- Fast Samplers for Inverse Problems in Iterative Refinement Models [19.099632445326826]
We propose a plug-and-play framework for constructing efficient samplers for inverse problems.
Our method can generate high-quality samples in as few as 5 conditional sampling steps and outperforms competing baselines requiring 20-1000 steps.
arXiv Detail & Related papers (2024-05-27T21:50:16Z)
- Accelerating Parallel Sampling of Diffusion Models [25.347710690711562]
We propose a novel approach that accelerates the sampling of diffusion models by parallelizing the autoregressive process.
Applying these techniques, we introduce ParaTAA, a universal and training-free parallel sampling algorithm.
Our experiments demonstrate that ParaTAA can reduce the inference steps required by common sequential sampling algorithms by a factor of 4 to 14.
arXiv Detail & Related papers (2024-02-15T14:27:58Z)
- Parallel Sampling of Diffusion Models [76.3124029406809]
Diffusion models are powerful generative models but suffer from slow sampling.
We present ParaDiGMS, a novel method to accelerate the sampling of pretrained diffusion models by denoising multiple steps in parallel.
arXiv Detail & Related papers (2023-05-25T17:59:42Z)
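Denoising multiple steps in parallel can be understood as Picard iteration on the whole sampling trajectory: every step is refined from the previous iterate, so all model evaluations within one iteration form a single parallel batch. Below is a toy NumPy sketch of that fixed-point view; `drift` is a hypothetical stand-in for the model's probability-flow drift, and details such as ParaDiGMS's sliding window are omitted.

```python
import numpy as np

def drift(x, t):
    # Hypothetical stand-in for the probability-flow drift of a diffusion model.
    return -0.5 * x * (1.0 + t)

# Picard iteration: x(t_i) = x(0) + sum_{j < i} drift(x(t_j), t_j) * dt.
# Every point of the trajectory is updated from the PREVIOUS iterate, so all
# drift evaluations in one iteration can run as one parallel batch.
T, N = 1.0, 64
ts = np.linspace(0.0, T, N + 1)
dt = T / N
x0 = 2.0

traj = np.full(N + 1, x0)                 # initial guess: constant trajectory
for k in range(20):                       # sequential iterations << N steps
    f = drift(traj, ts)                   # batched (parallelizable) evaluation
    new = x0 + np.concatenate([[0.0], np.cumsum(f[:-1]) * dt])
    if np.max(np.abs(new - traj)) < 1e-5: # stop once the fixed point is reached
        break
    traj = new

print(f"converged after {k + 1} iterations for {N} steps; x(T) = {traj[-1]:.4f}")
```

The fixed point of this iteration is exactly the sequential Euler trajectory, but it is typically reached in far fewer sequential rounds than there are steps.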
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
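A reflected differential equation can be simulated by taking an ordinary step and folding the result back into the data support. The following is a generic Euler-Maruyama sketch on an assumed toy domain [0, 1]^d, not the paper's learned reverse process.

```python
import numpy as np

def reflect_into_unit_interval(x):
    """Fold coordinates back into [0, 1] by mirror reflection at the walls."""
    x = np.mod(x, 2.0)
    return np.where(x > 1.0, 2.0 - x, x)

def reflected_em_step(x, drift, sigma, dt, rng):
    """One Euler-Maruyama step of dX = drift dt + sigma dW, then reflection."""
    step = drift(x) * dt + sigma * np.sqrt(dt) * rng.normal(size=x.shape)
    return reflect_into_unit_interval(x + step)

rng = np.random.default_rng(0)
x = np.full(4, 0.5)                       # start in the middle of [0, 1]^4
drift = lambda x: 4.0 * (0.9 - x)         # hypothetical drift toward 0.9
for _ in range(1000):
    x = reflected_em_step(x, drift, sigma=0.5, dt=1e-2, rng=rng)
print(x.min(), x.max())                   # stays inside [0, 1]^4 by construction
```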
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Unlike other fast sampling methods, which are sequential in nature, ours is the first parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
- Pseudo Numerical Methods for Diffusion Models on Manifolds [77.40343577960712]
Denoising Diffusion Probabilistic Models (DDPMs) can generate high-quality samples in domains such as images and audio.
However, DDPMs require hundreds to thousands of iterations to produce final samples.
We propose pseudo numerical methods for diffusion models (PNDMs).
PNDMs can generate higher-quality synthetic images in only 50 steps, compared with 1,000-step DDIMs (a 20x speedup).
arXiv Detail & Related papers (2022-02-20T10:37:52Z)
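The pseudo numerical scheme above can be sketched concretely: PNDM's linear multistep variant combines the last four noise predictions with the classical fourth-order Adams-Bashforth weights, then feeds the combined prediction into a DDIM-style transfer step. The toy below uses a hypothetical oracle `eps_model`; the paper's exact transfer function differs in form.

```python
import numpy as np
from collections import deque

def ddim_transfer(x_t, eps, ab_t, ab_prev):
    """Deterministic DDIM update (eta = 0) used as the 'transfer part';
    ab_* are the cumulative alpha-bar values of the two timesteps."""
    pred_x0 = (x_t - np.sqrt(1.0 - ab_t) * eps) / np.sqrt(ab_t)
    return np.sqrt(ab_prev) * pred_x0 + np.sqrt(1.0 - ab_prev) * eps

def plms_step(x_t, eps_t, history, ab_t, ab_prev):
    """Pseudo linear multistep: swap eps_t for the classical 4th-order
    Adams-Bashforth combination of the last four noise predictions."""
    history.append(eps_t)
    if len(history) < 4:                    # warm-up steps use plain eps
        eps_bar = eps_t
    else:
        e1, e2, e3, e4 = history            # oldest ... newest
        eps_bar = (55 * e4 - 59 * e3 + 37 * e2 - 9 * e1) / 24
    return ddim_transfer(x_t, eps_bar, ab_t, ab_prev)

# Toy usage: a hypothetical 'oracle' noise predictor that always points at
# x0_true, so the 50-step trajectory should land near x0_true.
x0_true = np.array([1.0, -2.0, 0.5, 3.0])
eps_model = lambda x, ab: (x - np.sqrt(ab) * x0_true) / np.sqrt(1.0 - ab)

ab = np.linspace(0.01, 0.999, 51)           # alpha-bar: noisy (t=T) -> clean
x = np.random.default_rng(0).normal(size=4) # start from pure noise
history = deque(maxlen=4)
for i in range(50):
    x = plms_step(x, eps_model(x, ab[i]), history, ab[i], ab[i + 1])
print(x)                                    # approximately x0_true
```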
- Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), a class of iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs.
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
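For reference, the DDIM update underlying this speedup, in a short NumPy sketch: `eps_model` is a hypothetical stand-in for the trained noise predictor, and the stride over the training schedule is what buys the wall-clock gain.

```python
import numpy as np

def ddim_step(x_t, eps, ab_t, ab_prev, eta=0.0, rng=None):
    """General DDIM update. eta = 0 gives the deterministic sampler that
    enables large step skips; eta = 1 recovers DDPM-like stochasticity.
    ab_t / ab_prev are cumulative alpha-bar values; eps is the model output."""
    sigma = eta * np.sqrt((1 - ab_prev) / (1 - ab_t)) * np.sqrt(1 - ab_t / ab_prev)
    pred_x0 = (x_t - np.sqrt(1 - ab_t) * eps) / np.sqrt(ab_t)  # predicted clean x
    dir_xt = np.sqrt(1 - ab_prev - sigma**2) * eps             # direction to x_t
    noise = 0.0 if eta == 0.0 else sigma * rng.normal(size=x_t.shape)
    return np.sqrt(ab_prev) * pred_x0 + dir_xt + noise

# The speedup comes from striding the training schedule: ~50 of 1000 steps.
ab_full = np.linspace(0.999, 0.01, 1000)    # alpha-bar per step: clean -> noisy
abs_sub = ab_full[::-20]                    # every 20th step, noisy -> clean
x = np.random.default_rng(0).normal(size=4)
eps_model = lambda x: 0.05 * np.tanh(x)     # hypothetical noise predictor
for ab_t, ab_prev in zip(abs_sub[:-1], abs_sub[1:]):
    x = ddim_step(x, eps_model(x), ab_t, ab_prev)
print(x)                                    # ~50 network calls instead of 1000
```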