On Accelerating Diffusion-Based Sampling Process via Improved
Integration Approximation
- URL: http://arxiv.org/abs/2304.11328v4
- Date: Tue, 3 Oct 2023 10:35:05 GMT
- Title: On Accelerating Diffusion-Based Sampling Process via Improved
Integration Approximation
- Authors: Guoqiang Zhang, Niwa Kenta, W. Bastiaan Kleijn
- Abstract summary: A popular approach to sample from a diffusion-based generative model is to solve an ordinary differential equation (ODE).
We consider accelerating several popular ODE-based sampling processes by optimizing certain coefficients via improved integration approximation (IIA).
We show that considerably better FID scores can be achieved by using IIA-EDM, IIA-DDIM, and IIA-DPM-Solver than with the original counterparts.
- Score: 12.882586878998579
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A popular approach to sample a diffusion-based generative model is to solve
an ordinary differential equation (ODE). In existing samplers, the coefficients
of the ODE solvers are pre-determined by the ODE formulation, the reverse
discrete timesteps, and the employed ODE methods. In this paper, we consider
accelerating several popular ODE-based sampling processes (including EDM, DDIM,
and DPM-Solver) by optimizing certain coefficients via improved integration
approximation (IIA). We propose to minimize, for each time step, a mean squared
error (MSE) function with respect to the selected coefficients. The MSE is
constructed by applying the original ODE solver for a set of fine-grained
timesteps, which in principle provides a more accurate integration
approximation in predicting the next diffusion state. The proposed IIA
technique does not require any change of a pre-trained model, and only
introduces a very small computational overhead for solving a number of
quadratic optimization problems. Extensive experiments show that considerably
better FID scores can be achieved by using IIA-EDM, IIA-DDIM, and
IIA-DPM-Solver than the original counterparts when the neural function
evaluation (NFE) is small (i.e., less than 25).
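The abstract's core idea — fitting step coefficients per timestep by least squares against a fine-grained reference solver — can be illustrated with a minimal sketch. This is not the paper's implementation: it uses a toy 1-D ODE (dx/dt = -x) in place of a diffusion probability-flow ODE, a plain Euler base step with a single learnable coefficient `c`, and hypothetical helper names (`fine_grained_step`, `fit_iia_coefficients`). The quadratic MSE objective then has a closed-form solution.

```python
import numpy as np

# Toy ODE: dx/dt = f(x, t) = -x (stand-in for a diffusion probability-flow ODE).
def f(x, t):
    return -x

def fine_grained_step(x, t, dt, n_sub=10):
    """Reference prediction: many small Euler sub-steps approximate the
    true integral over [t, t + dt] much more accurately than one coarse step."""
    h = dt / n_sub
    for k in range(n_sub):
        x = x + h * f(x, t + k * h)
    return x

def fit_iia_coefficients(xs, t, dt):
    """Per-timestep quadratic (least-squares) fit of the step coefficient.

    Parameterize one coarse step as x_next = x + c * dt * f(x, t), and choose c
    to minimize the MSE against the fine-grained reference over a batch of states.
    Since the prediction is linear in c, the MSE is quadratic and the minimizer
    is closed-form: c = <basis, target> / <basis, basis>.
    """
    basis = dt * f(xs, t)                                          # shape (batch,)
    target = np.array([fine_grained_step(x, t, dt) for x in xs]) - xs
    c = float(basis @ target / (basis @ basis))
    return c

rng = np.random.default_rng(0)
xs = rng.normal(size=256)
c = fit_iia_coefficients(xs, t=0.0, dt=0.5)

# Baseline Euler uses c = 1; the fitted c corrects the integration error,
# so the IIA step lands closer to the fine-grained reference.
x0 = 1.0
euler = x0 + 1.0 * 0.5 * f(x0, 0.0)
iia = x0 + c * 0.5 * f(x0, 0.0)
ref = fine_grained_step(x0, 0.0, 0.5)
```

As in the paper, the pre-trained model (here, `f`) is untouched: only the solver coefficients change, at the cost of a few small quadratic fits.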
Related papers
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We tackle the task of sampling from a probability density as transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the respective partial differential equations (PDEs) solutions.
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Accelerating Diffusion Sampling with Optimized Time Steps [69.21208434350567]
Diffusion probabilistic models (DPMs) have shown remarkable performance in high-resolution image synthesis.
However, their sampling efficiency still leaves much to be desired due to the typically large number of sampling steps.
Recent advancements in high-order numerical ODE solvers for DPMs have enabled the generation of high-quality images with much fewer sampling steps.
arXiv Detail & Related papers (2024-02-27T10:13:30Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - Distilling ODE Solvers of Diffusion Models into Smaller Steps [32.49916706943228]
We introduce Distilled-ODE solvers, a straightforward distillation approach grounded in ODE solver formulations.
Our method seamlessly integrates the strengths of both learning-free and learning-based sampling.
Our method incurs negligible computational overhead compared to previous distillation techniques.
arXiv Detail & Related papers (2023-09-28T13:12:18Z) - Improved Order Analysis and Design of Exponential Integrator for
Diffusion Models Sampling [36.50606582918392]
Exponential solvers have gained prominence by demonstrating state-of-the-art performance.
Existing high-order EI-based sampling algorithms rely on degenerate EI solvers.
We propose refined EI solvers that fulfill all the order conditions.
arXiv Detail & Related papers (2023-08-04T06:30:40Z) - AdjointDPM: Adjoint Sensitivity Method for Gradient Backpropagation of Diffusion Probabilistic Models [103.41269503488546]
Existing customization methods require access to multiple reference examples to align pre-trained diffusion probabilistic models with user-provided concepts.
This paper aims to address the challenge of DPM customization when the only available supervision is a differentiable metric defined on the generated contents.
We propose a novel method AdjointDPM, which first generates new samples from diffusion models by solving the corresponding probability-flow ODEs.
It then uses the adjoint sensitivity method to backpropagate the gradients of the loss to the models' parameters.
arXiv Detail & Related papers (2023-07-20T09:06:21Z) - The probability flow ODE is provably fast [43.94655061860487]
We provide the first convergence guarantees for the probability flow ODE implementation (together with a corrector step) of score-based generative modeling.
Our analysis is carried out in the wake of recent results obtaining such guarantees for the SDE-based implementation.
arXiv Detail & Related papers (2023-05-19T16:33:05Z) - Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z) - DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling
in Around 10 Steps [45.612477740555406]
Diffusion probabilistic models (DPMs) are emerging powerful generative models.
DPM-Solver is suitable for both discrete-time and continuous-time DPMs without any further training.
We achieve 4.70 FID in 10 function evaluations and 2.87 FID in 20 function evaluations on the CIFAR10 dataset.
arXiv Detail & Related papers (2022-06-02T08:43:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.