Analytic-DPM: an Analytic Estimate of the Optimal Reverse Variance in
Diffusion Probabilistic Models
- URL: http://arxiv.org/abs/2201.06503v1
- Date: Mon, 17 Jan 2022 16:28:12 GMT
- Title: Analytic-DPM: an Analytic Estimate of the Optimal Reverse Variance in
Diffusion Probabilistic Models
- Authors: Fan Bao, Chongxuan Li, Jun Zhu, Bo Zhang
- Abstract summary: Diffusion probabilistic models (DPMs) represent a class of powerful generative models.
We propose Analytic-DPM, a training-free inference framework that estimates the analytic forms of the variance and KL divergence.
We derive both lower and upper bounds of the optimal variance and clip the estimate for a better result.
- Score: 39.11468968340014
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion probabilistic models (DPMs) represent a class of powerful
generative models. Despite their success, the inference of DPMs is expensive
since it generally needs to iterate over thousands of timesteps. A key problem
in the inference is to estimate the variance in each timestep of the reverse
process. In this work, we present a surprising result that both the optimal
reverse variance and the corresponding optimal KL divergence of a DPM have
analytic forms w.r.t. its score function. Building upon it, we propose
Analytic-DPM, a training-free inference framework that estimates the analytic
forms of the variance and KL divergence using the Monte Carlo method and a
pretrained score-based model. Further, to correct the potential bias caused by
the score-based model, we derive both lower and upper bounds of the optimal
variance and clip the estimate for a better result. Empirically, our
Analytic-DPM improves the log-likelihood of various DPMs, produces
high-quality samples, and meanwhile enjoys a 20x to 80x speedup.
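To make the construction concrete, below is a minimal sketch of the per-timestep variance estimate for a DDPM, written in my own notation rather than taken from the authors' code. `eps_model` (a pretrained noise-prediction network) and `x0_samples` (a batch of data samples) are assumed stand-ins, and clipping the Monte Carlo term to [0, 1] is a crude substitute for the paper's derived lower and upper bounds.

```python
import numpy as np

def analytic_dpm_variance(eps_model, x0_samples, betas, n):
    """Monte Carlo estimate of the optimal reverse variance at step n (1-indexed)."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)            # \bar{alpha}_i for i = 1..N
    beta_bar = 1.0 - alpha_bar                # \bar{beta}_i = 1 - \bar{alpha}_i
    abar_prev = alpha_bar[n - 2] if n > 1 else 1.0
    bbar_prev = beta_bar[n - 2] if n > 1 else 0.0
    # DDPM posterior variance lambda_n^2 and the x_0-coefficient of the posterior mean
    lam2 = betas[n - 1] * bbar_prev / beta_bar[n - 1]
    coef_x0 = np.sqrt(abar_prev) * betas[n - 1] / beta_bar[n - 1]
    gamma2 = coef_x0 ** 2 * beta_bar[n - 1] / alpha_bar[n - 1]
    # Monte Carlo term: E_{q_n} ||eps_model(x_n, n)||^2 / d, sampled via the forward process
    eps = np.random.randn(*x0_samples.shape)
    x_n = np.sqrt(alpha_bar[n - 1]) * x0_samples + np.sqrt(beta_bar[n - 1]) * eps
    mc_term = np.clip(np.mean(eps_model(x_n, n) ** 2), 0.0, 1.0)  # crude bound clipping
    return lam2 + gamma2 * (1.0 - mc_term)
```

As a sanity check, for a dataset collapsed to a single point a perfect noise predictor gives a Monte Carlo term of 1, so the estimate reduces to lambda_n^2, the standard DDPM posterior variance. The estimate only needs to be computed once per trajectory and then reused across sampling runs, which is what keeps the framework training-free.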
Related papers
- Contractive Diffusion Probabilistic Models [5.217870815854702]
Diffusion probabilistic models (DPMs) have emerged as a promising technique in generative modeling.
We propose a new criterion for the design of DPMs, the contraction property of backward sampling, leading to a novel class of contractive DPMs (CDPMs).
We show that CDPMs can leverage the weights of pretrained DPMs via a simple transformation, without retraining.
arXiv Detail & Related papers (2024-01-23T21:51:51Z)
- DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport [26.713392774427653]
DPM-OT is a unified learning framework for fast DPMs with a direct expressway represented by an optimal transport (OT) map.
It can generate high-quality samples within around 10 function evaluations.
Experiments validate the effectiveness and advantages of DPM-OT in terms of speed and quality.
arXiv Detail & Related papers (2023-07-21T02:28:54Z)
- Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative Models [49.81937966106691]
We develop a suite of non-asymptotic theory towards understanding the data generation process of diffusion models.
In contrast to prior works, our theory is developed based on an elementary yet versatile non-asymptotic approach.
arXiv Detail & Related papers (2023-06-15T16:30:08Z)
- On Calibrating Diffusion Probabilistic Models [78.75538484265292]
Diffusion probabilistic models (DPMs) have achieved promising results in diverse generative tasks.
We propose a simple way to calibrate an arbitrary pretrained DPM, with which the score matching loss can be reduced and the lower bounds of model likelihood can be increased (a sketch of the idea follows this entry).
Our calibration method is performed only once and the resulting models can be used repeatedly for sampling.
arXiv Detail & Related papers (2023-02-21T14:14:40Z)
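If I read the calibration idea right, it exploits the fact that the true score has zero mean under each noisy marginal, so subtracting the model's empirical mean output at every timestep cannot increase the score matching loss. A minimal sketch under that reading; `eps_model` and `x0_samples` are assumed stand-ins, not the paper's code.

```python
import numpy as np

def calibrate(eps_model, x0_samples, alpha_bar, num_steps):
    """Precompute per-timestep output means once; return a bias-corrected model."""
    biases = []
    for n in range(num_steps):
        eps = np.random.randn(*x0_samples.shape)
        # Diffuse data to timestep n and record the model's mean prediction there
        x_n = np.sqrt(alpha_bar[n]) * x0_samples + np.sqrt(1.0 - alpha_bar[n]) * eps
        biases.append(eps_model(x_n, n).mean(axis=0))
    def calibrated(x_n, n):
        return eps_model(x_n, n) - biases[n]  # zero-mean correction
    return calibrated
```

The one-time precomputation matches the blurb above: calibration runs once, and the corrected model is reused for all later sampling.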
- Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models [37.18522296366212]
Diffusion probabilistic models (DPMs) are a class of powerful deep generative models (DGMs).
Despite their success, the iterative generation process over the full timesteps is much less efficient than other DGMs such as GANs.
We consider diagonal and full covariances to improve the expressive power of DPMs; see the moment-matching identity sketched after this entry.
arXiv Detail & Related papers (2022-06-15T05:42:48Z)
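The role of the imperfect mean can be seen from a standard moment-matching decomposition (my notation, not necessarily the paper's): the optimal isotropic variance under a fixed reverse mean mu_n splits into the optimal-mean variance plus a correction for the mean's error.

```latex
\sigma_n^{*2}
= \frac{1}{d}\,\mathbb{E}\,\big\|x_{n-1}-\mu_n(x_n)\big\|^2
= \underbrace{\frac{1}{d}\,\mathbb{E}\,\operatorname{tr}\operatorname{Cov}[x_{n-1}\mid x_n]}_{\text{optimal-mean part}}
+ \underbrace{\frac{1}{d}\,\mathbb{E}\,\big\|\mathbb{E}[x_{n-1}\mid x_n]-\mu_n(x_n)\big\|^2}_{\text{imperfect-mean correction}}
```

Here the expectations are over x_n from the noisy marginal and x_{n-1} from the true reverse conditional; with a perfect mean the correction vanishes and the Analytic-DPM formula is recovered.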
- How Much is Enough? A Study on Diffusion Times in Score-based Generative Models [76.76860707897413]
Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution.
We show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process.
arXiv Detail & Related papers (2022-06-10T15:09:46Z)
- Pseudo Numerical Methods for Diffusion Models on Manifolds [77.40343577960712]
Denoising Diffusion Probabilistic Models (DDPMs) can generate high-quality samples such as images and audio.
DDPMs require hundreds to thousands of iterations to produce final samples.
We propose pseudo numerical methods for diffusion models (PNDMs); a rough sketch of the multistep update follows this entry.
PNDMs can generate higher-quality synthetic images with only 50 steps, compared with 1000-step DDIMs (a 20x speedup).
arXiv Detail & Related papers (2022-02-20T10:37:52Z)
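For a sense of the mechanics, here is a rough sketch of the pseudo linear multistep update: a classical fourth-order Adams-Bashforth combination of recent noise predictions fed through a deterministic DDIM-style update, used here as a simplified stand-in for the paper's transfer function; `eps_model` is an assumed noise-prediction network.

```python
import numpy as np

def ddim_transfer(x_t, eps, abar_t, abar_s):
    """Deterministic DDIM-style update from time t to s (stand-in for PNDM's phi)."""
    x0_pred = (x_t - np.sqrt(1.0 - abar_t) * eps) / np.sqrt(abar_t)
    return np.sqrt(abar_s) * x0_pred + np.sqrt(1.0 - abar_s) * eps

def plms_step(eps_model, x_t, t, s, abar, eps_history):
    """One pseudo linear multistep update; eps_history holds the three most
    recent noise predictions, newest last."""
    eps_t = eps_model(x_t, t)
    e1, e2, e3 = eps_history[-1], eps_history[-2], eps_history[-3]
    eps_prime = (55.0 * eps_t - 59.0 * e1 + 37.0 * e2 - 9.0 * e3) / 24.0
    eps_history.append(eps_t)
    return ddim_transfer(x_t, eps_prime, abar[t], abar[s])
```

Each update costs a single network call but reuses stored predictions for higher-order accuracy, which is where the step-count reduction comes from.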
- Decision-Making with Auto-Encoding Variational Bayes [71.44735417472043]
We show that a posterior approximation distinct from the variational distribution should be used for making decisions.
Motivated by these theoretical results, we propose learning several approximate proposals for the best model.
In addition to toy examples, we present a full-fledged case study of single-cell RNA sequencing.
arXiv Detail & Related papers (2020-02-17T19:23:36Z)