Contractive Diffusion Probabilistic Models
- URL: http://arxiv.org/abs/2401.13115v3
- Date: Sat, 12 Oct 2024 03:31:16 GMT
- Title: Contractive Diffusion Probabilistic Models
- Authors: Wenpin Tang, Hanyang Zhao
- Abstract summary: Diffusion probabilistic models (DPMs) have emerged as a promising technique in generative modeling.
We propose a new criterion -- the contraction property of backward sampling in the design of DPMs -- leading to a novel class of contractive DPMs (CDPMs).
We show that CDPM can leverage the weights of pretrained DPMs by a simple transformation, and does not need retraining.
- Score: 5.217870815854702
- Abstract: Diffusion probabilistic models (DPMs) have emerged as a promising technique in generative modeling. The success of DPMs relies on two ingredients: time reversal of diffusion processes and score matching. In view of possibly unguaranteed score matching, we propose a new criterion -- the contraction property of backward sampling in the design of DPMs, leading to a novel class of contractive DPMs (CDPMs). Our key insight is that the contraction property can provably narrow both score matching errors and discretization errors, so our proposed CDPMs are robust to both sources of error. For practical use, we show that CDPM can leverage the weights of pretrained DPMs by a simple transformation, and does not need retraining. We corroborate our approach by experiments on synthetic 1-dim examples, Swiss Roll, MNIST, CIFAR-10 32$\times$32, and AFHQ 64$\times$64 datasets. Notably, CDPM steadily improves the performance of baseline score-based diffusion models.
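The contraction argument can be checked numerically: if each backward sampling step applies a map with Lipschitz constant $L < 1$, per-step errors of size $\epsilon$ (from score matching or discretization) accumulate to at most $\epsilon/(1-L)$, whereas with $L > 1$ they grow geometrically. A minimal self-contained toy illustrating this error propagation (scalar linear maps, not the paper's actual CDPM construction or its weight-transfer transformation):

```python
import numpy as np

def propagate(L, eps, n_steps=50, seed=0):
    """Iterate x <- L * x + e_k with per-step errors of magnitude eps.

    |x_n| <= eps * (1 + L + ... + L^(n-1)), which stays below eps / (1 - L)
    if and only if the map is a contraction (L < 1).
    """
    rng = np.random.default_rng(seed)
    x = 0.0
    for _ in range(n_steps):
        x = L * x + eps * rng.choice([-1.0, 1.0])  # worst-case-sized error
    return abs(x)

eps = 1e-2
print(propagate(L=0.9, eps=eps))  # bounded: stays below eps / (1 - 0.9) = 0.1
print(propagate(L=1.1, eps=eps))  # unbounded: grows roughly like eps * 1.1**50
```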
Related papers
- DC-Solver: Improving Predictor-Corrector Diffusion Sampler via Dynamic Compensation [68.55191764622525]
Diffusion probabilistic models (DPMs) have shown remarkable performance in visual synthesis but are computationally expensive due to the need for multiple evaluations during sampling.
Recent predictor-corrector diffusion samplers have significantly reduced the required number of evaluations, but inherently suffer from a misalignment issue.
We introduce a new fast DPM sampler called DC-Solver, which leverages dynamic compensation to mitigate the misalignment.
arXiv Detail & Related papers (2024-09-05T17:59:46Z)
- Boosting Diffusion Models with an Adaptive Momentum Sampler [21.88226514633627]
We present a novel reverse sampler for DPMs inspired by the widely-used Adam optimizer.
Our proposed sampler can be readily applied to a pre-trained diffusion model.
By implicitly reusing update directions from early steps, our proposed sampler achieves a better balance between high-level semantics and low-level details (a sketch follows this entry).
arXiv Detail & Related papers (2023-08-23T06:22:02Z)
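A minimal sketch of the momentum idea in the entry above, assuming an `eps_model(x, t)` noise predictor and a cumulative-alpha schedule `alpha_bar` (both hypothetical names): the predicted noise is smoothed with an exponential moving average before each DDIM-style update, which is one way to "implicitly reuse update directions from early steps". The paper's actual Adam-inspired sampler (with second-moment scaling) differs in detail:

```python
import torch

@torch.no_grad()
def momentum_ddim_sample(eps_model, x, timesteps, alpha_bar, beta1=0.9):
    """DDIM-style sampling with an EMA ("momentum") over the predicted noise.

    eps_model(x, t) -> predicted noise; alpha_bar: 1-D tensor of cumulative
    alphas; timesteps: decreasing list of integer time indices.
    Hypothetical sketch, not the paper's exact Adam-inspired sampler.
    """
    m = torch.zeros_like(x)                                # first-moment buffer
    for k, (t, t_prev) in enumerate(zip(timesteps[:-1], timesteps[1:]), start=1):
        eps = eps_model(x, t)
        m = beta1 * m + (1.0 - beta1) * eps                # reuse earlier directions
        m_hat = m / (1.0 - beta1 ** k)                     # Adam-style bias correction
        a_t, a_prev = alpha_bar[t], alpha_bar[t_prev]
        x0 = (x - (1 - a_t).sqrt() * m_hat) / a_t.sqrt()   # predict x_0 from smoothed eps
        x = a_prev.sqrt() * x0 + (1 - a_prev).sqrt() * m_hat  # deterministic step
    return x
```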
- DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport [26.713392774427653]
DPM-OT is a unified learning framework for fast DPMs with a direct expressway represented by the OT map.
It can generate high-quality samples within around 10 function evaluations.
Experiments validate the effectiveness and advantages of DPM-OT in terms of speed and quality.
arXiv Detail & Related papers (2023-07-21T02:28:54Z)
- AdjointDPM: Adjoint Sensitivity Method for Gradient Backpropagation of Diffusion Probabilistic Models [103.41269503488546]
Existing customization methods require access to multiple reference examples to align pre-trained diffusion probabilistic models with user-provided concepts.
This paper aims to address the challenge of DPM customization when the only available supervision is a differentiable metric defined on the generated contents.
We propose a novel method AdjointDPM, which first generates new samples from diffusion models by solving the corresponding probability-flow ODEs.
It then uses the adjoint sensitivity method to backpropagate the gradients of the loss to the model's parameters (a sketch follows this entry).
arXiv Detail & Related papers (2023-07-20T09:06:21Z)
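A sketch of the adjoint idea in the entry above, under assumptions: it uses torchdiffeq's `odeint_adjoint` (an off-the-shelf adjoint ODE solver, not necessarily what AdjointDPM itself ships), a toy stand-in score network, and a stand-in metric. Gradients of the loss reach the network parameters by solving an adjoint ODE backward, without storing the forward trajectory:

```python
import torch
from torchdiffeq import odeint_adjoint as odeint  # pip install torchdiffeq

class TinyScore(torch.nn.Module):
    """Stand-in for a pretrained score network (real use: a U-Net)."""
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 3, 3, padding=1)
    def forward(self, x, t):
        return self.conv(x)

class ProbabilityFlowODE(torch.nn.Module):
    """dx/dt = f(x, t) - 0.5 * g(t)^2 * score(x, t) for a VP-SDE."""
    def __init__(self, score_model, beta_min=0.1, beta_max=20.0):
        super().__init__()
        self.score_model = score_model
        self.beta_min, self.beta_max = beta_min, beta_max
    def forward(self, t, x):
        beta = self.beta_min + t * (self.beta_max - self.beta_min)  # linear beta(t)
        return -0.5 * beta * x - 0.5 * beta * self.score_model(x, t)

ode = ProbabilityFlowODE(TinyScore())
x1 = torch.randn(4, 3, 8, 8, requires_grad=True)  # latent noise at t = 1
ts = torch.tensor([1.0, 1e-3])                    # integrate from noise toward data
x0 = odeint(ode, x1, ts)[-1]                      # adjoint: constant memory in steps
loss = x0.pow(2).mean()                           # stand-in differentiable metric
loss.backward()                                   # gradients reach the score network
```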
- Alleviating Exposure Bias in Diffusion Models through Sampling with Shifted Time Steps [23.144083737873263]
Diffusion Probabilistic Models (DPMs) have shown remarkable efficacy in the synthesis of high-quality images.
Previous work has attempted to mitigate exposure bias by perturbing inputs during training.
We propose a novel sampling method that shifts the time steps used during sampling, without retraining the model (a sketch follows this entry).
arXiv Detail & Related papers (2023-05-24T21:39:27Z)
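One reading of "sampling with shifted time steps", sketched below as a heuristic: because exposure bias makes the running sample drift off the training distribution, query the network at a nearby time whose forward-marginal variance best matches the sample's actual variance. The window search and variance matching are an assumption about the method, not the paper's exact rule; `alpha_bar` and `data_var` are assumed inputs:

```python
import torch

def shifted_timestep(x, t, alpha_bar, data_var, window=5):
    """Pick a time s near t whose forward-marginal variance matches the sample.

    Under a VP forward process, Var(x_s) ~ alpha_bar[s] * data_var + (1 - alpha_bar[s]).
    Heuristic sketch; the paper's exact selection rule may differ.
    """
    var = x.var().item()
    lo, hi = max(t - window, 0), min(t + window, len(alpha_bar) - 1)

    def target(s):  # marginal variance of x_s for data of variance data_var
        a = alpha_bar[s].item()
        return a * data_var + (1.0 - a)

    return min(range(lo, hi + 1), key=lambda s: abs(target(s) - var))

# usage: query the noise network with the shifted index instead of t, e.g.
# eps = eps_model(x, shifted_timestep(x, t, alpha_bar, data_var=0.5))
```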
- On Calibrating Diffusion Probabilistic Models [78.75538484265292]
Diffusion probabilistic models (DPMs) have achieved promising results in diverse generative tasks.
We propose a simple way of calibrating an arbitrary pretrained DPM, with which the score matching loss can be reduced and the lower bounds of model likelihood can be increased.
Our calibration method is performed only once and the resulting models can be used repeatedly for sampling (a sketch follows this entry).
arXiv Detail & Related papers (2023-02-21T14:14:40Z)
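A minimal sketch of the calibration idea in the entry above, assuming the correction amounts to subtracting the dataset average of the model's noise prediction at each time step (the true score has zero mean under the forward marginals, so any nonzero average is removable error). `eps_model`, `data_loader`, and `alpha_bar` are assumed names; the paper's exact procedure may differ:

```python
import torch

@torch.no_grad()
def calibration_offsets(eps_model, data_loader, timesteps, alpha_bar):
    """Estimate E[eps_model(x_t, t)] over the data for each t (done once).

    The true score has zero mean under q_t, so a nonzero mean in the model
    output is pure error; subtracting it cannot increase the matching loss.
    """
    offsets = {}
    for t in timesteps:
        total, count = None, 0
        for x0 in data_loader:
            a = alpha_bar[t]
            x_t = a.sqrt() * x0 + (1 - a).sqrt() * torch.randn_like(x0)  # forward-diffuse
            e = eps_model(x_t, t).sum(dim=0)
            total = e if total is None else total + e
            count += x0.shape[0]
        offsets[t] = total / count
    return offsets

# at sampling time: use eps_model(x, t) - offsets[t] in place of eps_model(x, t)
```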
- UniPC: A Unified Predictor-Corrector Framework for Fast Sampling of Diffusion Models [92.43617471204963]
Diffusion probabilistic models (DPMs) have demonstrated a very promising ability in high-resolution image synthesis.
We develop a unified corrector (UniC) that can be applied after any existing DPM sampler to increase the order of accuracy.
We propose a unified predictor-corrector framework called UniPC for the fast sampling of DPMs (a simplified corrector sketch follows this entry).
arXiv Detail & Related papers (2023-02-09T18:59:48Z)
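The corrector principle can be illustrated with a generic Heun-style (trapezoidal) refinement: take a predictor step, evaluate the network at the predicted point, and redo the step with the averaged direction, raising the local order of accuracy by one. This is a simplified stand-in for UniC, not its actual update rule, and unlike UniC it spends an extra model evaluation:

```python
import torch

@torch.no_grad()
def heun_corrected_step(eps_model, x, t, t_prev, alpha_bar):
    """One DDIM predictor step refined by a Heun-style (trapezoidal) corrector."""
    def ddim_step(x_t, eps, a_t, a_prev):
        x0 = (x_t - (1 - a_t).sqrt() * eps) / a_t.sqrt()
        return a_prev.sqrt() * x0 + (1 - a_prev).sqrt() * eps

    a_t, a_prev = alpha_bar[t], alpha_bar[t_prev]
    eps_t = eps_model(x, t)
    x_pred = ddim_step(x, eps_t, a_t, a_prev)    # predictor (first order)
    eps_prev = eps_model(x_pred, t_prev)         # evaluate at the predicted point
    eps_avg = 0.5 * (eps_t + eps_prev)           # trapezoidal average of directions
    return ddim_step(x, eps_avg, a_t, a_prev)    # corrected step (second order)
```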
- Robust Face Anti-Spoofing with Dual Probabilistic Modeling [49.14353429234298]
We propose a unified framework called Dual Probabilistic Modeling (DPM), with two dedicated modules: DPM-LQ (Label Quality aware learning) and DPM-DQ (Data Quality aware learning).
DPM-LQ is able to produce robust feature representations without overfitting to the distribution of noisy semantic labels.
DPM-DQ can eliminate data noise from 'False Reject' and 'False Accept' during inference by correcting the prediction confidence of noisy data based on its quality distribution.
arXiv Detail & Related papers (2022-04-27T03:44:18Z)
- Analytic-DPM: an Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models [39.11468968340014]
Diffusion probabilistic models (DPMs) represent a class of powerful generative models.
We propose Analytic-DPM, a training-free inference framework that estimates the analytic forms of the variance and KL divergence.
We derive both lower and upper bounds of the optimal variance and clip the estimate for a better result (a sketch follows this entry).
arXiv Detail & Related papers (2022-01-17T16:28:12Z)
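A sketch of the training-free estimation step, under the assumption that the optimal reverse variance reduces to a closed form in $\Gamma_t = E\|\epsilon_\theta(x_t, t)\|^2 / d$, estimated once by Monte Carlo over forward-diffused data and then clipped to the derived bounds. `variance_from_gamma`, `lower_bound`, and `upper_bound` are hypothetical names standing in for the paper's formulas:

```python
import torch

@torch.no_grad()
def estimate_gamma(eps_model, x0, t, alpha_bar):
    """Monte Carlo estimate of Gamma_t = E ||eps_model(x_t, t)||^2 / d, x_t ~ q_t.

    x0: a batch of clean data; run once per t, with no retraining.
    """
    a = alpha_bar[t]
    x_t = a.sqrt() * x0 + (1 - a).sqrt() * torch.randn_like(x0)  # forward-diffuse
    d = x0[0].numel()
    return eps_model(x_t, t).pow(2).flatten(1).sum(dim=1).mean().item() / d

# gamma feeds the paper's closed-form variance, and the estimate is then clipped:
# sigma2 = variance_from_gamma(gamma, t)                     # hypothetical name
# sigma2 = min(max(sigma2, lower_bound(t)), upper_bound(t))  # hypothetical bounds
```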
- Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time compared to DDPMs (a sketch of the update follows this entry).
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
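The DDIM update behind the speedup is short enough to state directly: predict $x_0$ from the current sample and the noise estimate, then jump to any earlier time, deterministically when $\eta = 0$; taking large jumps is what yields $10\times$ to $50\times$ fewer steps. A sketch with the standard parametrization ($\bar\alpha_t$ is the cumulative noise schedule; `eps_model` is an assumed noise predictor):

```python
import torch

@torch.no_grad()
def ddim_step(eps_model, x, t, t_prev, alpha_bar, eta=0.0):
    """One DDIM step from time index t to t_prev (deterministic when eta = 0)."""
    a_t, a_prev = alpha_bar[t], alpha_bar[t_prev]
    eps = eps_model(x, t)
    x0 = (x - (1 - a_t).sqrt() * eps) / a_t.sqrt()   # predict x_0
    sigma = eta * ((1 - a_prev) / (1 - a_t)).sqrt() * (1 - a_t / a_prev).sqrt()
    noise = sigma * torch.randn_like(x) if eta > 0 else 0.0
    return a_prev.sqrt() * x0 + (1 - a_prev - sigma**2).sqrt() * eps + noise
```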
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.