Diffusion Models with Deterministic Normalizing Flow Priors
- URL: http://arxiv.org/abs/2309.01274v1
- Date: Sun, 3 Sep 2023 21:26:56 GMT
- Title: Diffusion Models with Deterministic Normalizing Flow Priors
- Authors: Mohsen Zand, Ali Etemad, Michael Greenspan
- Abstract summary: We propose DiNof ($\textbf{Di}$ffusion with $\textbf{No}$rmalizing $\textbf{f}$low priors), a technique that makes use of normalizing flows and diffusion models.
Experiments on standard image generation datasets demonstrate the advantage of the proposed method over existing approaches.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: For faster sampling and higher sample quality, we propose DiNof
($\textbf{Di}$ffusion with $\textbf{No}$rmalizing $\textbf{f}$low priors), a
technique that makes use of normalizing flows and diffusion models. We use
normalizing flows to parameterize the noisy data at any arbitrary step of the
diffusion process and utilize it as the prior in the reverse diffusion process.
More specifically, the forward noising process turns a data distribution into
partially noisy data, which are subsequently transformed into a Gaussian
distribution by a nonlinear process. The backward denoising procedure begins
with a prior created by sampling from the Gaussian distribution and applying
the invertible normalizing flow transformations deterministically. To generate
the data distribution, the prior then undergoes the remaining diffusion
stochastic denoising procedure. Through the reduction of the number of total
diffusion steps, we are able to speed up both the forward and backward
processes. More importantly, we improve the expressive power of diffusion
models by employing both deterministic and stochastic mappings. Experiments on
standard image generation datasets demonstrate the advantage of the proposed
method over existing approaches. On the unconditional CIFAR10 dataset, for
example, we achieve an FID of 2.01 and an Inception score of 9.96. Our method
also demonstrates competitive performance on the CelebA-HQ-256 dataset, where it
obtains an FID score of 7.11. Code is available at
https://github.com/MohsenZand/DiNof.
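The two-stage sampler described in the abstract can be sketched in a few lines. The sketch below uses toy stand-ins (a fixed affine map for the normalizing flow, placeholder denoising dynamics) for the learned components; all function names and dynamics are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def inverse_flow(z):
    # Deterministic invertible map: Gaussian sample -> "partially noisy" prior.
    # A real normalizing flow would be a stack of learned invertible layers.
    return 0.5 * z + 0.1

def denoise_step(x, t):
    # One stochastic reverse-diffusion step (placeholder dynamics).
    return 0.9 * x + 0.1 * rng.standard_normal(x.shape)

def sample_dinof(shape, t_mid=400):
    # 1) Sample the base Gaussian and map it through the inverse flow,
    #    giving a deterministic prior at intermediate step t_mid.
    z = rng.standard_normal(shape)
    x = inverse_flow(z)
    # 2) Run only the remaining t_mid stochastic denoising steps
    #    (instead of the full chain, e.g. 1000 steps).
    for t in range(t_mid, 0, -1):
        x = denoise_step(x, t)
    return x

sample = sample_dinof((4, 4))
print(sample.shape)  # (4, 4)
```

The speed-up claimed in the abstract comes from step 2: the stochastic chain starts at an intermediate step rather than from pure noise, with the deterministic flow covering the remainder.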
Related papers
- Distributional Diffusion Models with Scoring Rules [83.38210785728994]
Diffusion models generate high-quality synthetic data, but producing such outputs requires many discretization steps.
We propose to accomplish sample generation by learning the posterior distribution of clean data samples.
arXiv Detail & Related papers (2025-02-04T16:59:03Z)
- Arbitrary-steps Image Super-resolution via Diffusion Inversion [68.78628844966019]
This study presents a new image super-resolution (SR) technique based on diffusion inversion, aiming at harnessing the rich image priors encapsulated in large pre-trained diffusion models to improve SR performance.
We design a Partial noise Prediction strategy to construct an intermediate state of the diffusion model, which serves as the starting sampling point.
Once trained, this noise predictor can be used to initialize the sampling process partially along the diffusion trajectory, generating the desirable high-resolution result.
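The Partial noise Prediction idea above amounts to building an intermediate diffusion state from the low-resolution input. A minimal sketch, assuming the standard forward-diffusion reparameterization and a hypothetical `predict_noise` stand-in for the trained noise predictor:

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_noise(lr_image):
    # Hypothetical stand-in for the trained noise predictor; here it
    # simply returns a fixed pseudo-noise map of the same shape.
    return 0.1 * np.ones_like(lr_image)

def init_intermediate_state(lr_up, alpha_bar_t=0.7):
    # Build the intermediate state x_t from the (upsampled) low-resolution
    # input and predicted noise: x_t = sqrt(abar_t)*x + sqrt(1-abar_t)*eps.
    eps = predict_noise(lr_up)
    return np.sqrt(alpha_bar_t) * lr_up + np.sqrt(1.0 - alpha_bar_t) * eps

lr_up = rng.random((8, 8))  # upsampled low-res image (toy data)
x_t = init_intermediate_state(lr_up)
print(x_t.shape)  # (8, 8)
```

Sampling would then run only the reverse steps from `alpha_bar_t` down to 0, rather than the full trajectory.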
arXiv Detail & Related papers (2024-12-12T07:24:13Z)
- Diffusion State-Guided Projected Gradient for Inverse Problems [82.24625224110099]
We propose Diffusion State-Guided Projected Gradient (DiffStateGrad) for inverse problems.
DiffStateGrad projects the measurement gradient onto a subspace that is a low-rank approximation of an intermediate state of the diffusion process.
We highlight that DiffStateGrad improves the robustness of diffusion models in terms of the choice of measurement guidance step size and noise.
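One plausible reading of the projection step described above is an SVD-based low-rank projection of the measurement gradient; the sketch below illustrates that reading with toy matrices, and the paper's exact operator may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def project_gradient(grad, state, rank=4):
    # Low-rank approximation of the intermediate diffusion state via SVD.
    U, _, Vt = np.linalg.svd(state, full_matrices=False)
    Uk, Vk = U[:, :rank], Vt[:rank, :].T
    # Project the measurement gradient onto the column/row subspaces
    # spanned by the top-`rank` singular vectors of the state.
    return Uk @ (Uk.T @ grad @ Vk) @ Vk.T

state = rng.standard_normal((16, 16))  # intermediate diffusion state (toy)
grad = rng.standard_normal((16, 16))   # measurement-guidance gradient (toy)
proj = project_gradient(grad, state, rank=4)
print(np.linalg.matrix_rank(proj))  # at most 4
```

The projected gradient has rank at most `rank`, which is what makes the guidance step robust to step-size and noise choices.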
arXiv Detail & Related papers (2024-10-04T14:26:54Z)
- Boosting Diffusion Models with Moving Average Sampling in Frequency Domain [101.43824674873508]
Diffusion models rely on the current sample to denoise the next one, possibly resulting in denoising instability.
In this paper, we reinterpret the iterative denoising process as model optimization and leverage a moving average mechanism to ensemble all the prior samples.
We name the complete approach "Moving Average Sampling in Frequency domain (MASF)".
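In the spirit of MASF, the moving-average ensembling could be sketched as an exponential moving average of the trajectory's samples taken in the frequency domain; the denoising dynamics and averaging schedule below are simplified, illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def denoise_step(x, t):
    # Placeholder for one reverse-diffusion step (hypothetical dynamics).
    return 0.9 * x + 0.1 * rng.standard_normal(x.shape)

def sample_with_moving_average(shape, steps=50, beta=0.9):
    # Ensemble the samples produced along the trajectory with an
    # exponential moving average maintained in the frequency domain.
    x = rng.standard_normal(shape)
    avg_freq = np.fft.fft2(x)
    for t in range(steps, 0, -1):
        x = denoise_step(x, t)
        avg_freq = beta * avg_freq + (1.0 - beta) * np.fft.fft2(x)
        # Feed the averaged sample back into the next denoising step.
        x = np.fft.ifft2(avg_freq).real
    return x

out = sample_with_moving_average((8, 8))
print(out.shape)  # (8, 8)
```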
arXiv Detail & Related papers (2024-03-26T16:57:55Z)
- Blackout Diffusion: Generative Diffusion Models in Discrete-State Spaces [0.0]
We develop a theoretical formulation for arbitrary discrete-state Markov processes in the forward diffusion process.
As an example, we introduce "Blackout Diffusion", which learns to produce samples from an empty image instead of from noise.
arXiv Detail & Related papers (2023-05-18T16:24:12Z)
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
- Truncated Diffusion Probabilistic Models and Diffusion-based Adversarial Auto-Encoders [137.1060633388405]
Diffusion-based generative models learn how to generate the data by inferring a reverse diffusion chain.
We propose a faster and cheaper approach that stops adding noise before the data become pure random noise.
We show that the proposed model can be cast as an adversarial auto-encoder empowered by both the diffusion process and a learnable implicit prior.
arXiv Detail & Related papers (2022-02-19T20:18:49Z)
- Diffusion Normalizing Flow [4.94950858749529]
We present a novel generative modeling method called diffusion normalizing flow, based on stochastic differential equations (SDEs).
The algorithm consists of two neural SDEs: a forward SDE that gradually adds noise to the data to transform the data into Gaussian random noise, and a backward SDE that gradually removes the noise to sample from the data distribution.
Our algorithm demonstrates competitive performance in both high-dimension data density estimation and image generation tasks.
arXiv Detail & Related papers (2021-10-14T17:41:12Z)
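The forward/backward SDE pair described in the last entry can be simulated with the Euler-Maruyama scheme; in the sketch below both drifts are fixed toy functions, whereas the paper learns them with neural networks, so the dynamics are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def drift_forward(x, t):
    # Hypothetical forward drift that pulls data toward zero-mean noise
    # (a real diffusion normalizing flow learns this with a network).
    return -0.5 * x

def drift_backward(x, t):
    # Hypothetical backward drift; in the paper this is a second neural SDE.
    return -0.5 * x

def euler_maruyama(x0, drift, sigma=0.5, steps=100, T=1.0):
    # Simulate dx = drift(x, t) dt + sigma dW with Euler-Maruyama.
    dt = T / steps
    x = x0.copy()
    for i in range(steps):
        t = i * dt
        x = x + drift(x, t) * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

data = rng.random((8,))                        # toy "data" batch
noisy = euler_maruyama(data, drift_forward)    # forward SDE: data -> noise
recon = euler_maruyama(noisy, drift_backward)  # backward SDE: noise -> sample
print(recon.shape)  # (8,)
```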
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.