Star-Shaped Denoising Diffusion Probabilistic Models
- URL: http://arxiv.org/abs/2302.05259v3
- Date: Sat, 28 Oct 2023 21:50:10 GMT
- Title: Star-Shaped Denoising Diffusion Probabilistic Models
- Authors: Andrey Okhotin, Dmitry Molchanov, Vladimir Arkhipkin, Grigory Bartosh,
Viktor Ohanesian, Aibek Alanov, Dmitry Vetrov
- Abstract summary: We introduce Star-Shaped DDPM (SS-DDPM), whose star-shaped
diffusion process bypasses the need to define transition probabilities or
compute posteriors.
Our implementation is available at https://github.com/andrey-okhotin/star-shaped.
- Score: 5.167803438665587
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Denoising Diffusion Probabilistic Models (DDPMs) provide the foundation for
the recent breakthroughs in generative modeling. Their Markovian structure
makes it difficult to define DDPMs with distributions other than Gaussian or
discrete. In this paper, we introduce Star-Shaped DDPM (SS-DDPM). Its
star-shaped diffusion process allows us to bypass the need to define the
transition probabilities or compute posteriors. We establish duality between
star-shaped and specific Markovian diffusions for the exponential family of
distributions and derive efficient algorithms for training and sampling from
SS-DDPMs. In the case of Gaussian distributions, SS-DDPM is equivalent to DDPM.
However, SS-DDPMs provide a simple recipe for designing diffusion models with
distributions such as Beta, von Mises–Fisher, Dirichlet,
Wishart and others, which can be especially useful when data lies on a
constrained manifold. We evaluate the model in different settings and find it
competitive even on image data, where Beta SS-DDPM achieves results comparable
to a Gaussian DDPM. Our implementation is available at
https://github.com/andrey-okhotin/star-shaped .
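To make the construction concrete, the following is a minimal sketch of star-shaped forward sampling, not the paper's reference implementation: the schedules (`alpha_bar_t`, `nu_t`) and the Beta parameterization below are illustrative assumptions. The key property is that every noisy variable x_t is drawn directly from q(x_t | x_0), so the x_t are conditionally independent given x_0 and no Markovian transition kernels or posteriors are needed; in the Gaussian case this coincides with the usual DDPM marginal.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_star_sample(x0, alpha_bar_t):
    """Gaussian case: coincides with the DDPM marginal q(x_t | x_0)."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar_t) * x0 + np.sqrt(1.0 - alpha_bar_t) * eps

def beta_star_sample(x0, nu_t):
    """Beta case for data in (0, 1): as the concentration nu_t shrinks,
    x_t drifts toward a flat Beta(1, 1) prior. (Illustrative
    parameterization; the paper's actual schedule may differ.)"""
    return rng.beta(nu_t * x0 + 1.0, nu_t * (1.0 - x0) + 1.0)

x0 = np.clip(rng.random(4), 1e-3, 1.0 - 1e-3)  # toy data on (0, 1)
for t, nu_t in enumerate([100.0, 10.0, 1.0]):  # hypothetical schedule
    # Each x_t depends only on x_0, never on x_{t-1}: that radial
    # dependence structure is what "star-shaped" refers to.
    print(f"t={t}: x_t = {beta_star_sample(x0, nu_t)}")
```

Swapping in another exponential-family distribution (Dirichlet, von Mises–Fisher, Wishart, ...) only changes the sampling function above; the duality with a Markovian diffusion established in the paper is what makes training and sampling from the reverse model efficient.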
Related papers
- Denoising Lévy Probabilistic Models [28.879024667933194]
Recent studies suggest that heavy-tailed noise distributions, like $\alpha$-stable distributions, may better handle mode collapse.
We extend the denoising diffusion probabilistic model (DDPM) by replacing the Gaussian noise with $\alpha$-stable noise.
Our experiments show improvements in coverage of data distribution tails, better robustness to unbalanced datasets, and improved computation times.
arXiv Detail & Related papers (2024-07-26T09:00:18Z)
- Going beyond Compositions, DDPMs Can Produce Zero-Shot Interpolations [54.95457207525101]
Denoising Diffusion Probabilistic Models (DDPMs) exhibit remarkable capabilities in image generation.
We study DDPMs trained on strictly separate subsets of the data distribution with large gaps in the support of the latent factors.
We show that such a model can effectively generate images in the unexplored, intermediate regions of the distribution.
arXiv Detail & Related papers (2024-05-29T15:41:53Z)
- DPM-OT: A New Diffusion Probabilistic Model Based on Optimal Transport [26.713392774427653]
DPM-OT is a unified learning framework for fast DPMs with a direct expressway represented by the optimal transport (OT) map.
It can generate high-quality samples within around 10 function evaluations.
Experiments validate the effectiveness and advantages of DPM-OT in terms of speed and quality.
arXiv Detail & Related papers (2023-07-21T02:28:54Z)
- Semi-Implicit Denoising Diffusion Models (SIDDMs) [50.30163684539586]
Existing models such as Denoising Diffusion Probabilistic Models (DDPM) deliver high-quality, diverse samples but are slowed by the inherently large number of iterative steps they require.
We introduce a novel approach that tackles the problem by matching implicit and explicit factors.
We demonstrate that our proposed method obtains comparable generative performance to diffusion-based models and vastly superior results to models with a small number of sampling steps.
arXiv Detail & Related papers (2023-06-21T18:49:22Z)
- UDPM: Upsampling Diffusion Probabilistic Models [33.51145642279836]
Denoising Diffusion Probabilistic Models (DDPM) have recently gained significant attention.
DDPMs generate high-quality samples from complex data distributions by defining a reverse denoising process.
Compared with generative adversarial networks (GANs), the latent space of diffusion models is less interpretable.
In this work, we propose to generalize the denoising diffusion process into an Upsampling Diffusion Probabilistic Model (UDPM).
arXiv Detail & Related papers (2023-05-25T17:25:14Z)
- On Calibrating Diffusion Probabilistic Models [78.75538484265292]
Diffusion probabilistic models (DPMs) have achieved promising results in diverse generative tasks.
We propose a simple way to calibrate an arbitrary pretrained DPM, with which the score matching loss can be reduced and the lower bounds of model likelihood can be increased.
Our calibration method is performed only once and the resulting models can be used repeatedly for sampling.
arXiv Detail & Related papers (2023-02-21T14:14:40Z)
- Accelerating Diffusion Models via Early Stop of the Diffusion Process [114.48426684994179]
Denoising Diffusion Probabilistic Models (DDPMs) have achieved impressive performance on various generation tasks.
In practice, DDPMs often need hundreds or even thousands of denoising steps to obtain a high-quality sample.
We propose a principled acceleration strategy, referred to as Early-Stopped DDPM (ES-DDPM), for DDPMs.
arXiv Detail & Related papers (2022-05-25T06:40:09Z)
- Pseudo Numerical Methods for Diffusion Models on Manifolds [77.40343577960712]
Denoising Diffusion Probabilistic Models (DDPMs) can generate high-quality samples such as images and audio.
DDPMs require hundreds to thousands of iterations to produce final samples.
We propose pseudo numerical methods for diffusion models (PNDMs).
PNDMs can generate higher-quality synthetic images in only 50 steps, compared with 1000-step DDIMs (a 20x speedup).
arXiv Detail & Related papers (2022-02-20T10:37:52Z)
- Improved Denoising Diffusion Probabilistic Models [4.919647298882951]
We show that DDPMs can achieve competitive log-likelihoods while maintaining high sample quality.
We also find that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes.
We show that the sample quality and likelihood of these models scale smoothly with model capacity and training compute, making them easily scalable.
arXiv Detail & Related papers (2021-02-18T23:44:17Z)
- Denoising Diffusion Implicit Models [117.03720513930335]
We present denoising diffusion implicit models (DDIMs), a class of iterative implicit probabilistic models with the same training procedure as DDPMs.
DDIMs can produce high-quality samples $10\times$ to $50\times$ faster in terms of wall-clock time than DDPMs (see the sketch below).
arXiv Detail & Related papers (2020-10-06T06:15:51Z)
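For context on that speedup, here is a minimal sketch of the deterministic (eta = 0) DDIM update, assuming a hypothetical pretrained noise predictor `eps_model(x, t)` and a precomputed `alpha_bar` schedule (both names are illustrative, not from the paper). Because the update can jump across an arbitrary subsequence of timesteps, far fewer network calls are needed than in ancestral DDPM sampling.

```python
import numpy as np

def ddim_step(x_t, eps_hat, alpha_bar_t, alpha_bar_prev):
    """One deterministic DDIM update (eta = 0)."""
    # Clean-sample estimate implied by the predicted noise eps_hat.
    x0_pred = (x_t - np.sqrt(1.0 - alpha_bar_t) * eps_hat) / np.sqrt(alpha_bar_t)
    # Jump directly to the (possibly much earlier) previous timestep.
    return np.sqrt(alpha_bar_prev) * x0_pred + np.sqrt(1.0 - alpha_bar_prev) * eps_hat

def ddim_sample(eps_model, x_T, alpha_bar, steps):
    """Run DDIM along a short timestep subsequence, e.g. 50 of 1000 steps.
    `eps_model` is a hypothetical pretrained DDPM noise-prediction network."""
    x = x_T
    for t, t_prev in zip(steps[:-1], steps[1:]):  # steps sorted high -> low
        x = ddim_step(x, eps_model(x, t), alpha_bar[t], alpha_bar[t_prev])
    return x
```

The trained DDPM network is reused unchanged and only the sampler differs, which is why DDIMs share the DDPM training procedure.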
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.