Generalised Scale-Space Properties for Probabilistic Diffusion Models
- URL: http://arxiv.org/abs/2303.07900v4
- Date: Mon, 18 Sep 2023 09:55:07 GMT
- Title: Generalised Scale-Space Properties for Probabilistic Diffusion Models
- Authors: Pascal Peter
- Abstract summary: We show that probabilistic diffusion models fulfil generalised scale-space properties on evolving probability distributions.
We discuss similarities and differences between interpretations of the physical core concept of drift-diffusion in the deep learning and model-based world.
- Score: 1.52292571922932
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic diffusion models enjoy increasing popularity in the deep
learning community. They generate convincing samples from a learned
distribution of input images and have a wide range of practical applications.
Originally, these approaches were motivated by drift-diffusion processes, but
these origins receive less attention in recent, practice-oriented publications. We
investigate probabilistic diffusion models from the viewpoint of scale-space
research and show that they fulfil generalised scale-space properties on
evolving probability distributions. Moreover, we discuss similarities and
differences between interpretations of the physical core concept of
drift-diffusion in the deep learning and model-based world. To this end, we
examine relations of probabilistic diffusion to osmosis filters.
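The central claim — that evolving probability distributions fulfil generalised scale-space properties — can be illustrated with a minimal sketch. The following example is not the paper's construction; it is a toy 1-D linear diffusion on a discrete probability histogram, checking one classical Lyapunov-style scale-space criterion (entropy never decreases) while probability mass is preserved:

```python
import numpy as np

def diffuse_step(p, tau=0.2):
    """One explicit finite-difference step of 1-D linear diffusion
    with reflecting boundaries; total probability mass is preserved."""
    q = p.copy()
    q[1:-1] += tau * (p[2:] - 2 * p[1:-1] + p[:-2])
    q[0]    += tau * (p[1] - p[0])
    q[-1]   += tau * (p[-2] - p[-1])
    return q

def entropy(p, eps=1e-12):
    return -np.sum(p * np.log(p + eps))

# start from a spiky (low-entropy) distribution on 64 bins
rng = np.random.default_rng(0)
p = rng.random(64) ** 4
p /= p.sum()

entropies = [entropy(p)]
for _ in range(200):
    p = diffuse_step(p)
    entropies.append(entropy(p))

# Lyapunov-style scale-space property: entropy is nondecreasing
assert all(b >= a - 1e-12 for a, b in zip(entropies, entropies[1:]))
assert abs(p.sum() - 1.0) < 1e-9  # mass preserved
```

For tau <= 0.5 the update matrix is symmetric and doubly stochastic, which is what guarantees the monotone entropy behaviour here; the paper generalises such simplification properties to the distributions evolved by probabilistic diffusion models.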
Related papers
- Sifting through the Noise: A Survey of Diffusion Probabilistic Models and Their Applications to Biomolecules [0.7366405857677227]
Diffusion probabilistic models have made their way into a number of high-profile applications.
This paper serves as a general overview for the theory behind these models and the current state of research.
arXiv Detail & Related papers (2024-05-31T21:39:51Z)
- An Overview of Diffusion Models: Applications, Guided Generation, Statistical Rates and Optimization [59.63880337156392]
Diffusion models have achieved tremendous success in computer vision, audio, reinforcement learning, and computational biology.
Despite the significant empirical success, theory of diffusion models is very limited.
This paper provides a well-rounded theoretical exposure for stimulating forward-looking theories and methods of diffusion models.
arXiv Detail & Related papers (2024-04-11T14:07:25Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instillation of task-specific information into the score function to steer the sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Non-Denoising Forward-Time Diffusions [4.831663144935879]
We show that the time-reversal argument, common to all denoising diffusion probabilistic modeling proposals, is not necessary.
We obtain diffusion processes targeting the desired data distribution by taking appropriate mixtures of diffusion bridges.
We develop a unifying view of the drift adjustments corresponding to our and to time-reversal approaches.
arXiv Detail & Related papers (2023-12-22T10:26:31Z)
- On the Generalization Properties of Diffusion Models [33.93850788633184]
This work embarks on a comprehensive theoretical exploration of the generalization attributes of diffusion models.
We establish theoretical estimates of the generalization gap that evolves in tandem with the training dynamics of score-based diffusion models.
We extend our quantitative analysis to a data-dependent scenario, wherein target distributions are portrayed as a succession of densities.
arXiv Detail & Related papers (2023-11-03T09:20:20Z)
- The Emergence of Reproducibility and Generalizability in Diffusion Models [10.188731323681575]
Given the same starting noise input and a deterministic sampler, different diffusion models often yield remarkably similar outputs.
We show that diffusion models are learning distinct distributions affected by the training data size.
This valuable property generalizes to many variants of diffusion models, including those for conditional use, solving inverse problems, and model fine-tuning.
arXiv Detail & Related papers (2023-10-08T19:02:46Z)
- Generalised Diffusion Probabilistic Scale-Spaces [1.52292571922932]
Diffusion probabilistic models excel at sampling new images from learned distributions.
We propose a scale-space theory for diffusion probabilistic models.
We show conceptual and empirical connections to diffusion and osmosis filters.
arXiv Detail & Related papers (2023-09-15T16:17:54Z)
- Diffusion Models are Minimax Optimal Distribution Estimators [49.47503258639454]
We provide the first rigorous analysis on approximation and generalization abilities of diffusion modeling.
We show that when the true density function belongs to the Besov space and the empirical score matching loss is properly minimized, the generated data distribution achieves the nearly minimax optimal estimation rates.
arXiv Detail & Related papers (2023-03-03T11:31:55Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- Diffusion Models in Vision: A Survey [80.82832715884597]
A diffusion model is a deep generative model that is based on two stages, a forward diffusion stage and a reverse diffusion stage.
Diffusion models are widely appreciated for the quality and diversity of the generated samples, despite their known computational burdens.
arXiv Detail & Related papers (2022-09-10T22:00:30Z)
- A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
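Several of the surveys above describe diffusion models as a two-stage process with a forward diffusion stage and a reverse diffusion stage. The forward stage can be sketched in a few lines; this is a minimal illustration with a standard linear noise schedule (values chosen for illustration, not taken from any of the listed papers), using the closed-form DDPM marginal q(x_t | x_0) = N(sqrt(alpha_bar_t) x_0, (1 - alpha_bar_t) I):

```python
import numpy as np

def forward_marginal(x0, t, betas):
    """Closed-form DDPM forward marginal at step t:
    mean = sqrt(alpha_bar_t) * x0, variance = 1 - alpha_bar_t,
    where alpha_bar_t is the cumulative product of (1 - beta_i)."""
    alpha_bar = np.prod(1.0 - betas[: t + 1])
    return np.sqrt(alpha_bar) * x0, 1.0 - alpha_bar

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)   # common linear schedule
x0 = rng.standard_normal(8) * 3 + 5     # toy "data" vector

mean_T, var_T = forward_marginal(x0, 999, betas)

# after the full forward stage the signal is almost gone and the
# marginal is close to a standard normal, the starting point of
# the reverse (generative) stage:
assert np.all(np.abs(mean_T) < 0.1 * np.abs(x0).max())
assert var_T > 0.99
```

The reverse stage, which the surveys cover in detail, then learns to invert this noising trajectory step by step.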
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.