From Denoising Diffusions to Denoising Markov Models
- URL: http://arxiv.org/abs/2211.03595v3
- Date: Sun, 18 Feb 2024 22:25:54 GMT
- Title: From Denoising Diffusions to Denoising Markov Models
- Authors: Joe Benton, Yuyang Shi, Valentin De Bortoli, George Deligiannidis,
Arnaud Doucet
- Abstract summary: Denoising diffusions are state-of-the-art generative models exhibiting remarkable empirical performance.
We propose a unifying framework generalising this approach to a wide class of spaces and leading to an original extension of score matching.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Denoising diffusions are state-of-the-art generative models exhibiting
remarkable empirical performance. They work by diffusing the data distribution
into a Gaussian distribution and then learning to reverse this noising process
to obtain synthetic datapoints. The denoising diffusion relies on
approximations of the logarithmic derivatives of the noised data densities
using score matching. Such models can also be used to perform approximate
posterior simulation when one can only sample from the prior and likelihood. We
propose a unifying framework generalising this approach to a wide class of
spaces and leading to an original extension of score matching. We illustrate
the resulting models on various applications.
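The score-matching step described above can be illustrated on a one-dimensional Gaussian toy example (an illustrative sketch, not code from the paper), where the noised marginal, and hence the true score, is available in closed form. Denoising score matching regresses a model onto the conditional score target -eps / sqrt(1-a); for Gaussian data the true score is linear, so a least-squares fit recovers it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: x0 ~ N(2, 0.5^2). After VP noising x_t = sqrt(a)*x0 + sqrt(1-a)*eps,
# the noised marginal is Gaussian with known mean m and variance v, so the
# true score -(x - m)/v is available for checking the fit.
mu, s, a = 2.0, 0.5, 0.3
m, v = np.sqrt(a) * mu, a * s**2 + (1 - a)

x0 = rng.normal(mu, s, size=200_000)
eps = rng.normal(size=x0.shape)
xt = np.sqrt(a) * x0 + np.sqrt(1 - a) * eps

# Denoising score matching: regress s_theta(x_t) = w*x_t + b onto the
# conditional score target -eps / sqrt(1-a). The minimiser is the true
# score, which is exactly linear here.
X = np.stack([xt, np.ones_like(xt)], axis=1)
target = -eps / np.sqrt(1 - a)
w, b = np.linalg.lstsq(X, target, rcond=None)[0]

print(w, -1 / v)  # fitted slope vs. true slope -1/v
print(b, m / v)   # fitted intercept vs. true intercept m/v
```

The fitted coefficients match the analytic score of the noised marginal, which is the quantity the reverse process needs.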
Related papers
- On the Relation Between Linear Diffusion and Power Iteration
We study the generation process as a "correlation machine".
We show that low frequencies emerge earlier in the generation process, where the denoising basis vectors are more aligned to the true data with a rate depending on their eigenvalues.
This model allows us to show that the linear diffusion model converges in mean to the leading eigenvector of the underlying data, similarly to the prevalent power iteration method.
arXiv Detail & Related papers (2024-10-16T07:33:12Z)
- Generative Diffusion From An Action Principle
We show that score matching can be derived from an action principle, like the ones commonly used in physics.
We use this insight to demonstrate the connection between different classes of diffusion models.
arXiv Detail & Related papers (2023-10-06T18:00:00Z)
- Soft Mixture Denoising: Beyond the Expressive Bottleneck of Diffusion Models
We show that current diffusion models actually have an expressive bottleneck in backward denoising.
We introduce soft mixture denoising (SMD), an expressive and efficient model for backward denoising.
arXiv Detail & Related papers (2023-09-25T12:03:32Z) - Generating observation guided ensembles for data assimilation with
denoising diffusion probabilistic model [0.0]
This paper presents an ensemble data assimilation method using the pseudo ensembles generated by denoising diffusion probabilistic model.
Thanks to the variance in generated ensembles, our proposed method displays better performance than the well-established ensemble data assimilation method when the simulation model is imperfect.
arXiv Detail & Related papers (2023-08-13T07:55:46Z)
- A Geometric Perspective on Diffusion Models
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
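The mean-shift half of that relationship can be sketched in a few lines (an illustrative toy example, not code from the paper): with a Gaussian kernel, each mean-shift update moves a point in the direction of the score of the kernel density estimate, toward the nearest mode.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two well-separated 1D clusters; Gaussian-kernel mean shift started near
# the left cluster should converge to (approximately) that cluster's mode.
data = np.concatenate([rng.normal(-3, 0.3, 500), rng.normal(3, 0.3, 500)])
h = 0.5  # kernel bandwidth

x = -2.0
for _ in range(100):
    w = np.exp(-0.5 * ((x - data) / h) ** 2)  # kernel weights
    x = (w @ data) / w.sum()                  # mean-shift update

print(x)  # close to -3, the mode of the left cluster
```

Each update equals x + h^2 * score of the kernel density estimate at x, which is why mode-seeking and score-following coincide in this setting.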
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- To smooth a cloud or to pin it down: Guarantees and Insights on Score Matching in Denoising Diffusion Models
Denoising diffusion models are a class of generative models which have recently achieved state-of-the-art results across many domains.
We leverage known connections to stochastic control, akin to the Föllmer drift, to extend established neural network approximation results for the Föllmer drift to denoising diffusion models and samplers.
arXiv Detail & Related papers (2023-05-16T16:56:19Z)
- Reflected Diffusion Models
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Denoising Diffusion Samplers
Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains.
We explore a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants.
While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling.
arXiv Detail & Related papers (2023-02-27T14:37:16Z)
- Score-based Continuous-time Discrete Diffusion Models
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Estimating High Order Gradients of the Data Distribution by Denoising
First order derivative of a data density can be estimated efficiently by denoising score matching.
We propose a method to directly estimate high order derivatives (scores) of a data density from samples.
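The first-order denoising identity that this line of work generalises is Tweedie's formula: for Gaussian noising, the optimal denoiser equals the noised point plus the noise variance times the score. A minimal closed-form check on a Gaussian toy example (an assumed illustrative setting, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)

# Tweedie's formula: for x_t = x0 + sigma*eps with Gaussian noise,
# E[x0 | x_t] = x_t + sigma^2 * score(x_t). With Gaussian x0 both sides
# are available in closed form, so the identity can be checked exactly.
mu, s2, sigma2 = 0.0, 1.0, 0.25
v = s2 + sigma2  # variance of the noised marginal N(mu, v)

score = lambda x: -(x - mu) / v                  # score of the noised marginal
denoiser = lambda x: (sigma2 * mu + s2 * x) / v  # conjugate-Gaussian posterior mean

x = rng.normal(mu, np.sqrt(v), size=5)
lhs = denoiser(x)
rhs = x + sigma2 * score(x)
print(np.abs(lhs - rhs).max())  # ~ 0: the two sides agree
```

Differentiating the same identity with respect to x_t is what yields estimators of higher-order scores from a denoiser.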
arXiv Detail & Related papers (2021-11-08T18:59:23Z)
- Generative Modeling with Denoising Auto-Encoders and Langevin Sampling
We show that both DAE and DSM provide estimates of the score of the smoothed population density.
We then apply our results to the homotopy method of arXiv:1907.05600 and provide theoretical justification for its empirical success.
arXiv Detail & Related papers (2020-01-31T23:50:03Z)
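Given any score estimate, unadjusted Langevin dynamics turns it into a sampler. A minimal sketch using the analytic score of a Gaussian target (in the papers above, the score would instead come from a DAE/DSM estimate):

```python
import numpy as np

rng = np.random.default_rng(3)

# Target: N(mu, s2). Unadjusted Langevin update:
#   x <- x + step * score(x) + sqrt(2 * step) * noise
mu, s2 = 1.0, 4.0

def score(x):
    return -(x - mu) / s2

step = 0.1
x = np.zeros(10_000)  # 10k independent chains run in parallel
for _ in range(2000):
    x = x + step * score(x) + np.sqrt(2 * step) * rng.normal(size=x.shape)

print(x.mean(), x.var())  # close to mu = 1 and s2 = 4
```

The finite step size introduces a small bias in the stationary distribution, which is why diffusion-based samplers anneal the noise level rather than fixing it.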
This list is automatically generated from the titles and abstracts of the papers in this site.