A Flexible Diffusion Model
- URL: http://arxiv.org/abs/2206.10365v1
- Date: Fri, 17 Jun 2022 06:46:58 GMT
- Title: A Flexible Diffusion Model
- Authors: Weitao Du, Tao Yang, He Zhang, Yuanqi Du
- Abstract summary: We propose a framework for parameterizing the diffusion model, especially the spatial part of the forward SDE.
An abstract formalism is introduced with theoretical guarantees, and its connection with previous diffusion models is established.
- Score: 18.723160658185115
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion (score-based) generative models have been widely used for modeling
various types of complex data, including images, audio, and point clouds.
Recently, the deep connection between forward-backward stochastic differential
equations (SDEs) and diffusion-based models has been revealed, and several new
variants of SDEs (e.g., sub-VP, critically-damped Langevin) have been proposed
along this line. Despite the empirical success of these hand-crafted fixed
forward SDEs, a large family of suitable forward SDEs remains unexplored. In
this work, we propose a general framework for parameterizing the diffusion
model, especially the spatial part of the forward SDE. An abstract formalism is
introduced with theoretical guarantees, and its connection with previous
diffusion models is established. We demonstrate the theoretical advantage of
our method from an optimization perspective. Numerical experiments on synthetic
datasets, MNIST, and CIFAR-10 validate the effectiveness of our framework.
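As a rough illustration of what parameterizing "the spatial part of the forward SDE" can look like, the sketch below runs Euler-Maruyama on a VP-style SDE whose usual identity drift matrix is replaced by a general symmetric positive semi-definite matrix A. This is a minimal sketch under assumptions of ours (the beta schedule, the matrix A, and all function names are illustrative); it is not the paper's actual parameterization.

```python
import numpy as np

def forward_sde_step(x, t, dt, beta, A, rng):
    """One Euler-Maruyama step of a VP-style forward SDE,
        dx = -0.5 * beta(t) * A @ x dt + sqrt(beta(t)) dW,
    where A generalizes the identity matrix of the standard VP SDE
    and plays the role of a tunable "spatial" component (our assumption).
    """
    drift = -0.5 * beta(t) * (A @ x)
    noise = rng.standard_normal(x.shape)
    return x + drift * dt + np.sqrt(beta(t) * dt) * noise

# Toy usage: diffuse a 2-D sample with an anisotropic spatial matrix.
rng = np.random.default_rng(0)
beta = lambda t: 0.1 + 19.9 * t            # linear beta schedule on [0, 1]
A = np.array([[1.0, 0.3],
              [0.3, 1.0]])                 # symmetric PSD, hypothetical choice
x = rng.standard_normal(2)
dt = 1e-3
for i in range(1000):
    x = forward_sde_step(x, i * dt, dt, beta, A, rng)
```

With A equal to the identity this reduces to the standard VP SDE; a framework like the paper's would treat such a spatial component as a tunable object rather than fixing it by hand.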
Related papers
- Convergence of Diffusion Models Under the Manifold Hypothesis in High-Dimensions [6.9408143976091745]
Denoising Diffusion Probabilistic Models (DDPMs) are powerful state-of-the-art methods used to generate synthetic data from high-dimensional data distributions.
We study DDPMs under the manifold hypothesis and prove that they achieve rates independent of the ambient dimension in terms of learning the score.
In terms of sampling, we obtain rates independent of the ambient dimension w.r.t. the Kullback-Leibler divergence, and $O(\sqrt{D})$ w.r.t. the Wasserstein distance.
arXiv Detail & Related papers (2024-09-27T14:57:18Z) - AdjointDEIS: Efficient Gradients for Diffusion Models [2.0795007613453445]
We show that the continuous adjoint equations for diffusion SDEs in fact reduce to a simple ODE.
We also demonstrate the effectiveness of AdjointDEIS for guided generation with an adversarial attack in the form of the face morphing problem.
arXiv Detail & Related papers (2024-05-23T19:51:33Z) - An Overview of Diffusion Models: Applications, Guided Generation, Statistical Rates and Optimization [59.63880337156392]
Diffusion models have achieved tremendous success in computer vision, audio, reinforcement learning, and computational biology.
Despite this significant empirical success, the theory of diffusion models remains very limited.
This paper provides a well-rounded theoretical exposition to stimulate forward-looking theories and methods for diffusion models.
arXiv Detail & Related papers (2024-04-11T14:07:25Z) - Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) approximates the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Learning Space-Time Continuous Neural PDEs from Partially Observed States [13.01244901400942]
We introduce a grid-independent model learning partial differential equations (PDEs) from noisy and partial observations on irregular grids.
We propose a space-time continuous latent neural PDE model with an efficient probabilistic framework and a novel encoder design for improved data efficiency and grid independence.
arXiv Detail & Related papers (2023-07-09T06:53:59Z) - Variance-Preserving-Based Interpolation Diffusion Models for Speech Enhancement [53.2171981279647]
We present a framework that encapsulates both the variance-preserving (VP)- and variance-exploding (VE)-based diffusion methods; a toy sketch contrasting the two noise kernels appears after this list.
To improve performance and ease model training, we analyze the common difficulties encountered in diffusion models.
We evaluate our model against several methods using a public benchmark to showcase the effectiveness of our approach.
arXiv Detail & Related papers (2023-06-14T14:22:22Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Score-based Generative Modeling Through Backward Stochastic Differential Equations: Inversion and Generation [6.2255027793924285]
The proposed BSDE-based diffusion model represents a novel approach to diffusion modeling, which extends the application of stochastic differential equations (SDEs) in machine learning.
We demonstrate the theoretical guarantees of the model, the benefits of using Lipschitz networks for score matching, and its potential applications in various areas such as diffusion inversion, conditional diffusion, and uncertainty quantification.
arXiv Detail & Related papers (2023-04-26T01:15:35Z) - An optimal control perspective on diffusion-based generative modeling [9.806130366152194]
We establish a connection between optimal control and generative models based on stochastic differential equations (SDEs).
In particular, we derive a Hamilton-Jacobi-Bellman equation that governs the evolution of the log-densities of the underlying SDE marginals.
We develop a novel diffusion-based method for sampling from unnormalized densities.
arXiv Detail & Related papers (2022-11-02T17:59:09Z) - A Survey on Generative Diffusion Model [75.93774014861978]
Diffusion models are an emerging class of deep generative models.
They have certain limitations, including a time-consuming iterative generation process and confinement to high-dimensional Euclidean space.
This survey presents a plethora of advanced techniques aimed at enhancing diffusion models.
arXiv Detail & Related papers (2022-09-06T16:56:21Z)
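To make the VP/VE distinction invoked in the speech-enhancement entry above concrete, the sketch below contrasts the two standard Gaussian perturbation kernels. The closed forms and schedule constants follow common defaults from the score-SDE literature; the function names and constants here are ours, not taken from that paper.

```python
import numpy as np

def vp_perturb(x0, t, beta_min=0.1, beta_max=20.0, rng=None):
    """Variance-preserving kernel: the mean shrinks toward 0 while the
    total variance stays close to 1 (linear beta schedule on [0, 1])."""
    rng = np.random.default_rng() if rng is None else rng
    log_coef = -0.25 * t**2 * (beta_max - beta_min) - 0.5 * t * beta_min
    mean = np.exp(log_coef) * x0
    std = np.sqrt(1.0 - np.exp(2.0 * log_coef))
    return mean + std * rng.standard_normal(x0.shape)

def ve_perturb(x0, t, sigma_min=0.01, sigma_max=50.0, rng=None):
    """Variance-exploding kernel: the mean is kept and the noise scale
    grows geometrically from sigma_min to sigma_max."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = sigma_min * (sigma_max / sigma_min) ** t
    return x0 + sigma * rng.standard_normal(x0.shape)

x0 = np.ones(4)
print(vp_perturb(x0, t=0.9))  # near-unit-variance noise around a shrunken mean
print(ve_perturb(x0, t=0.9))  # x0 buried under large-scale noise
```

A framework "encapsulating" both families, as that entry describes, would presumably interpolate between such kernels rather than committing to one.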
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides (including all content) and is not responsible for any consequences of its use.