Diffusion Schr\"odinger Bridge with Applications to Score-Based
Generative Modeling
- URL: http://arxiv.org/abs/2106.01357v5
- Date: Wed, 5 Apr 2023 09:40:05 GMT
- Title: Diffusion Schr\"odinger Bridge with Applications to Score-Based
Generative Modeling
- Authors: Valentin De Bortoli, James Thornton, Jeremy Heng, Arnaud Doucet
- Abstract summary: We present Diffusion SB (DSB), an original approximation of the Iterative Proportional Fitting (IPF) procedure to solve the Schr\"odinger Bridge (SB) problem, and provide theoretical analysis along with generative modeling experiments.
- Score: 24.46142828617484
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Progressively applying Gaussian noise transforms complex data distributions
to approximately Gaussian. Reversing this dynamic defines a generative model.
When the forward noising process is given by a Stochastic Differential Equation
(SDE), Song et al. (2021) demonstrate how the time inhomogeneous drift of the
associated reverse-time SDE may be estimated using score-matching. A limitation
of this approach is that the forward-time SDE must be run for a sufficiently
long time for the final distribution to be approximately Gaussian. In contrast,
solving the Schr\"odinger Bridge problem (SB), i.e. an entropy-regularized
optimal transport problem on path spaces, yields diffusions which generate
samples from the data distribution in finite time. We present Diffusion SB
(DSB), an original approximation of the Iterative Proportional Fitting (IPF)
procedure to solve the SB problem, and provide theoretical analysis along with
generative modeling experiments. The first DSB iteration recovers the
methodology proposed by Song et al. (2021), with the flexibility of using
shorter time intervals, as subsequent DSB iterations reduce the discrepancy
between the final-time marginal of the forward (resp. backward) SDE with
respect to the prior (resp. data) distribution. Beyond generative modeling, DSB
offers a widely applicable computational optimal transport tool as the
continuous state-space analogue of the popular Sinkhorn algorithm (Cuturi,
2013).
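To make the abstract concrete, here is a brief sketch of the objects it refers to, in the standard notation of Song et al. (2021) (forward drift $f$, diffusion coefficient $g$, marginals $p_t$, reference path measure $Q$); this summarizes well-known formulations rather than restating the paper's own derivations.

```latex
% Forward noising SDE and its time reversal; the score \nabla_x \log p_t is
% estimated by score matching (Anderson, 1982; Song et al., 2021).
\begin{align}
  \mathrm{d}X_t &= f(X_t,t)\,\mathrm{d}t + g(t)\,\mathrm{d}B_t,
      & X_0 &\sim p_{\mathrm{data}},\\
  \mathrm{d}Y_t &= \bigl[f(Y_t,t) - g(t)^2\,\nabla_x \log p_t(Y_t)\bigr]\mathrm{d}t
      + g(t)\,\mathrm{d}\bar{B}_t,
      & Y_T &\sim p_{\mathrm{prior}} \approx p_T.
\end{align}
% The Schr\"odinger Bridge problem fixes both endpoint marginals exactly:
\begin{equation}
  \pi^\star \;=\; \operatorname*{arg\,min}_{\pi\,:\,\pi_0 = p_{\mathrm{data}},\ \pi_T = p_{\mathrm{prior}}}
      \mathrm{KL}\bigl(\pi \,\|\, Q\bigr),
\end{equation}
% and IPF alternates two half-bridge projections, which DSB approximates with
% learned forward/backward diffusions:
\begin{align}
  \pi^{2n+1} &= \operatorname*{arg\,min}_{\pi\,:\,\pi_T = p_{\mathrm{prior}}} \mathrm{KL}(\pi \,\|\, \pi^{2n}),
  &
  \pi^{2n+2} &= \operatorname*{arg\,min}_{\pi\,:\,\pi_0 = p_{\mathrm{data}}} \mathrm{KL}(\pi \,\|\, \pi^{2n+1}).
\end{align}
```

Since the abstract positions DSB as the continuous state-space analogue of the Sinkhorn algorithm, a minimal discrete Sinkhorn/IPF sketch (NumPy; function and variable names are illustrative, not the paper's code) shows the same alternating marginal-fitting structure:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, n_iters=200):
    """Entropy-regularized OT between histograms a and b with cost matrix C.

    Discrete IPF / Sinkhorn iteration (Cuturi, 2013): alternately rescale the
    rows and columns of the Gibbs kernel so the coupling's marginals match a
    and b -- the finite-state analogue of DSB's alternating half-bridges.
    """
    K = np.exp(-C / eps)                   # Gibbs kernel (discrete reference measure)
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)                    # enforce the first marginal
        v = b / (K.T @ u)                  # enforce the second marginal
    return u[:, None] * K * v[None, :]     # entropic optimal coupling

# Toy usage: two random histograms on 5 points, squared-distance cost.
rng = np.random.default_rng(0)
a = rng.random(5); a /= a.sum()
b = rng.random(5); b /= b.sum()
C = (np.arange(5)[:, None] - np.arange(5)[None, :]) ** 2.0
P = sinkhorn(a, b, C)
print(np.abs(P.sum(axis=0) - b).max())     # column marginal matches b exactly
print(np.abs(P.sum(axis=1) - a).max())     # row marginal matches a after convergence
```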
Related papers
- Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis [56.442307356162864]
We study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework.
We introduce a discrete-time sampling algorithm in the general state space $[S]^d$ that utilizes score estimators at predefined time points.
Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function.
arXiv Detail & Related papers (2024-10-03T09:07:13Z) - Non-asymptotic bounds for forward processes in denoising diffusions: Ornstein-Uhlenbeck is hard to beat [49.1574468325115]
This paper presents explicit non-asymptotic bounds on the forward diffusion error in total variation (TV).
We parametrise multi-modal data distributions in terms of the distance $R$ to their furthest modes and consider forward diffusions with additive and multiplicative noise.
arXiv Detail & Related papers (2024-08-25T10:28:31Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers called GMS for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - Exploring the Optimal Choice for Generative Processes in Diffusion
Models: Ordinary vs Stochastic Differential Equations [6.2284442126065525]
We study the problem mathematically for two limiting scenarios: the zero diffusion (ODE) case and the large diffusion case.
Our findings indicate that when the perturbation occurs at the end of the generative process, the ODE model outperforms the SDE model with a large diffusion coefficient.
arXiv Detail & Related papers (2023-06-03T09:27:15Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Denoising Diffusion Samplers [41.796349001299156]
Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains.
We explore a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants.
While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling.
arXiv Detail & Related papers (2023-02-27T14:37:16Z) - Diffusion Normalizing Flow [4.94950858749529]
We present a novel generative modeling method called diffusion normalizing flow, based on stochastic differential equations (SDEs).
The algorithm consists of two neural SDEs: a forward SDE that gradually adds noise to the data to transform the data into Gaussian random noise, and a backward SDE that gradually removes the noise to sample from the data distribution.
Our algorithm demonstrates competitive performance in both high-dimension data density estimation and image generation tasks.
arXiv Detail & Related papers (2021-10-14T17:41:12Z) - Distributed stochastic optimization with large delays [59.95552973784946]
One of the most widely used methods for solving large-scale optimization problems is distributed asynchronous stochastic gradient descent (DASGD).
We show that DASGD converges to a global optimum under the stated assumptions on the delays.
arXiv Detail & Related papers (2021-07-06T21:59:49Z) - A Variational Perspective on Diffusion-Based Generative Models and Score
Matching [8.93483643820767]
We derive a variational framework for likelihood estimation for continuous-time generative diffusion.
We show that minimizing the score-matching loss is equivalent to maximizing a lower bound of the likelihood of the plug-in reverse SDE.
arXiv Detail & Related papers (2021-06-05T05:50:36Z) - Score-Based Generative Modeling through Stochastic Differential
Equations [114.39209003111723]
We present a stochastic differential equation (SDE) that smoothly transforms a complex data distribution to a known prior distribution by slowly injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
arXiv Detail & Related papers (2020-11-26T19:39:10Z)
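For the last entry (and the reverse-time SDE described in the abstract above), a minimal Euler-Maruyama sampler sketch may help; it assumes a simple variance-exploding forward SDE dX_t = sigma dB_t and a placeholder `score_fn` standing in for a learned score network (all names are illustrative, not from the papers).

```python
import numpy as np

def reverse_sde_sample(score_fn, dim, n_steps=1000, T=5.0, sigma=2.0, seed=0):
    """Euler-Maruyama discretization of the reverse-time SDE.

    Assumes the variance-exploding forward SDE dX_t = sigma dB_t, so the
    reverse-time update is x <- x + sigma^2 * score * dt + sigma * sqrt(dt) * z,
    integrated from t = T down to t = 0, starting from the Gaussian prior
    N(0, sigma^2 T) that approximates p_T.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.sqrt(sigma**2 * T) * rng.standard_normal(dim)   # approximate prior sample
    for k in range(n_steps):
        t = T - k * dt
        x = (x + sigma**2 * score_fn(x, t) * dt
               + sigma * np.sqrt(dt) * rng.standard_normal(dim))
    return x

# Toy check with an analytically known score: for data ~ N(0, I) the noised
# marginal is p_t = N(0, (1 + sigma^2 t) I), so the true score is -x / (1 + sigma^2 t).
def toy_score(x, t):
    return -x / (1.0 + 4.0 * t)   # sigma = 2 here

samples = np.stack([reverse_sde_sample(toy_score, dim=2, seed=s) for s in range(500)])
print(samples.std(axis=0))        # close to 1: the reverse SDE recovers the data law
```

The residual bias from starting at N(0, sigma^2 T) rather than the true p_T is precisely the prior mismatch that the DSB iterations are designed to remove when T is short.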
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.