Diffusion Bridge Mixture Transports, Schr\"odinger Bridge Problems and
Generative Modeling
- URL: http://arxiv.org/abs/2304.00917v2
- Date: Fri, 22 Dec 2023 10:25:03 GMT
- Title: Diffusion Bridge Mixture Transports, Schr\"odinger Bridge Problems and
Generative Modeling
- Authors: Stefano Peluchetti
- Abstract summary: We propose a novel sampling-based iterative algorithm, the iterated diffusion bridge mixture (IDBM) procedure, aimed at solving the dynamic Schrödinger bridge problem.
The IDBM procedure exhibits the attractive property of realizing a valid transport between the target probability measures at each iteration.
- Score: 4.831663144935879
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The dynamic Schr\"odinger bridge problem seeks a stochastic process that
defines a transport between two target probability measures, while optimally
satisfying the criteria of being closest, in terms of Kullback-Leibler
divergence, to a reference process. We propose a novel sampling-based iterative
algorithm, the iterated diffusion bridge mixture (IDBM) procedure, aimed at
solving the dynamic Schr\"odinger bridge problem. The IDBM procedure exhibits
the attractive property of realizing a valid transport between the target
probability measures at each iteration. We perform an initial theoretical
investigation of the IDBM procedure, establishing its convergence properties.
The theoretical findings are complemented by numerical experiments illustrating
the competitive performance of the IDBM procedure. Recent advancements in
generative modeling employ the time-reversal of a diffusion process to define a
generative process that approximately transports a simple distribution to the
data distribution. As an alternative, we propose utilizing the first iteration
of the IDBM procedure as an approximation-free method for realizing this
transport. This approach offers greater flexibility in selecting the generative
process dynamics and exhibits accelerated training and superior sample quality
over larger discretization intervals. In terms of implementation, the necessary
modifications are minimally intrusive, being limited to the training loss
definition.
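For reference, the dynamic Schr\"odinger bridge problem described above can be stated in standard notation (not quoted from the paper) as

    P^\star = \operatorname*{arg\,min} \{ \mathrm{KL}(P \,\|\, Q) : P_0 = \mu_0, \; P_1 = \mu_1 \},

where Q is the reference process and \mu_0, \mu_1 are the target probability measures; the "valid transport at each iteration" property means that every IDBM iterate satisfies both marginal constraints exactly.
The following is a minimal sketch, not the paper's implementation, of the transport realized by a single bridge-mixture step; it assumes a Brownian reference process, the independent coupling of the two targets, and illustrative one-dimensional Gaussian marginals (all of these choices are assumptions made for the example).

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative one-dimensional target measures (assumptions, not from the paper).
    def sample_mu0(n):
        return rng.normal(-2.0, 0.5, size=n)

    def sample_mu1(n):
        return rng.normal(2.0, 0.5, size=n)

    def brownian_bridge_paths(x0, x1, sigma=1.0, n_steps=100):
        # Euler simulation of Brownian bridges pinned at (x0, x1) on [0, 1];
        # the drift (x1 - x) / (1 - t) is the Doob h-transform of the Brownian reference.
        dt = 1.0 / n_steps
        x = x0.copy()
        path = [x.copy()]
        for k in range(n_steps - 1):
            t = k * dt
            x = x + (x1 - x) / (1.0 - t) * dt + sigma * np.sqrt(dt) * rng.normal(size=x.shape)
            path.append(x.copy())
        path.append(x1.copy())  # the bridge is pinned at x1 at time 1
        return np.stack(path)   # shape: (n_steps + 1, n)

    n = 5000
    x0, x1 = sample_mu0(n), sample_mu1(n)   # independent coupling of the two targets
    paths = brownian_bridge_paths(x0, x1)
    # The mixture of these bridges transports mu0 (paths[0]) to mu1 (paths[-1]) exactly.
    # A full IDBM iteration would next fit a Markov diffusion to this mixture (e.g. by
    # drift regression) and repeat with the coupling induced by the fitted process.
    print(paths[0].mean(), paths[-1].mean())

In the generative-modeling use case mentioned in the abstract, \mu_0 would play the role of the simple distribution and \mu_1 the data distribution, with the drift fitted to the bridge mixture replacing the time-reversal of a diffusion process.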
Related papers
- Solving Prior Distribution Mismatch in Diffusion Models via Optimal Transport [24.90486913773359]
In recent years, the knowledge surrounding diffusion models (DMs) has grown significantly, though several theoretical gaps remain.
This paper explores the deeper relationship between optimal transport (OT) theory and DMs with discrete initial distribution.
We prove that as the diffusion termination time increases, the probability flow exponentially converges to the gradient of the solution to the classical Monge-Ampère equation.
arXiv Detail & Related papers (2024-10-17T10:54:55Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We approach the task of sampling from a probability density by transporting a tractable density function to the target.
We employ physics-informed neural networks (PINNs) to approximate the solutions of the corresponding partial differential equations (PDEs).
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
- Variational Schrödinger Diffusion Models [14.480273869571468]
Schr"odinger bridge (SB) has emerged as the go-to method for optimizing transportation plans in diffusion models.
We leverage variational inference to linearize the forward score functions (variational scores) of SB.
We propose the variational Schrödinger diffusion model (VSDM), where the forward process is a multivariate diffusion and the variational scores are adaptively optimized for efficient transport.
arXiv Detail & Related papers (2024-05-08T04:01:40Z)
- Space-Time Diffusion Bridge [0.4527270266697462]
We introduce a novel method for generating synthetic samples that are independent and identically distributed draws from real probability distributions.
We use space-time mixing strategies that extend across temporal and spatial dimensions.
We validate the efficacy of our space-time diffusion approach with numerical experiments.
arXiv Detail & Related papers (2024-02-13T23:26:11Z)
- SinSR: Diffusion-Based Image Super-Resolution in a Single Step [119.18813219518042]
Super-resolution (SR) methods based on diffusion models exhibit promising results, but their practical application is hindered by the substantial number of inference steps required.
We propose a simple yet effective method for achieving single-step SR generation, named SinSR.
arXiv Detail & Related papers (2023-11-23T16:21:29Z)
- Protein Design with Guided Discrete Diffusion [67.06148688398677]
A popular approach to protein design is to combine a generative model with a discriminative model for conditional sampling.
We propose diffusioN Optimized Sampling (NOS), a guidance method for discrete diffusion models.
NOS makes it possible to perform design directly in sequence space, circumventing significant limitations of structure-based methods.
arXiv Detail & Related papers (2023-05-31T16:31:24Z)
- CamoDiffusion: Camouflaged Object Detection via Conditional Diffusion Models [72.93652777646233]
Camouflaged Object Detection (COD) is a challenging task in computer vision due to the high similarity between camouflaged objects and their surroundings.
We propose a new paradigm that treats COD as a conditional mask-generation task leveraging diffusion models.
Our method, dubbed CamoDiffusion, employs the denoising process of diffusion models to iteratively reduce the noise of the mask.
arXiv Detail & Related papers (2023-05-29T07:49:44Z)
- Diffusion Schr\"odinger Bridge Matching [36.95088080680221]
We introduce Iterative Markovian Fitting (IMF) and Diffusion Schrödinger Bridge Matching (DSBM).
IMF is a new methodology for solving SB problems, and DSBM is a novel numerical algorithm for computing IMF iterates.
We demonstrate the performance of DSBM on a variety of problems.
arXiv Detail & Related papers (2023-03-29T16:59:22Z)
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
- Entropic Neural Optimal Transport via Diffusion Processes [105.34822201378763]
We propose a novel neural algorithm for the fundamental problem of computing the entropic optimal transport (EOT) plan between continuous probability distributions.
Our algorithm is based on the saddle point reformulation of the dynamic version of EOT, which is known as the Schrödinger Bridge problem.
In contrast to the prior methods for large-scale EOT, our algorithm is end-to-end and consists of a single learning step.
arXiv Detail & Related papers (2022-11-02T14:35:13Z)
- Simulating Diffusion Bridges with Score Matching [17.492131261495523]
We first show that the time-reversed diffusion bridge process can be simulated if one can time-reverse the unconditioned diffusion process.
We then consider a further application of our proposed methodology to approximate Doob's $h$-transform defining the diffusion bridge process.
arXiv Detail & Related papers (2021-11-14T05:18:31Z)
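As background for the last entry above (standard material on diffusion bridges, not taken from that abstract): for a reference diffusion dX_t = f(X_t, t) dt + \sigma(t) dW_t with transition density p(s, x; t, y), the bridge process conditioned on X_T = y is obtained via Doob's $h$-transform with h(x, t) = p(t, x; T, y), and its drift becomes

    f(x, t) + \sigma(t) \sigma(t)^\top \nabla_x \log p(t, x; T, y),

so simulating the bridge amounts to approximating this additional score-like term, which is what the score-matching construction targets.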