Simulating Diffusion Bridges with Score Matching
- URL: http://arxiv.org/abs/2111.07243v1
- Date: Sun, 14 Nov 2021 05:18:31 GMT
- Title: Simulating Diffusion Bridges with Score Matching
- Authors: Valentin De Bortoli, Arnaud Doucet, Jeremy Heng, James Thornton
- Abstract summary: We first show that the time-reversed diffusion bridge process can be simulated if one can time-reverse the unconditioned diffusion process.
We then consider another iteration of our proposed methodology to approximate Doob's $h$-transform defining the diffusion bridge process.
- Score: 17.492131261495523
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the problem of simulating diffusion bridges, i.e. diffusion
processes that are conditioned to initialize and terminate at two given states.
Diffusion bridge simulation has applications in diverse scientific fields and
plays a crucial role for statistical inference of discretely-observed
diffusions. This is known to be a challenging problem that has received much
attention in the last two decades. In this work, we first show that the
time-reversed diffusion bridge process can be simulated if one can time-reverse
the unconditioned diffusion process. We introduce a variational formulation to
learn this time-reversal that relies on a score matching method to circumvent
intractability. We then consider another iteration of our proposed methodology
to approximate Doob's $h$-transform defining the diffusion bridge process.
As our approach is generally applicable under mild assumptions on the
underlying diffusion process, it can easily be used to improve the proposal
bridge process within existing methods and frameworks. We discuss algorithmic
considerations and extensions, and present some numerical results.
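The core idea of the abstract (that the time-reversed bridge can be simulated whenever the time-reversal of the unconditioned process is available) can be illustrated on a toy example. The sketch below is not the paper's code: it uses Brownian motion started at $x_0$, for which the marginal score $\nabla \log p_t(x) = -(x - x_0)/t$ is known in closed form rather than learned by score matching, and the function name and parameters are illustrative. Running the time-reversal from the terminal state $x_T$ then yields the (time-reversed) diffusion bridge.

```python
# Toy sketch (not the paper's implementation): for Brownian motion started
# at x0, the marginal density is p_t = N(x0, t), so its score is available
# in closed form and the time-reversal can be simulated exactly.  Starting
# the reversal at the terminal state xT produces the time-reversed bridge.
import numpy as np

def reversed_bridge_paths(x0, xT, T=1.0, n_steps=1000, n_paths=2000, seed=0):
    """Euler-Maruyama for dY_s = grad log p_{T-s}(Y_s) ds + dW_s, Y_0 = xT.

    For Brownian motion, grad log p_t(x) = -(x - x0) / t, so the reversed
    process is pulled toward x0 as s -> T, i.e. it is a Brownian bridge.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    y = np.full(n_paths, float(xT))
    # Stop one step short of s = T: the drift ~ 1/(T - s) blows up there.
    for k in range(n_steps - 1):
        s = k * dt
        score = -(y - x0) / (T - s)   # closed-form score of p_{T-s}
        y = y + score * dt + np.sqrt(dt) * rng.standard_normal(n_paths)
    return y  # bridge samples one step before the process pins at x0

final = reversed_bridge_paths(x0=-1.0, xT=2.0)
```

In the general setting of the paper the closed-form score is unavailable, and it is precisely this quantity that the variational score matching formulation approximates; reversing the simulated paths in time gives bridge samples from $x_0$ to $x_T$.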
Related papers
- Solving Prior Distribution Mismatch in Diffusion Models via Optimal Transport [24.90486913773359]
In recent years, the knowledge surrounding diffusion models (DMs) has grown significantly, though several theoretical gaps remain.
This paper explores the deeper relationship between optimal transport (OT) theory and DMs with discrete initial distribution.
We prove that as the diffusion termination time increases, the probability flow exponentially converges to the gradient of the solution to the classical Monge-Ampère equation.
arXiv Detail & Related papers (2024-10-17T10:54:55Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on manifolds.
arXiv Detail & Related papers (2024-07-25T09:53:12Z) - Simulating infinite-dimensional nonlinear diffusion bridges [1.747623282473278]
The diffusion bridge is a type of diffusion process that conditions on hitting a specific state within a finite time period.
We present a solution by merging score-matching techniques with operator learning, enabling a direct approach to score-matching for the infinite-dimensional bridge.
arXiv Detail & Related papers (2024-05-28T16:52:52Z) - Adversarial Schrödinger Bridge Matching [66.39774923893103]
The Iterative Markovian Fitting (IMF) procedure alternates between Markovian and reciprocal projections of continuous-time processes.
We propose a novel Discrete-time IMF (D-IMF) procedure in which learning of processes is replaced by learning just a few transition probabilities in discrete time.
We show that our D-IMF procedure can provide the same quality of unpaired domain translation as the IMF, using only several generation steps instead of hundreds.
arXiv Detail & Related papers (2024-05-23T11:29:33Z) - Prompt-tuning latent diffusion models for inverse problems [72.13952857287794]
We propose a new method for solving imaging inverse problems using text-to-image latent diffusion models as general priors.
Our method, called P2L, outperforms both image- and latent-diffusion model-based inverse problem solvers on a variety of tasks, such as super-resolution, deblurring, and inpainting.
arXiv Detail & Related papers (2023-10-02T11:31:48Z) - Eliminating Lipschitz Singularities in Diffusion Models [51.806899946775076]
We show that diffusion models frequently exhibit an unbounded Lipschitz constant near the zero timestep.
This poses a threat to the stability and accuracy of the diffusion process, which relies on integral operations.
We propose a novel approach, dubbed E-TSDM, which eliminates the Lipschitz singularity of the diffusion model near the zero timestep.
arXiv Detail & Related papers (2023-06-20T03:05:28Z) - Reconstructing Graph Diffusion History from a Single Snapshot [87.20550495678907]
We propose a novel barycenter formulation for reconstructing Diffusion history from A single SnapsHot (DASH).
We prove that the estimation error of diffusion parameters is unavoidable due to the NP-hardness of diffusion parameter estimation.
We also develop an effective solver named DIffusion hiTting Times with Optimal proposal (DITTO).
arXiv Detail & Related papers (2023-06-01T09:39:32Z) - Blackout Diffusion: Generative Diffusion Models in Discrete-State Spaces [0.0]
We develop a theoretical formulation for arbitrary discrete-state Markov processes in the forward diffusion process.
As an example, we introduce "Blackout Diffusion", which learns to produce samples from an empty image instead of from noise.
arXiv Detail & Related papers (2023-05-18T16:24:12Z) - Diffusion Bridge Mixture Transports, Schr\"odinger Bridge Problems and
Generative Modeling [4.831663144935879]
We propose a novel sampling-based iterative algorithm, the iterated diffusion bridge mixture (IDBM) procedure, aimed at solving the dynamic Schrödinger bridge problem.
The IDBM procedure exhibits the attractive property of realizing a valid transport between the target probability measures at each iteration.
arXiv Detail & Related papers (2023-04-03T12:13:42Z) - Where to Diffuse, How to Diffuse, and How to Get Back: Automated
Learning for Multivariate Diffusions [22.04182099405728]
Diffusion-based generative models (DBGMs) perturb data to a target noise distribution and reverse this inference diffusion process to generate samples.
We show how to maximize a lower-bound on the likelihood for any number of auxiliary variables.
We then demonstrate how to parameterize the diffusion for a specified target noise distribution.
arXiv Detail & Related papers (2023-02-14T18:57:04Z) - Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.