Deep Generative Learning via Schr\"{o}dinger Bridge
- URL: http://arxiv.org/abs/2106.10410v1
- Date: Sat, 19 Jun 2021 03:35:42 GMT
- Title: Deep Generative Learning via Schr\"{o}dinger Bridge
- Authors: Gefei Wang, Yuling Jiao, Qian Xu, Yang Wang, Can Yang
- Abstract summary: We learn a generative model via entropy interpolation with a Schr\"{o}dinger Bridge.
We show that the generative model via Schr\"{o}dinger Bridge is comparable with state-of-the-art GANs.
- Score: 14.138796631423954
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose to learn a generative model via entropy interpolation with a
Schr\"{o}dinger Bridge. The generative learning task can be formulated as
interpolating between a reference distribution and a target distribution based
on the Kullback-Leibler divergence. At the population level, this entropy
interpolation is characterized via an SDE on $[0,1]$ with a time-varying drift
term. At the sample level, we derive our Schr\"{o}dinger Bridge algorithm by
plugging the drift term estimated by a deep score estimator and a deep density
ratio estimator into the Euler-Maruyama method. Under some mild smoothness
assumptions of the target distribution, we prove the consistency of both the
score estimator and the density ratio estimator, and then establish the
consistency of the proposed Schr\"{o}dinger Bridge approach. Our theoretical
results guarantee that the distribution learned by our approach converges to
the target distribution. Experimental results on multimodal synthetic data and
benchmark data support our theoretical findings and indicate that the
generative model via Schr\"{o}dinger Bridge is comparable with state-of-the-art
GANs, suggesting a new formulation of generative learning. We demonstrate its
usefulness in image interpolation and image inpainting.
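The sample-level procedure described above is an Euler-Maruyama discretization of an SDE on $[0,1]$ whose time-varying drift is estimated from data. The following is a minimal NumPy sketch of that discretization, with a hand-written toy drift standing in for the paper's learned score and density-ratio estimators (all names and constants here are illustrative, not taken from the authors' code):

```python
import numpy as np

def euler_maruyama(drift, x0, n_steps=100, sigma=0.5, seed=0):
    """Simulate dX_t = b(X_t, t) dt + sigma dW_t on the time interval [0, 1]."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    x = np.array(x0, dtype=float)
    for k in range(n_steps):
        t = k * dt
        x = x + drift(x, t) * dt + sigma * np.sqrt(dt) * rng.normal(size=x.shape)
    return x

# Toy time-varying drift pulling mass toward a bimodal target at +/-2.
# In the paper, this role is played by a deep score estimator combined
# with a deep density-ratio estimator plugged into the drift term.
def toy_drift(x, t):
    return 2.0 * np.sign(x) - (1.0 - t) * x

samples = euler_maruyama(toy_drift, x0=np.zeros(1000))
```

Starting all particles at zero and integrating to time 1 spreads them into the two toy modes; swapping in a learned drift is the only change the actual method requires.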
Related papers
- Latent Schrodinger Bridge: Prompting Latent Diffusion for Fast Unpaired Image-to-Image Translation [58.19676004192321]
Diffusion models (DMs), which enable both image generation from noise and inversion from data, have inspired powerful unpaired image-to-image (I2I) translation algorithms.
We tackle this problem with Schrodinger Bridges (SBs), which are stochastic differential equations (SDEs) between distributions with minimal transport cost.
Inspired by this observation, we propose Latent Schrodinger Bridges (LSBs) that approximate the SB ODE via pre-trained Stable Diffusion.
We demonstrate that our algorithm successfully conducts competitive I2I translation in an unsupervised setting at only a fraction of the cost required by previous DM-based methods.
arXiv Detail & Related papers (2024-11-22T11:24:14Z) - Latent Schr\"{o}dinger Bridge Diffusion Model for Generative Learning [7.13080924844185]
We introduce a novel generative learning methodology utilizing the Schr\"{o}dinger bridge diffusion model in latent space.
We develop a diffusion model within the latent space utilizing the Schr\"{o}dinger bridge framework.
arXiv Detail & Related papers (2024-04-20T07:38:48Z) - Soft-constrained Schrodinger Bridge: a Stochastic Control Approach [4.922305511803267]
Schr\"{o}dinger bridge can be viewed as a continuous-time control problem where the goal is to find an optimally controlled diffusion process.
We propose to generalize this problem by allowing the terminal distribution to differ from the target but penalizing the Kullback-Leibler divergence between the two distributions.
One application is the development of robust generative diffusion models.
arXiv Detail & Related papers (2024-03-04T04:10:24Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
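As background for this entry, vanilla AIS (the baseline that the constant-rate variant refines) can be sketched in a few lines: draw from a tractable start distribution, anneal along a geometric path of bridge densities while accumulating incremental importance weights, and apply an MCMC move at each temperature. The Gaussian endpoints, step sizes, and names below are illustrative choices, not taken from the paper:

```python
import numpy as np

def ais_log_weights(seed=0, n_temps=50, n_particles=200):
    """Vanilla AIS from q = N(0, 2^2) toward p = N(3, 1) on a geometric path.
    The log-mean-exp of the returned weights estimates log(Z_p / Z_q)."""
    rng = np.random.default_rng(seed)
    log_q = lambda x: -0.5 * (x / 2.0) ** 2        # unnormalized N(0, 4)
    log_p = lambda x: -0.5 * (x - 3.0) ** 2        # unnormalized N(3, 1)
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = rng.normal(0.0, 2.0, size=n_particles)     # exact draws from q
    logw = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * (log_p(x) - log_q(x))   # incremental weight
        # One Metropolis move targeting the bridge density q^(1-b) * p^b
        log_t = lambda y, b=b: (1.0 - b) * log_q(y) + b * log_p(y)
        prop = x + 0.5 * rng.normal(size=n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < log_t(prop) - log_t(x)
        x = np.where(accept, prop, x)
    return logw

logw = ais_log_weights()
```

The constant-rate variant in the paper adapts the spacing of the `betas` schedule rather than fixing it uniformly, as done here.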
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
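The mean-shift algorithm referenced in this entry is simple to state: repeatedly move a point to the kernel-weighted average of the data around it, which ascends the kernel density estimate toward a mode. A small self-contained sketch (the synthetic bimodal data and bandwidth are made up for illustration):

```python
import numpy as np

def mean_shift_step(x, data, bandwidth):
    """One mean-shift update: kernel-weighted mean of the data around x."""
    w = np.exp(-0.5 * ((data - x) / bandwidth) ** 2)
    return float((w * data).sum() / w.sum())

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.3, 100), rng.normal(4.0, 0.3, 100)])
x = 3.0                              # start inside the basin of the mode at 4
for _ in range(50):
    x = mean_shift_step(x, data, bandwidth=0.5)
# x has converged to the mode near 4
```

The paper's observation is that the optimal ODE-based sampler of a variance-exploding SDE behaves like an annealed version of exactly this mode-seeking iteration.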
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Density Ratio Estimation via Infinitesimal Classification [85.08255198145304]
We propose DRE-infty, a divide-and-conquer approach that reduces density ratio estimation (DRE) to a series of easier subproblems.
Inspired by Monte Carlo methods, we smoothly interpolate between the two distributions via an infinite continuum of intermediate bridge distributions.
We show that our approach performs well on downstream tasks such as mutual information estimation and energy-based modeling on complex, high-dimensional datasets.
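The divide-and-conquer idea in this entry can be illustrated exactly with known Gaussians: a hard log-ratio between two distant distributions telescopes into a sum of adjacent, easier log-ratios over bridge distributions. In DRE-infty each adjacent ratio is estimated by a classifier; in this toy sketch the densities are closed-form, so the telescoping identity can be checked directly:

```python
import numpy as np

def log_gauss(x, mu, sigma=1.0):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

x = np.linspace(-1.0, 7.0, 5)
mus = np.linspace(0.0, 6.0, 7)   # bridge means from q = N(0,1) to p = N(6,1)
# Sum of adjacent log-ratios telescopes to log p(x) - log q(x).
bridged = sum(log_gauss(x, mus[i + 1]) - log_gauss(x, mus[i])
              for i in range(len(mus) - 1))
direct = log_gauss(x, 6.0) - log_gauss(x, 0.0)
# bridged and direct agree to numerical precision
```

Each adjacent pair overlaps far more than the two endpoints do, which is what makes the per-bridge classification subproblems easier than the original one.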
arXiv Detail & Related papers (2021-11-22T06:26:29Z) - Generative Learning With Euler Particle Transport [14.557451744544592]
We propose an Euler particle transport (EPT) approach for generative learning.
The proposed approach is motivated by the problem of finding an optimal transport map from a reference distribution to a target distribution.
We show that the proposed density-ratio (difference) estimators do not suffer from the "curse of dimensionality" if data is supported on a lower-dimensional manifold.
arXiv Detail & Related papers (2020-12-11T03:10:53Z) - Learning Implicit Generative Models with Theoretical Guarantees [12.761710596142109]
We propose a unified framework for implicit generative modeling (UnifiGem).
UnifiGem integrates approaches from optimal transport, numerical ODE, density-ratio (density-difference) estimation and deep neural networks.
Experimental results on both synthetic datasets and real benchmark datasets support our theoretical findings and demonstrate the effectiveness of UnifiGem.
arXiv Detail & Related papers (2020-02-07T15:55:48Z) - Generative Modeling with Denoising Auto-Encoders and Langevin Sampling [88.83704353627554]
We show that both DAE and DSM provide estimates of the score of the smoothed population density.
We then apply our results to the homotopy method of arXiv:1907.05600 and provide theoretical justification for its empirical success.
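Given any score estimate, unadjusted Langevin dynamics turns it into a sampler, which is the mechanism this entry analyzes. In the sketch below, the closed-form score of a Gaussian stands in for a DAE/DSM-derived estimate; the step size and step count are illustrative:

```python
import numpy as np

def langevin(score, x0, n_steps=1000, step=0.01, seed=0):
    """Unadjusted Langevin: x += (step/2)*score(x) + sqrt(step)*noise."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + 0.5 * step * score(x) + np.sqrt(step) * rng.normal(size=x.shape)
    return x

# Exact score of N(2, 1); on real data a trained DAE/DSM network replaces it.
samples = langevin(lambda x: -(x - 2.0), x0=np.zeros(2000))
```

Because the score of the smoothed density is all the sampler needs, consistency of the score estimate (the paper's result) transfers directly to the quality of the generated samples.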
arXiv Detail & Related papers (2020-01-31T23:50:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.