Applying Regularized Schrödinger-Bridge-Based Stochastic Process in
Generative Modeling
- URL: http://arxiv.org/abs/2208.07131v1
- Date: Mon, 15 Aug 2022 11:52:33 GMT
- Title: Applying Regularized Schrödinger-Bridge-Based Stochastic Process in
Generative Modeling
- Authors: Ki-Ung Song
- Abstract summary: This study aims to reduce the required number of timesteps and training time, proposing regularization terms that keep the bidirectional processes consistent with a reduced number of timesteps.
Applying this regularization to various tasks confirms the possibility of generative modeling based on a process with a faster sampling speed.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Compared to the existing function-based models in deep generative
modeling, the recently proposed diffusion models have achieved outstanding
performance with a stochastic-process-based approach. However, this approach
requires a long sampling time because its discretization involves many
timesteps. Schrödinger bridge (SB)-based models attempt to tackle this problem
by training bidirectional stochastic processes between distributions, but they
still sample slowly compared to generative models such as generative
adversarial networks, and training the bidirectional stochastic processes
makes their training time relatively long. This study therefore aims to reduce
the required number of timesteps and training time, proposing regularization
terms for the existing SB models that keep the bidirectional stochastic
processes consistent and stable with a reduced number of timesteps. The
individual regularization terms are integrated into a single term to make
training more efficient in computation time and memory usage. Applying this
regularized stochastic process to various generation tasks yields the desired
translations between different distributions, confirming the possibility of
generative modeling based on a stochastic process with a faster sampling
speed. The code is available at
https://github.com/KiUngSong/RSB.
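As a rough illustration of the consistency idea in the abstract, the sketch below penalizes disagreement between a forward trajectory and the time-reversed backward trajectory at shared timesteps. The function name, loss form, and weighting are hypothetical stand-ins, not the paper's exact formulation (see the RSB repository for that):

```python
import numpy as np

def combined_regularizer(forward_traj, backward_traj, lam=1.0):
    """Hypothetical single combined regularization term: a consistency
    penalty comparing the forward trajectory with the time-reversed
    backward trajectory, so both directions agree even with few steps."""
    reversed_backward = backward_traj[::-1]  # align the two time axes
    return lam * float(np.mean((forward_traj - reversed_backward) ** 2))

rng = np.random.default_rng(0)
fwd = rng.normal(size=(8, 2))            # 8 timesteps, 2-dim states
perfect_bwd = fwd[::-1].copy()           # an exact time reversal
noisy_bwd = perfect_bwd + 0.1 * rng.normal(size=fwd.shape)
print(combined_regularizer(fwd, perfect_bwd))   # 0.0 for perfect agreement
print(combined_regularizer(fwd, noisy_bwd) > 0.0)
```

Folding all penalties into one scalar term like this is what lets a single backward pass cover the whole regularizer, which is the efficiency point the abstract makes.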
Related papers
- One Step Diffusion via Shortcut Models [109.72495454280627]
We introduce shortcut models, a family of generative models that use a single network and training phase to produce high-quality samples.
Shortcut models condition the network on the current noise level and also on the desired step size, allowing the model to skip ahead in the generation process.
Compared to distillation, shortcut models reduce complexity to a single network and training phase and additionally allow varying step budgets at inference time.
arXiv Detail & Related papers (2024-10-16T13:34:40Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Stable generative modeling using Schrödinger bridges [0.22499166814992438]
We propose a generative model combining Schrödinger bridges and Langevin dynamics.
Our framework can be naturally extended to conditional sample generation and to Bayesian inference problems.
arXiv Detail & Related papers (2024-01-09T06:15:45Z) - Stochastic Interpolants: A Unifying Framework for Flows and Diffusions [16.95541777254722]
A class of generative models that unifies flow-based and diffusion-based methods is introduced.
These models extend the framework proposed in Albergo & Vanden-Eijnden (2023), enabling the use of a broad class of continuous-time processes called 'stochastic interpolants'.
These interpolants are built by combining data from the two prescribed densities with an additional latent variable that shapes the bridge in a flexible way.
arXiv Detail & Related papers (2023-03-15T17:43:42Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
- Markov Chain Monte Carlo for Continuous-Time Switching Dynamical Systems [26.744964200606784]
We propose a novel inference algorithm utilizing a Markov Chain Monte Carlo approach.
The presented Gibbs sampler makes it possible to efficiently obtain samples from the exact continuous-time posterior processes.
arXiv Detail & Related papers (2022-05-18T09:03:00Z)
- Traversing Time with Multi-Resolution Gaussian Process State-Space
Models [17.42262122708566]
We propose a novel Gaussian process state-space architecture composed of multiple components, each trained on a different resolution, to model effects on different timescales.
We benchmark our novel method on semi-synthetic data and on an engine modeling task.
In both experiments, our approach compares favorably against its state-of-the-art alternatives that operate on a single time-scale only.
arXiv Detail & Related papers (2021-12-06T18:39:27Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Score-Based Generative Modeling through Stochastic Differential
Equations [114.39209003111723]
We present a differential equation that transforms a complex data distribution to a known prior distribution by injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
arXiv Detail & Related papers (2020-11-26T19:39:10Z)
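The noise-injecting forward SDE described in the last entry can be illustrated with a minimal Euler-Maruyama simulation of an Ornstein-Uhlenbeck process, a standard variance-preserving choice whose marginal tends to N(0, 1). The function name and parameters below are illustrative, not taken from any of the papers above:

```python
import numpy as np

def forward_ou_sde(x0, beta=1.0, n_steps=1000, t_end=5.0, seed=0):
    """Euler-Maruyama simulation of dx = -(beta/2) x dt + sqrt(beta) dW.
    This Ornstein-Uhlenbeck process drives any starting distribution
    toward the standard normal prior."""
    rng = np.random.default_rng(seed)
    dt = t_end / n_steps
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        # drift step plus Gaussian increment with variance beta * dt
        x = x - 0.5 * beta * x * dt + np.sqrt(beta * dt) * rng.normal(size=x.shape)
    return x

# A point mass at 4.0 ends up close to N(0, 1) after integrating to t = 5.
samples = forward_ou_sde(np.full(10_000, 4.0))
print(samples.mean(), samples.std())
```

The reverse-time SDE that the score-based papers train would run this discretization backwards, replacing the known drift with a learned score term at each step.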
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.