Sequential Flow Straightening for Generative Modeling
- URL: http://arxiv.org/abs/2402.06461v2
- Date: Thu, 15 Feb 2024 00:44:01 GMT
- Title: Sequential Flow Straightening for Generative Modeling
- Authors: Jongmin Yoon and Juho Lee
- Abstract summary: We propose SeqRF, a learning technique that straightens the probability flow to reduce the global truncation error.
We achieve superior results on the CIFAR-10, CelebA-$64 \times 64$, and LSUN-Church datasets.
- Score: 14.521246785215808
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Straightening the probability flow of continuous-time generative models,
such as diffusion models or flow-based models, is key to fast sampling with
numerical solvers. Existing methods learn a linear path by directly constructing
the probability path as a joint distribution between the noise and data
distributions. One key reason for the slow sampling speed of the ODE-based
solvers that simulate these generative models is the global truncation error of
the ODE solver: the high curvature of the ODE trajectory inflates the truncation
error of numerical solvers in the low-NFE regime. To address this challenge, we
propose SeqRF, a learning technique that straightens the probability flow to
reduce the global truncation error, thereby accelerating sampling and improving
synthesis quality. In both theoretical and empirical studies, we first establish
the straightening property of SeqRF. Through empirical evaluations of SeqRF on
flow-based generative models, we achieve superior results on the CIFAR-10,
CelebA-$64 \times 64$, and LSUN-Church datasets.
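The abstract does not spell out SeqRF's training procedure, but the straightening idea it builds on can be illustrated with a reflow-style objective in the spirit of rectified flow. The sketch below is illustrative only: `velocity_net`, `sample_ode`, and `reflow_loss` are hypothetical names, shapes assume flattened data, and SeqRF's sequential, segment-wise details are not reproduced here.

```python
# Illustrative reflow-style straightening (PyTorch); NOT the authors' SeqRF code.
import torch

def sample_ode(velocity_net, x0, n_steps=100):
    """Euler-integrate dx/dt = v(x, t) from t=0 (noise) to t=1 (data).
    With n_steps Euler steps, the global truncation error is O(1/n_steps)
    and grows with the curvature of the trajectory."""
    x, dt = x0, 1.0 / n_steps
    for k in range(n_steps):
        t = torch.full((x.shape[0], 1), k * dt, device=x.device)
        x = x + velocity_net(x, t) * dt
    return x

def reflow_loss(velocity_net, x0):
    """One straightening step: re-pair noise x0 with its own ODE output x1,
    then regress the velocity field onto the straight displacement x1 - x0."""
    with torch.no_grad():
        x1 = sample_ode(velocity_net, x0)              # synthetic coupling
    t = torch.rand(x0.shape[0], 1, device=x0.device)   # random time per sample
    xt = (1.0 - t) * x0 + t * x1                       # linear interpolant
    target = x1 - x0                                   # constant-speed straight path
    return ((velocity_net(xt, t) - target) ** 2).mean()
```

A perfectly straight trajectory has zero curvature, so even a single Euler step incurs no truncation error; this is why straightening enables sampling in the low-NFE regime.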
Related papers
- Straightness of Rectified Flow: A Theoretical Insight into Wasserstein Convergence [54.580605276017096]
Diffusion models have emerged as a powerful tool for image generation and denoising.
Recently, Liu et al. introduced Rectified Flow (RF), a novel alternative generative model.
RF aims to learn straight flow trajectories from noise to data using a sequence of convex optimization problems.
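Concretely, the objective of Liu et al. is a least-squares regression of a velocity field onto the linear interpolant between a coupled noise sample $X_0$ and data sample $X_1$ (the standard rectified-flow formulation, reproduced here for reference):

```latex
\min_{v} \int_0^1 \mathbb{E}\left[ \bigl\| (X_1 - X_0) - v(X_t, t) \bigr\|^2 \right] \mathrm{d}t,
\qquad X_t = t\,X_1 + (1 - t)\,X_0 .
```

With the coupling fixed, each such problem is convex in $v$; re-solving it with couplings generated by the previous flow yields the sequence of convex problems mentioned above.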
arXiv Detail & Related papers (2024-10-19T02:36:11Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Characteristic Learning for Provable One Step Generation [3.0457054308731215]
We propose a one-step generative model that combines the efficiency of sampling in Generative Adversarial Networks (GANs) with the stable performance of flow-based models.
Our model is driven by characteristics, along which the probability density transport can be described by ordinary differential equations (ODEs).
We analyze the errors in velocity matching, Euler discretization, and characteristic fitting to establish a non-asymptotic convergence rate for the characteristic generator in 2-Wasserstein distance.
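The Euler discretization appearing in this error decomposition is the standard scheme; for a learned velocity field $v_\theta$ on $N$ uniform steps it reads:

```latex
x_{k+1} = x_k + h\, v_\theta(x_k, t_k), \qquad t_k = k h, \quad h = 1/N,
```

with local truncation error $O(h^2)$ and global error $O(h)$ for sufficiently smooth velocity fields.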
arXiv Detail & Related papers (2024-05-09T02:41:42Z)
- A prior regularized full waveform inversion using generative diffusion models [0.5156484100374059]
Full waveform inversion (FWI) has the potential to provide high-resolution subsurface model estimations.
Due to limitations in observation, e.g., regional noise, limited shots or receivers, and band-limited data, it is hard to obtain the desired high-resolution model with FWI.
We propose a new paradigm for FWI regularized by generative diffusion models.
arXiv Detail & Related papers (2023-06-22T10:10:34Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
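For reference, the probability flow ODE associated with a variance-exploding SDE with noise schedule $\sigma(t)$ takes the standard form of Song et al.:

```latex
\frac{\mathrm{d}x}{\mathrm{d}t} = -\frac{1}{2}\, \frac{\mathrm{d}\sigma^2(t)}{\mathrm{d}t}\, \nabla_x \log p_t(x).
```

Each step moves $x$ along the score direction toward higher-density regions, which intuitively parallels the mode-seeking mean-shift update on a kernel density estimate.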
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Minimizing Trajectory Curvature of ODE-based Generative Models [45.89620603363946]
Recent generative models, such as diffusion models, rectified flows, and flow matching, define a generative process as a time reversal of a fixed forward process.
We present an efficient method of training the forward process to minimize the curvature of generative trajectories without any ODE/SDE simulation.
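One common way to quantify the curvature of such trajectories is the straightness measure used in rectified flow; the paper's exact objective may differ, but the quantity below conveys the idea:

```latex
S(Z) = \int_0^1 \mathbb{E}\left[ \bigl\| (Z_1 - Z_0) - \dot{Z}_t \bigr\|^2 \right] \mathrm{d}t,
```

which vanishes exactly when every trajectory is a straight line traversed at constant speed, the case in which a single Euler step is exact.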
arXiv Detail & Related papers (2023-01-27T21:52:03Z)
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)