Minimizing Trajectory Curvature of ODE-based Generative Models
- URL: http://arxiv.org/abs/2301.12003v3
- Date: Thu, 25 May 2023 11:33:13 GMT
- Title: Minimizing Trajectory Curvature of ODE-based Generative Models
- Authors: Sangyun Lee, Beomsu Kim, Jong Chul Ye
- Abstract summary: Recent generative models, such as diffusion models, rectified flows, and flow matching, define a generative process as a time reversal of a fixed forward process.
We present an efficient method of training the forward process to minimize the curvature of generative trajectories without any ODE/SDE simulation.
- Score: 45.89620603363946
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Recent ODE/SDE-based generative models, such as diffusion models, rectified
flows, and flow matching, define a generative process as a time reversal of a
fixed forward process. Even though these models show impressive performance on
large-scale datasets, numerical simulation requires multiple evaluations of a
neural network, leading to a slow sampling speed. We attribute this to the
high curvature of the learned generative trajectories, which is directly
related to the truncation error of a numerical solver. Based on the
relationship between the forward process and the curvature, here we present an
efficient method of training the forward process to minimize the curvature of
generative trajectories without any ODE/SDE simulation. Experiments show that
our method achieves a lower curvature than previous models and, therefore,
decreased sampling costs while maintaining competitive performance. Code is
available at https://github.com/sangyun884/fast-ode.
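The curvature argument above can be seen in a toy numerical experiment (an illustrative sketch, not the paper's code): the local truncation error of Euler's method scales with the trajectory's second time-derivative, i.e. its curvature, so a straight path is integrated exactly while a curved one accrues error at the same step budget.

```python
import numpy as np

def euler_solve(f, x0, n_steps):
    """Integrate dx/dt = f(x, t) from t = 0 to t = 1 with n_steps Euler steps."""
    x, dt = x0, 1.0 / n_steps
    for i in range(n_steps):
        x = x + dt * f(x, i * dt)
    return x

x0, x1 = 0.0, 1.0

# Straight trajectory x(t) = (1 - t) * x0 + t * x1: constant velocity,
# zero curvature, so Euler is exact at any step count.
straight = euler_solve(lambda x, t: x1 - x0, x0, n_steps=4)

# Curved trajectory x(t) = sin(pi * t / 2): same endpoints, nonzero
# curvature, so four Euler steps leave a visible truncation error.
curved = euler_solve(lambda x, t: 0.5 * np.pi * np.cos(0.5 * np.pi * t),
                     x0, n_steps=4)

straight_err = abs(straight - x1)   # exactly 0 for the straight path
curved_err = abs(curved - x1)       # clearly nonzero at only 4 steps
```

Minimizing curvature of the generative trajectory therefore directly reduces the number of solver steps needed for a given error tolerance.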
Related papers
- Flow Map Matching [15.520853806024943]
Flow map matching is an algorithm that learns the two-time flow map of an underlying ordinary differential equation.
We show that flow map matching leads to high-quality samples with significantly reduced sampling cost compared to diffusion or interpolant methods.
arXiv Detail & Related papers (2024-06-11T17:41:26Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Sequential Flow Straightening for Generative Modeling [14.521246785215808]
We propose SeqRF, a learning technique that straightens the probability flow to reduce the global truncation error.
We achieve superior results on the CIFAR-10, CelebA-$64 \times 64$, and LSUN-Church datasets.
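The flow-straightening objective that methods like the above build on can be sketched as follows (illustrative code, not the paper's; all names are hypothetical): a velocity field $v(x_t, t)$ is regressed toward the constant displacement $x_1 - x_0$ along linear interpolants $x_t = (1-t)x_0 + t x_1$.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow_matching_loss(v, x0, x1):
    """Rectified-flow / flow-matching regression loss on linear interpolants."""
    t = rng.uniform(size=(len(x0), 1))
    xt = (1 - t) * x0 + t * x1        # point on the straight path
    target = x1 - x0                  # velocity of that straight path
    return np.mean((v(xt, t) - target) ** 2)

x0 = rng.standard_normal((256, 2))          # noise endpoints
x1 = rng.standard_normal((256, 2)) + 5.0    # toy "data" endpoints

zero_model = lambda xt, t: np.zeros_like(xt)
mean_model = lambda xt, t: np.full_like(xt, 5.0)  # predicts the mean shift

loss_zero = flow_matching_loss(zero_model, x0, x1)
loss_mean = flow_matching_loss(mean_model, x0, x1)
# The constant mean-displacement model already fits far better than zero,
# since E[x1 - x0] = (5, 5) for this toy data.
```

A velocity field that matches these straight-line targets yields low-curvature generative trajectories, which is exactly what reduces the global truncation error.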
arXiv Detail & Related papers (2024-02-09T15:09:38Z)
- Diffusion-Model-Assisted Supervised Learning of Generative Models for Density Estimation [10.793646707711442]
We present a framework for training generative models for density estimation.
We use the score-based diffusion model to generate labeled data.
Once the labeled data are generated, we can train a simple fully connected neural network to learn the generative model in a supervised manner.
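The two-stage idea above can be sketched with a toy example (illustrative, not the paper's code): stage one uses some pretrained sampler to produce labeled (noise, sample) pairs; stage two fits a simple supervised regressor to those pairs. Here the "sampler" is a stand-in affine map and the "network" is linear least squares.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for a pretrained sampler: a fixed affine map z -> A z + b.
A = np.array([[2.0, 0.0], [1.0, 3.0]])
b = np.array([1.0, -1.0])
sampler = lambda z: z @ A.T + b

# Stage 1: draw noise and label it with the sampler's outputs.
z = rng.standard_normal((500, 2))
x = sampler(z)

# Stage 2: supervised fit of the generator from (z, x) pairs.
Z = np.hstack([z, np.ones((len(z), 1))])     # append bias column
W, *_ = np.linalg.lstsq(Z, x, rcond=None)    # W stacks [A^T; b]

A_hat, b_hat = W[:2].T, W[2]                 # recovered map parameters
```

Because the labeled pairs fully determine the map, the supervised fit recovers the generator without any further ODE/SDE simulation at training time.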
arXiv Detail & Related papers (2023-10-22T23:56:19Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
- Non-intrusive Nonlinear Model Reduction via Machine Learning Approximations to Low-dimensional Operators [0.0]
We propose a method that enables traditionally intrusive reduced-order models to be accurately approximated in a non-intrusive manner.
The approach approximates the low-dimensional operators associated with projection-based reduced-order models (ROMs) using modern machine-learning regression techniques.
In addition to enabling non-intrusivity, we demonstrate that the approach also leads to very low computational complexity, achieving up to a $1000\times$ reduction in run time.
arXiv Detail & Related papers (2021-06-17T17:04:42Z)
- Score-Based Generative Modeling through Stochastic Differential Equations [114.39209003111723]
We present a differential equation that transforms a complex data distribution to a known prior distribution by injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
arXiv Detail & Related papers (2020-11-26T19:39:10Z)
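The reverse-time SDE sampling described in that last entry can be illustrated in one dimension with a closed-form score (a toy sketch, not the paper's code: the "data" is a point mass at mu, so no neural network is needed and all names are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma_max = 3.0, 10.0

# Forward VE-type SDE dx = sqrt(d sigma^2(t)/dt) dW with sigma(t) = sigma_max * t
# gives p_t = N(mu, (sigma_max * t)^2) for point-mass data, so the score is analytic.
def score(x, t):
    return (mu - x) / (sigma_max * t) ** 2

n, steps, t_end = 5000, 1000, 1e-2
ts = np.linspace(1.0, t_end, steps + 1)
x = sigma_max * rng.standard_normal(n)   # approximate prior N(0, sigma_max^2)

# Euler-Maruyama on the reverse SDE dx = -g(t)^2 * score(x, t) dt + g(t) dW-bar,
# integrated from t = 1 down to t = t_end (dt < 0 flips the drift sign).
for t0, t1 in zip(ts[:-1], ts[1:]):
    dt = t1 - t0                          # negative step
    g2 = 2.0 * sigma_max**2 * t0          # g(t)^2 = d sigma^2(t) / dt
    x = x - g2 * score(x, t0) * dt + np.sqrt(g2 * -dt) * rng.standard_normal(n)

# Samples concentrate near the data point mu with spread ~ sigma(t_end).
```

Each of the 1000 solver steps here costs one score evaluation; with a neural score network that evaluation dominates, which is exactly the sampling cost the curvature-minimization and straightening papers above aim to cut.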
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.