Order-Optimal Sample Complexity of Rectified Flows
- URL: http://arxiv.org/abs/2601.20250v1
- Date: Wed, 28 Jan 2026 04:55:14 GMT
- Title: Order-Optimal Sample Complexity of Rectified Flows
- Authors: Hari Krishna Sahoo, Mudit Gaur, Vaneet Aggarwal
- Abstract summary: We study rectified flow models, which constrain transport trajectories to be linear from the base distribution to the data distribution. This structural restriction greatly accelerates sampling, often enabling high-quality generation with a single step.
- Score: 43.61958734990224
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, flow-based generative models have shown superior efficiency compared to diffusion models. In this paper, we study rectified flow models, which constrain transport trajectories to be linear from the base distribution to the data distribution. This structural restriction greatly accelerates sampling, often enabling high-quality generation with a single Euler step. Under standard assumptions on the neural network classes used to parameterize the velocity field and data distribution, we prove that rectified flows achieve sample complexity $\tilde{O}(\varepsilon^{-2})$. This improves on the best known $O(\varepsilon^{-4})$ bounds for flow matching models and matches the optimal rate for mean estimation. Our analysis exploits the particular structure of rectified flows: because the model is trained with a squared loss along linear paths, the associated hypothesis class admits a sharply controlled localized Rademacher complexity. This yields the improved, order-optimal sample complexity and provides a theoretical explanation for the strong empirical performance of rectified flow models.
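The objective the analysis exploits is easy to state in code. Below is a minimal sketch of rectified-flow training (squared loss along linear interpolation paths) and single-Euler-step sampling; the `VelocityNet` architecture, the Gaussian base, and all shapes and hyperparameters are illustrative assumptions, not details from the paper.

```python
# Minimal rectified-flow sketch; network size, base distribution, and
# time sampling are illustrative assumptions.
import torch
import torch.nn as nn

class VelocityNet(nn.Module):
    """Small MLP velocity field v_theta(x, t) for d-dimensional data."""
    def __init__(self, dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, t], dim=-1))

def rectified_flow_loss(v_theta: nn.Module, x1: torch.Tensor) -> torch.Tensor:
    """Squared loss along linear paths x_t = (1 - t) * x0 + t * x1.

    The regression target is the constant slope x1 - x0 of the straight
    line; this is the structure the localized-Rademacher argument leverages.
    """
    x0 = torch.randn_like(x1)          # sample from the Gaussian base
    t = torch.rand(x1.shape[0], 1)     # uniform time in [0, 1]
    xt = (1.0 - t) * x0 + t * x1       # point on the linear path
    target = x1 - x0                   # constant velocity of that path
    return ((v_theta(xt, t) - target) ** 2).sum(dim=-1).mean()

@torch.no_grad()
def sample_one_step(v_theta: nn.Module, n: int, dim: int) -> torch.Tensor:
    """One Euler step from t = 0 to t = 1: x1 = x0 + 1 * v(x0, 0)."""
    x0 = torch.randn(n, dim)
    return x0 + v_theta(x0, torch.zeros(n, 1))
```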
Related papers
- FlowConsist: Make Your Flow Consistent with Real Trajectory [99.22869983378062]
We argue that current fast-flow training paradigms suffer from two fundamental issues. Conditional velocities constructed from randomly paired noise-data samples introduce systematic trajectory drift. We propose FlowConsist, a training framework designed to enforce trajectory consistency in fast flows.
arXiv Detail & Related papers (2026-02-06T03:24:23Z)
- Generative Modeling with Continuous Flows: Sample Complexity of Flow Matching [60.37045080890305]
We provide the first analysis of the sample complexity for flow-matching-based generative models. We decompose the velocity field estimation error into neural-network approximation error, statistical error due to the finite sample size, and optimization error due to the finite number of optimization steps for estimating the velocity field.
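Schematically, that decomposition has the following three-term shape (the notation is ours for illustration; the paper's precise norms and constants differ):

```latex
% Illustrative decomposition: F is the network class, v* the true
% velocity field, v-hat the estimator from n samples and T steps.
\[
  \operatorname{err}(\hat{v})
  \;\lesssim\;
  \underbrace{\inf_{v \in \mathcal{F}} \lVert v - v^{\ast} \rVert}_{\text{network approximation}}
  \;+\;
  \underbrace{\mathcal{E}_{\mathrm{stat}}(n)}_{\text{finite sample size } n}
  \;+\;
  \underbrace{\mathcal{E}_{\mathrm{opt}}(T)}_{\text{finite optimization steps } T}
\]
```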
arXiv Detail & Related papers (2025-12-01T05:14:25Z)
- Theoretical Guarantees for High Order Trajectory Refinement in Generative Flows [40.884514919698596]
Flow matching has emerged as a powerful framework for generative modeling. We prove that higher-order flow matching preserves worst-case optimality as a distribution estimator.
arXiv Detail & Related papers (2025-03-12T05:07:07Z)
- Elucidating Flow Matching ODE Dynamics with Respect to Data Geometries and Denoisers [10.947094609205765]
Flow matching (FM) models extend ODE-sampler-based diffusion models into a general framework. A rigorous theoretical analysis of FM models is essential for sample quality, stability, and broader applicability. In this paper, we advance the theory of FM models through a comprehensive analysis of sample trajectories.
arXiv Detail & Related papers (2024-12-25T01:17:15Z)
- On the Wasserstein Convergence and Straightness of Rectified Flow [54.580605276017096]
Rectified Flow (RF) is a generative model that aims to learn straight flow trajectories from noise to data. We provide a theoretical analysis of the Wasserstein distance between the sampling distribution of RF and the target distribution. We present general conditions guaranteeing uniqueness and straightness of 1-RF, which is in line with previous empirical findings.
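Straightness is also easy to measure empirically. A common functional in the rectified-flow literature is $S = \int_0^1 \mathbb{E}\,\lVert (Z_1 - Z_0) - \dot{Z}_t \rVert^2 \, dt$, which vanishes exactly on straight paths; below is a sketch of its discretized estimate (the trajectory tensor layout is an assumption for illustration).

```python
import torch

def straightness(xs: torch.Tensor) -> torch.Tensor:
    """Discretized estimate of S = int_0^1 E||(Z1 - Z0) - dZ/dt||^2 dt.

    xs: trajectories of shape (K + 1, n, d), sampled at K + 1 uniform
        time points.  S == 0 if and only if every path is straight.
    """
    K = xs.shape[0] - 1
    slope = xs[-1] - xs[0]            # Z1 - Z0 per path, shape (n, d)
    vel = (xs[1:] - xs[:-1]) * K      # finite-difference velocity, (K, n, d)
    return ((vel - slope) ** 2).sum(dim=-1).mean()
```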
arXiv Detail & Related papers (2024-10-19T02:36:11Z)
- Characteristic Learning for Provable One Step Generation [12.620728925515012]
We propose a one-step generative model that combines the efficiency of sampling in Generative Adversarial Networks (GANs) with the stable performance of flow-based models. Our model is driven by characteristics, along which the probability density transport can be described by ordinary differential equations (ODEs). A deep neural network is then trained to fit these characteristics, creating a one-step map that pushes a simple Gaussian distribution to the target distribution.
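The final step, fitting a network to the characteristics, is essentially endpoint regression. A minimal sketch, assuming a `solve_ode` helper that integrates the probability-flow ODE (the helper, optimizer, and loop are illustrative, not the paper's exact procedure):

```python
import torch
import torch.nn as nn

def fit_one_step_map(g: nn.Module, solve_ode, x0: torch.Tensor,
                     steps: int = 1000, lr: float = 1e-4) -> nn.Module:
    """Regress g onto ODE endpoints so that g(x0) approximates x1.

    solve_ode(x0) -> x1 is assumed to follow the characteristics from
    the Gaussian sample x0 to the data side; g then replaces the whole
    ODE solve with a single forward pass.
    """
    opt = torch.optim.Adam(g.parameters(), lr=lr)
    with torch.no_grad():
        x1 = solve_ode(x0)                  # endpoints of the characteristics
    for _ in range(steps):
        loss = ((g(x0) - x1) ** 2).mean()   # squared endpoint regression
        opt.zero_grad()
        loss.backward()
        opt.step()
    return g
```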
arXiv Detail & Related papers (2024-05-09T02:41:42Z)
- Sequential Flow Straightening for Generative Modeling [14.521246785215808]
We propose SeqRF, a learning technique that straightens the probability flow to reduce the global truncation error.
We achieve results surpassing prior work on the CIFAR-10, CelebA-$64 \times 64$, and LSUN-Church datasets.
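The connection between straightening and truncation error is standard numerical analysis: one explicit Euler step on $\dot{x}(t) = v(x, t)$ incurs a local error bounded by the trajectory's curvature, so straight trajectories are integrated exactly.

```latex
% Local truncation error of one Euler step of size h:
\[
  \bigl\lVert x(t + h) - \bigl( x(t) + h\, v(x(t), t) \bigr) \bigr\rVert
  \;\le\; \frac{h^{2}}{2} \sup_{s \in [t,\, t + h]} \lVert \ddot{x}(s) \rVert ,
\]
% so if \ddot{x} \equiv 0 (a straight path), even a single step is exact.
```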
arXiv Detail & Related papers (2024-02-09T15:09:38Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
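For reference, the "traditional estimator" baseline here is plain Monte Carlo over flow samples; a minimal sketch follows (the `flow_sample` and `in_region` interfaces are assumptions for illustration, and this is the baseline being improved upon, not the paper's method):

```python
import torch

@torch.no_grad()
def mc_cdf_estimate(flow_sample, in_region, n: int = 10_000) -> float:
    """Monte Carlo estimate of P(X in R) for a flow model.

    flow_sample(n) is assumed to draw n samples from the flow, and
    in_region(x) to return a boolean mask for membership in R.
    """
    x = flow_sample(n)
    return in_region(x).float().mean().item()
```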
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of their guaranteed expressivity, these models can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)