Low-Dimensional Adaptation of Rectified Flow: A New Perspective through the Lens of Diffusion and Stochastic Localization
- URL: http://arxiv.org/abs/2601.15500v1
- Date: Wed, 21 Jan 2026 22:09:27 GMT
- Title: Low-Dimensional Adaptation of Rectified Flow: A New Perspective through the Lens of Diffusion and Stochastic Localization
- Authors: Saptarshi Roy, Alessandro Rinaldo, Purnamrita Sarkar
- Abstract summary: Rectified flow (RF) has gained considerable popularity due to its generation efficiency and state-of-the-art performance. In this paper, we investigate the degree to which RF automatically adapts to the intrinsic low dimensionality of the support of the target distribution to accelerate sampling. We show that, using a carefully designed time-discretization scheme and sufficiently accurate drift estimates, the RF sampler enjoys an iteration complexity of order $O(k/\varepsilon)$.
- Score: 59.04314685837778
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent years, Rectified flow (RF) has gained considerable popularity largely due to its generation efficiency and state-of-the-art performance. In this paper, we investigate the degree to which RF automatically adapts to the intrinsic low dimensionality of the support of the target distribution to accelerate sampling. We show that, using a carefully designed choice of the time-discretization scheme and with sufficiently accurate drift estimates, the RF sampler enjoys an iteration complexity of order $O(k/\varepsilon)$ (up to log factors), where $\varepsilon$ is the precision in total variation distance and $k$ is the intrinsic dimension of the target distribution. In addition, we show that the denoising diffusion probabilistic model (DDPM) procedure is equivalent to a stochastic version of RF by establishing a novel connection between these processes and stochastic localization. Building on this connection, we further design a stochastic RF sampler that also adapts to the low-dimensionality of the target distribution, under milder requirements on the accuracy of the drift estimates and with a specific time schedule. Simulations on synthetic data and text-to-image experiments illustrate the improved performance of the proposed samplers when implemented with the newly designed time-discretization schedules.
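To make the abstract's setup concrete, the following is a minimal sketch of a rectified-flow Euler sampler run on a nonuniform time grid. Everything here is an illustrative assumption rather than the paper's construction: the target is a 1-D Gaussian (so the RF drift $E[X_1 - X_0 \mid X_t = x]$ has a closed form), and the quadratic schedule merely imitates the general idea of concentrating steps where the drift changes fastest.

```python
# Sketch of a rectified-flow (RF) Euler sampler with a nonuniform time grid.
# Assumptions (illustrative only): X0 ~ N(0,1), target X1 ~ N(mu, sigma^2),
# independent coupling, and a hypothetical quadratic time schedule.
import numpy as np

MU, SIGMA = 2.0, 0.5  # parameters of the toy Gaussian target

def drift(x, t):
    """Closed-form RF drift E[X1 - X0 | X_t = x] for the Gaussian pair above,
    where X_t = (1 - t) X0 + t X1 is the linear interpolation."""
    a, b = 1.0 - t, t
    var_t = a**2 + (b * SIGMA) ** 2          # Var(X_t) under the interpolation
    return MU + (b * SIGMA**2 - a) * (x - b * MU) / var_t

def rf_sample(n=20000, steps=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)               # initialize at X0 ~ N(0, 1)
    # Quadratic schedule: step sizes shrink as t -> 1, mimicking schedules
    # that refine the grid near the data end of the trajectory.
    u = np.arange(steps + 1) / steps
    ts = 1.0 - (1.0 - u) ** 2
    for i in range(steps):
        dt = ts[i + 1] - ts[i]
        x = x + dt * drift(x, ts[i])         # forward Euler step along the RF ODE
    return x

samples = rf_sample()
```

Since the drift is affine in $x$ here, the exact flow maps $N(0,1)$ onto $N(\mu, \sigma^2)$, so the empirical mean and standard deviation of `samples` should land near $2.0$ and $0.5$; comparing against a uniform grid with the same step budget is a simple way to probe the effect of the schedule.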
Related papers
- Efficient Real-Time Adaptation of ROMs for Unsteady Flows Using Data Assimilation [7.958594167693376]
We propose an efficient retraining strategy for a parameterized Reduced Order Model (ROM). The strategy attains accuracy comparable to full retraining while requiring only a fraction of the computational time. We show that, for the dynamical system considered, the dominant source of error in out-of-sample forecasts stems from distortions of the latent manifold.
arXiv Detail & Related papers (2026-02-26T16:43:28Z) - Efficient Sampling with Discrete Diffusion Models: Sharp and Adaptive Guarantees [9.180350432640912]
We study the sampling efficiency of score-based discrete diffusion models under a continuous-time Markov chain (CTMC) formulation. For uniform discrete diffusion, we show that the $\tau$-leaping algorithm achieves an iteration complexity of order $\tilde{O}(d/\varepsilon)$. For masking discrete diffusion, we introduce a modified $\tau$-leaping sampler whose convergence rate is governed by an intrinsic information-theoretic quantity.
arXiv Detail & Related papers (2026-02-16T18:48:17Z) - Fast Sampling for Flows and Diffusions with Lazy and Point Mass Stochastic Interpolants [5.492889521988414]
We show how to convert a sample path of a stochastic differential equation (SDE) with arbitrary diffusion coefficient under any schedule. We then extend the interpolant framework to admit a larger class of point mass schedules.
arXiv Detail & Related papers (2026-02-03T17:48:34Z) - Trajectory Consistency for One-Step Generation on Euler Mean Flows [24.038760671907024]
We propose Euler Mean Flows (EMF), a flow-based generative framework for one-step and few-step generation. EMF enforces long-range trajectory consistency with minimal sampling cost.
arXiv Detail & Related papers (2026-01-31T04:32:32Z) - FreqFlow: Long-term forecasting using lightweight flow matching [3.5235875824926346]
We introduce FreqFlow, a novel framework that leverages conditional flow matching in the frequency domain for deterministic MTS forecasting. FreqFlow transforms the forecasting problem into the spectral domain, where it learns to model amplitude and phase shifts. Experiments on real-world traffic speed, volume, and flow datasets demonstrate that FreqFlow achieves state-of-the-art forecasting performance.
arXiv Detail & Related papers (2025-11-20T14:50:13Z) - A-FloPS: Accelerating Diffusion Sampling with Adaptive Flow Path Sampler [21.134678093577193]
A-FloPS is a principled, training-free framework for flow-based generative models. We show that A-FloPS consistently outperforms state-of-the-art training-free samplers in both sample quality and efficiency. With as few as $5$ function evaluations, A-FloPS achieves substantially lower FID and generates sharper, more coherent images.
arXiv Detail & Related papers (2025-08-22T13:28:16Z) - Inference-Time Scaling of Diffusion Language Models with Particle Gibbs Sampling [70.8832906871441]
We study how to steer generation toward desired rewards without retraining the models. Prior methods typically resample or filter within a single denoising trajectory, optimizing rewards step-by-step without trajectory-level refinement. We introduce particle Gibbs sampling for diffusion language models (PG-DLM), a novel inference-time algorithm enabling trajectory-level refinement while preserving generation perplexity.
arXiv Detail & Related papers (2025-07-11T08:00:47Z) - Adaptive Deadline and Batch Layered Synchronized Federated Learning [66.93447103966439]
Federated learning (FL) enables collaborative model training across distributed edge devices while preserving data privacy, and typically operates in a round-based synchronous manner. We propose ADEL-FL, a novel framework that jointly optimizes per-round deadlines and user-specific batch sizes for layer-wise aggregation.
arXiv Detail & Related papers (2025-05-29T19:59:18Z) - Score-Optimal Diffusion Schedules [29.062842062257918]
An appropriate discretisation schedule is crucial to obtain high quality samples. This paper presents a novel algorithm for adaptively selecting an optimal discretisation schedule. We find that our learned schedule recovers performant schedules previously only discovered through manual search.
arXiv Detail & Related papers (2024-12-10T19:26:51Z) - Denoising diffusion probabilistic models are optimally adaptive to unknown low dimensionality [21.10158431913811]
We investigate how the DDPM can achieve sampling speed-ups through automatic exploitation of intrinsic low dimensionality of data.
We prove that the iteration complexity of the DDPM scales nearly linearly with $k$, which is optimal when using KL divergence to measure distributional discrepancy.
arXiv Detail & Related papers (2024-10-24T14:36:12Z) - On the Wasserstein Convergence and Straightness of Rectified Flow [54.580605276017096]
Rectified Flow (RF) is a generative model that aims to learn straight flow trajectories from noise to data. We provide a theoretical analysis of the Wasserstein distance between the sampling distribution of RF and the target distribution. We present general conditions guaranteeing uniqueness and straightness of 1-RF, which is in line with previous empirical findings.
arXiv Detail & Related papers (2024-10-19T02:36:11Z) - A Simple Early Exiting Framework for Accelerated Sampling in Diffusion Models [14.859580045688487]
A practical bottleneck of diffusion models is their sampling speed.
We propose a novel framework capable of adaptively allocating compute required for the score estimation.
We show that our method could significantly improve the sampling throughput of the diffusion models without compromising image quality.
arXiv Detail & Related papers (2024-08-12T05:33:45Z) - On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.