SCoT: Unifying Consistency Models and Rectified Flows via Straight-Consistent Trajectories
- URL: http://arxiv.org/abs/2502.16972v4
- Date: Thu, 02 Oct 2025 10:39:18 GMT
- Title: SCoT: Unifying Consistency Models and Rectified Flows via Straight-Consistent Trajectories
- Authors: Zhangkai Wu, Xuhui Fan, Hongyu Wu, Longbing Cao
- Abstract summary: We propose a Straight Consistent Trajectory (SCoT) model for pre-trained diffusion models. SCoT enjoys the benefits of both approaches for fast sampling, producing trajectories with consistent and straight properties simultaneously.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Pre-trained diffusion models are commonly used to generate clean data (e.g., images) from random noise, effectively forming pairs of noise samples and corresponding clean images. Distillation on these pre-trained models can be viewed as the process of constructing advanced trajectories within each pair to accelerate sampling. For instance, consistency model distillation develops consistent projection functions to regulate trajectories, although sampling efficiency remains a concern. The rectified flow method enforces straight trajectories to enable faster sampling, yet relies on numerical ODE solvers, which may introduce approximation errors. In this work, we bridge the gap between the consistency model and the rectified flow method by proposing a Straight Consistent Trajectory (SCoT) model. SCoT enjoys the benefits of both approaches for fast sampling, producing trajectories with consistent and straight properties simultaneously. These dual properties are strategically balanced by targeting two critical objectives: (1) regulating the gradient of SCoT's mapping to a constant, and (2) ensuring trajectory consistency. Extensive experimental results demonstrate the effectiveness and efficiency of SCoT.
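The abstract's two objectives can be illustrated on a toy straight trajectory: along x_t = (1 - t) * x0 + t * x1, the velocity dx_t/dt = x1 - x0 is constant (straightness), and the mapping f(x_t, t) = x_t - t * (x1 - x0) returns the clean endpoint x0 from any point on the trajectory (consistency). A minimal pure-Python sketch of this idealized case; the function names are illustrative and not taken from the paper:

```python
# Toy illustration of the two SCoT objectives on a 1-D straight trajectory
# from clean data x0 (at t = 0) to noise x1 (at t = 1).

def x_t(x0: float, x1: float, t: float) -> float:
    """Point on the straight trajectory at time t."""
    return (1.0 - t) * x0 + t * x1

def velocity(x0: float, x1: float) -> float:
    """dx_t/dt is the constant x1 - x0 along a straight trajectory
    (objective 1: the gradient of the mapping is a constant)."""
    return x1 - x0

def consistent_map(xt: float, t: float, v: float) -> float:
    """Map any trajectory point back to the clean endpoint
    (objective 2: trajectory consistency)."""
    return xt - t * v

x0, x1 = 2.0, -1.0
v = velocity(x0, x1)
for t in (0.0, 0.25, 0.5, 1.0):
    pt = x_t(x0, x1, t)
    # Every point along the trajectory maps to the same clean sample x0.
    assert abs(consistent_map(pt, t, v) - x0) < 1e-12
print("all trajectory points map to x0 =", x0)
```

In training, the model would only approximate this ideal: the two objectives act as losses pushing the learned mapping toward a constant gradient and toward agreement across timesteps.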
Related papers
- FlowConsist: Make Your Flow Consistent with Real Trajectory [99.22869983378062]
We argue that current fast-flow training paradigms suffer from two fundamental issues. Conditional velocities constructed from randomly paired noise-data samples introduce systematic trajectory drift. We propose FlowConsist, a training framework designed to enforce trajectory consistency in fast flows.
arXiv Detail & Related papers (2026-02-06T03:24:23Z) - Path-Guided Flow Matching for Dataset Distillation [9.761850986508895]
We propose the first flow matching-based framework for generative distillation, which enables fast deterministic synthesis by solving an ODE in a few steps. We develop a continuous path-to-prototype guidance algorithm for ODE-consistent path control, which allows trajectories to reliably land on assigned prototypes.
arXiv Detail & Related papers (2026-02-05T12:52:32Z) - Temporal Pair Consistency for Variance-Reduced Flow Matching [13.328987133593154]
Temporal Pair Consistency (TPC) is a lightweight variance-reduction principle that couples velocity predictions at paired timesteps along the same probability path. Instantiated within flow matching, TPC improves sample quality and efficiency across CIFAR-10 and ImageNet at multiple resolutions.
arXiv Detail & Related papers (2026-02-04T00:05:21Z) - Score Distillation of Flow Matching Models [67.86066177182046]
We extend Score identity Distillation (SiD) to pretrained text-to-image flow-matching models. SiD works out of the box across these models, in both data-free and data-aided settings. This provides the first systematic evidence that score distillation applies broadly to text-to-image flow-matching models.
arXiv Detail & Related papers (2025-09-29T17:45:48Z) - A-FloPS: Accelerating Diffusion Sampling with Adaptive Flow Path Sampler [21.134678093577193]
A-FloPS is a principled, training-free framework for flow-based generative models. We show that A-FloPS consistently outperforms state-of-the-art training-free samplers in both sample quality and efficiency. With as few as 5 function evaluations, A-FloPS achieves substantially lower FID and generates sharper, more coherent images.
arXiv Detail & Related papers (2025-08-22T13:28:16Z) - Align Your Flow: Scaling Continuous-Time Flow Map Distillation [63.927438959502226]
Flow maps connect any two noise levels in a single step and remain effective across all step counts. We extensively validate our flow map models, called Align Your Flow, on challenging image generation benchmarks. We show text-to-image flow map models that outperform all existing non-adversarially trained few-step samplers in text-conditioned synthesis.
arXiv Detail & Related papers (2025-06-17T15:06:07Z) - Toward Theoretical Insights into Diffusion Trajectory Distillation via Operator Merging [10.315743300140966]
Diffusion trajectory distillation aims to accelerate sampling in diffusion models that produce high-quality outputs but suffer from slow sampling speeds. We propose a programming algorithm to compute the optimal merging strategy that maximally preserves signal fidelity. Our findings enhance the theoretical understanding of diffusion trajectory distillation and offer practical insights for improving distillation strategies.
arXiv Detail & Related papers (2025-05-21T21:13:02Z) - ProReflow: Progressive Reflow with Decomposed Velocity [52.249464542399636]
Flow matching aims to reflow the diffusion process of diffusion models into a straight line for a few-step and even one-step generation.
We introduce progressive reflow, which progressively reflows the diffusion model over local timesteps until the whole diffusion process is covered.
We also introduce aligned v-prediction, which highlights the importance of direction matching in flow matching over magnitude matching.
arXiv Detail & Related papers (2025-03-05T04:50:53Z) - Target-driven Self-Distillation for Partial Observed Trajectories Forecasting [41.636125879090116]
We introduce a Target-driven Self-Distillation method (TSD) for motion forecasting. By employing self-distillation, the model learns from the feature distributions of both fully observed and partially observed trajectories. This enhances the model's ability to predict motion accurately in both fully observed and partially observed scenarios.
arXiv Detail & Related papers (2025-01-28T07:46:13Z) - Self-Corrected Flow Distillation for Consistent One-Step and Few-Step Text-to-Image Generation [3.8959351616076745]
Flow matching has emerged as a promising framework for training generative models.
We introduce a self-corrected flow distillation method that integrates consistency models and adversarial training.
This work pioneers consistent generation quality in both few-step and one-step sampling.
arXiv Detail & Related papers (2024-12-22T07:48:49Z) - Inference-Time Diffusion Model Distillation [59.350789627086456]
We introduce Distillation++, a novel inference-time distillation framework. Inspired by recent advances in conditional sampling, our approach recasts student-model sampling as a proximal optimization problem. We integrate distillation optimization during reverse sampling, which can be viewed as teacher guidance.
arXiv Detail & Related papers (2024-12-12T02:07:17Z) - Self-Refining Diffusion Samplers: Enabling Parallelization via Parareal Iterations [53.180374639531145]
Self-Refining Diffusion Samplers (SRDS) retain sample quality and can improve latency at the cost of additional parallel compute.
We take inspiration from the Parareal algorithm, a popular numerical method for parallel-in-time integration of differential equations.
arXiv Detail & Related papers (2024-12-11T11:08:09Z) - 2-Rectifications are Enough for Straight Flows: A Theoretical Insight into Wasserstein Convergence [54.580605276017096]
We provide the first theoretical analysis of the Wasserstein distance between the sampling distribution of Rectified Flow and the target distribution. We show that for a rectified flow from a Gaussian to any general target distribution with finite first moment, two rectifications are sufficient to achieve a straight flow.
arXiv Detail & Related papers (2024-10-19T02:36:11Z) - Rectified Diffusion: Straightness Is Not Your Need in Rectified Flow [65.51671121528858]
Diffusion models have greatly improved visual generation but are hindered by slow generation speed due to the computationally intensive nature of solving generative ODEs.
Rectified flow, a widely recognized solution, improves generation speed by straightening the ODE path.
We propose Rectified Diffusion, which generalizes the design space and application scope of rectification to encompass the broader category of diffusion models.
arXiv Detail & Related papers (2024-10-09T17:43:38Z) - Target-Driven Distillation: Consistency Distillation with Target Timestep Selection and Decoupled Guidance [17.826285840875556]
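The straightening that rectification-based methods build on starts from the standard rectified-flow training target: interpolate linearly between a paired data sample and noise sample, and regress the model's velocity toward the constant direction between them. A minimal 1-D sketch of this standard setup (not code from any of the papers above):

```python
import random

# Standard rectified-flow training target in 1-D: the conditional velocity
# along the straight path x_t = (1 - t) * x0 + t * x1 is the constant x1 - x0.

def interpolate(x0: float, x1: float, t: float) -> float:
    """Linear interpolation between data x0 and noise x1 at time t."""
    return (1.0 - t) * x0 + t * x1

def velocity_target(x0: float, x1: float) -> float:
    """Conditional velocity: constant along the straight path."""
    return x1 - x0

def flow_matching_loss(pred_v: float, x0: float, x1: float) -> float:
    """Squared error between a predicted velocity and the target."""
    return (pred_v - velocity_target(x0, x1)) ** 2

x0, x1 = 1.5, -0.5
t = random.random()
xt = interpolate(x0, x1, t)  # a training point sampled on the path
# A perfect model predicts the constant velocity and incurs zero loss.
assert flow_matching_loss(velocity_target(x0, x1), x0, x1) == 0.0
```

Because the target velocity does not depend on t, a model that fits it exactly traces a straight ODE path, which is what allows sampling in very few solver steps.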
We introduce Target-Driven Distillation (TDD) to accelerate generative tasks of diffusion models.
TDD adopts a delicate selection strategy for target timesteps, increasing training efficiency.
It can be equipped with non-equidistant sampling and x0 clipping, enabling more flexible and accurate image sampling.
arXiv Detail & Related papers (2024-09-02T16:01:38Z) - Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
arXiv Detail & Related papers (2024-07-02T16:15:37Z) - Improving Consistency Models with Generator-Augmented Flows [16.049476783301724]
Consistency models imitate the multi-step sampling of score-based diffusion in a single forward pass of a neural network. They can be learned in two ways: consistency distillation and consistency training. We propose a novel flow that transports noisy data towards their corresponding outputs derived from a consistency model.
arXiv Detail & Related papers (2024-06-13T20:22:38Z) - Flow Map Matching [15.520853806024943]
Flow map matching is an algorithm that learns the two-time flow map of an underlying ordinary differential equation.
We show that flow map matching leads to high-quality samples with significantly reduced sampling cost compared to diffusion or interpolant methods.
arXiv Detail & Related papers (2024-06-11T17:41:26Z) - Optimal Flow Matching: Learning Straight Trajectories in Just One Step [89.37027530300617]
We develop and theoretically justify the novel Optimal Flow Matching (OFM) approach.
It allows recovering the straight OT displacement for the quadratic transport in just one FM step.
The main idea of our approach is the employment of vector fields for FM that are parameterized by convex functions.
arXiv Detail & Related papers (2024-03-19T19:44:54Z) - Trajectory Consistency Distillation: Improved Latent Consistency Distillation by Semi-Linear Consistency Function with Trajectory Mapping [75.72212215739746]
Trajectory Consistency Distillation (TCD) encompasses trajectory consistency function and strategic sampling.
TCD not only significantly enhances image quality at low NFEs but also yields more detailed results compared to the teacher model.
arXiv Detail & Related papers (2024-02-29T13:44:14Z) - Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z) - Haar Wavelet based Block Autoregressive Flows for Trajectories [129.37479472754083]
Prediction of trajectories such as that of pedestrians is crucial to the performance of autonomous agents.
We introduce a novel Haar wavelet based block autoregressive model leveraging split couplings.
We illustrate the advantages of our approach for generating diverse and accurate trajectories on two real-world datasets.
arXiv Detail & Related papers (2020-09-21T13:57:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.