TraFlow: Trajectory Distillation on Pre-Trained Rectified Flow
- URL: http://arxiv.org/abs/2502.16972v1
- Date: Mon, 24 Feb 2025 08:57:19 GMT
- Title: TraFlow: Trajectory Distillation on Pre-Trained Rectified Flow
- Authors: Zhangkai Wu, Xuhui Fan, Hongyu Wu, Longbing Cao
- Abstract summary: We propose a trajectory distillation method, TraFlow, that enjoys the benefits of both and enables few-step generation. TraFlow adopts the settings of consistency trajectory models, and further enforces the properties of self-consistency and straightness throughout the entire trajectory.
- Score: 31.56008127287467
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Most distillation methods for pre-trained diffusion models or pre-trained rectified flows focus on either the distillation outputs or the trajectories between random noises and clean images to speed up sample generation from pre-trained models. Among trajectory-based distillation methods, consistency distillation requires a self-consistent trajectory projection to regulate the trajectory, which can avoid the common ODE approximation error, though sampling efficiency remains a concern. At the same time, rectified flow distillation enforces straight trajectories for fast sampling, although an ODE solver is still required. In this work, we propose a trajectory distillation method, TraFlow, that enjoys the benefits of both and enables few-step generation. TraFlow adopts the settings of consistency trajectory models, and further enforces the properties of self-consistency and straightness throughout the entire trajectory. These two properties are pursued by balancing the following three targets: (1) reconstruct the output of the pre-trained model; (2) learn the amount of change produced by the pre-trained model; (3) satisfy self-consistency over the trajectory. Extensive experimental results demonstrate the effectiveness of the proposed method.
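As a rough illustration of how the three targets might combine into one training loss, here is a minimal, hypothetical PyTorch-style sketch; the callables (`student`, `teacher_v`), the straight-interpolant parameterization, and the loss weights are assumptions for illustration, not the authors' code.

```python
import torch
import torch.nn.functional as F

def traflow_loss(student, teacher_v, x1, w=(1.0, 1.0, 1.0)):
    # Trajectory runs from noise x0 at t=0 to clean data x1 at t=1.
    x0 = torch.randn_like(x1)
    t = torch.rand(x1.size(0), device=x1.device).view(-1, 1, 1, 1)
    s = t * torch.rand_like(t)                     # an earlier time s < t on the same trajectory
    xt = (1 - t) * x0 + t * x1                     # straight interpolant at time t
    xs = (1 - s) * x0 + s * x1

    with torch.no_grad():
        v = teacher_v(xt, t)                       # teacher's velocity ("amount of change")
        x1_teacher = xt + (1 - t) * v              # teacher's one-step endpoint estimate

    pred_t = student(xt, t)                        # student predicts the endpoint x1
    pred_s = student(xs, s)

    l_rec = F.mse_loss(pred_t, x1_teacher)                 # (1) reconstruct the teacher's output
    l_vel = F.mse_loss((pred_t - xt) / (1 - t + 1e-5), v)  # (2) match the teacher's velocity
    l_sc = F.mse_loss(pred_t, pred_s.detach())             # (3) self-consistency along the trajectory
    return w[0] * l_rec + w[1] * l_vel + w[2] * l_sc
```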
Related papers
- ProReflow: Progressive Reflow with Decomposed Velocity [52.249464542399636]
Flow matching aims to reflow the diffusion process of diffusion models into a straight line for a few-step and even one-step generation.
We introduce progressive reflow, which progressively reflows the diffusion model in local timesteps until the whole diffusion process is reflowed.
We also introduce aligned v-prediction, which highlights the importance of direction matching in flow matching over magnitude matching.
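A loose sketch of what reflowing a single local timestep window could look like, assuming a hypothetical `teacher_step` that integrates the pre-trained flow across the window; the direction-first loss echoes the aligned v-prediction idea, and the 0.1 magnitude weight is illustrative only.

```python
import torch
import torch.nn.functional as F

def window_reflow_loss(student_v, teacher_step, x, t0, t1):
    # Straighten the teacher's trajectory inside the local window [t0, t1].
    with torch.no_grad():
        x_end = teacher_step(x, t0, t1)            # teacher's endpoint for this window
    target_v = (x_end - x) / (t1 - t0)             # straight-line velocity across the window
    t = t0 + (t1 - t0) * torch.rand(x.size(0), device=x.device).view(-1, 1, 1, 1)
    xt = x + (t - t0) * target_v                   # point on the straightened segment
    v = student_v(xt, t)

    # Emphasize direction matching over magnitude matching.
    cos = F.cosine_similarity(v.flatten(1), target_v.flatten(1), dim=1)
    return (1.0 - cos).mean() + 0.1 * F.mse_loss(v, target_v)
```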
arXiv Detail & Related papers (2025-03-05T04:50:53Z) - Target-driven Self-Distillation for Partial Observed Trajectories Forecasting [41.636125879090116]
We introduce a Target-driven Self-Distillation method (TSD) for motion forecasting. By employing self-distillation, the model learns from the feature distributions of both fully observed and partially observed trajectories. This enhances the model's ability to predict motion accurately in both fully observed and partially observed scenarios.
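A minimal sketch of this kind of self-distillation, with hypothetical `encoder`/`predictor` modules; the actual TSD feature-distribution matching is more involved than the plain MSE used here.

```python
import torch.nn.functional as F

def tsd_loss(encoder, predictor, traj_full, traj_partial, future_gt):
    feat_full = encoder(traj_full)                 # features from the fully observed trajectory
    feat_part = encoder(traj_partial)              # features from the partially observed one
    l_distill = F.mse_loss(feat_part, feat_full.detach())  # partial view mimics the full view
    l_pred = F.mse_loss(predictor(feat_part), future_gt)   # standard forecasting objective
    return l_pred + l_distill
```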
arXiv Detail & Related papers (2025-01-28T07:46:13Z) - Self-Corrected Flow Distillation for Consistent One-Step and Few-Step Text-to-Image Generation [3.8959351616076745]
Flow matching has emerged as a promising framework for training generative models.
We introduce a self-corrected flow distillation method that integrates consistency models and adversarial training.
This work is a pioneer in achieving consistent generation quality in both few-step and one-step sampling.
arXiv Detail & Related papers (2024-12-22T07:48:49Z) - Inference-Time Diffusion Model Distillation [59.350789627086456]
We introduce Distillation++, a novel inference-time distillation framework. Inspired by recent advances in conditional sampling, our approach recasts student model sampling as a proximal optimization problem. We integrate distillation optimization during reverse sampling, which can be viewed as teacher guidance.
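A very loose sketch of teacher guidance as a proximal step during reverse sampling, assuming both models expose an x0-prediction and a simple linear noise schedule; everything here is an assumption for illustration, not the Distillation++ algorithm itself.

```python
import torch

@torch.no_grad()
def guided_step(x, t, t_next, student_x0, teacher_x0, rho=0.5):
    # Both models predict the clean image x0 from the current noisy state.
    x0_s = student_x0(x, t)
    x0_t = teacher_x0(x, t)
    x0 = (1 - rho) * x0_s + rho * x0_t             # proximal pull toward the teacher's estimate
    # Re-noise to the next time assuming a linear x = (1-t)*x0 + t*eps schedule
    # (an assumption for this sketch; real schedules vary by model).
    eps = (x - (1 - t) * x0_s) / t
    return (1 - t_next) * x0 + t_next * eps
```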
arXiv Detail & Related papers (2024-12-12T02:07:17Z) - Self-Refining Diffusion Samplers: Enabling Parallelization via Parareal Iterations [53.180374639531145]
Self-Refining Diffusion Samplers (SRDS) retain sample quality and can improve latency at the cost of additional parallel compute.
We take inspiration from the Parareal algorithm, a popular numerical method for parallel-in-time integration of differential equations.
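For reference, a generic Parareal iteration looks roughly like the sketch below (not the SRDS code): a cheap coarse solver runs sequentially while the accurate fine solves run independently per time slice, so they can be executed in parallel.

```python
def parareal_sample(x0, times, coarse, fine, sweeps=3):
    """times: slice boundaries t_0 < ... < t_n; coarse/fine map (x, t_a, t_b) -> x."""
    n = len(times) - 1
    # Initial guess: one sequential pass with the cheap coarse solver.
    U = [x0]
    for i in range(n):
        U.append(coarse(U[i], times[i], times[i + 1]))
    for _ in range(sweeps):
        # The expensive fine solves are independent per slice, hence parallelizable.
        F_fine = [fine(U[i], times[i], times[i + 1]) for i in range(n)]
        G_old = [coarse(U[i], times[i], times[i + 1]) for i in range(n)]
        new = [x0]
        for i in range(n):
            # Standard Parareal correction: G(new) + F(old) - G(old).
            new.append(coarse(new[i], times[i], times[i + 1]) + F_fine[i] - G_old[i])
        U = new
    return U[-1]
```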
arXiv Detail & Related papers (2024-12-11T11:08:09Z) - 2-Rectifications are Enough for Straight Flows: A Theoretical Insight into Wasserstein Convergence [54.580605276017096]
We provide the first theoretical analysis of the Wasserstein distance between the sampling distribution of Rectified Flow and the target distribution. We show that for a rectified flow from a Gaussian to any general target distribution with finite first moment, two rectifications are sufficient to achieve a straight flow.
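For context, one "rectification" (reflow) step in the rectified-flow literature can be written as follows; the paper's claim, as summarized, is that two such steps (k = 0, 1) suffice for a straight flow.

```latex
% One rectification (reflow) step: fit a velocity field to the straight
% interpolant of the current coupling, then re-couple via the learned ODE.
\[
\begin{aligned}
v^{(k)} &= \arg\min_v \; \mathbb{E}\left\| \left(X_1^{(k)} - X_0\right) - v\!\left(X_t^{(k)}, t\right) \right\|^2,
\qquad X_t^{(k)} = (1-t)\,X_0 + t\,X_1^{(k)},\\
X_1^{(k+1)} &= X_0 + \int_0^1 v^{(k)}(X_t, t)\,dt,
\qquad \text{where } \tfrac{dX_t}{dt} = v^{(k)}(X_t, t),\; X_{t=0} = X_0 .
\end{aligned}
\]
```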
arXiv Detail & Related papers (2024-10-19T02:36:11Z) - Rectified Diffusion: Straightness Is Not Your Need in Rectified Flow [65.51671121528858]
Diffusion models have greatly improved visual generation but are hindered by slow generation speed due to the computationally intensive nature of solving generative ODEs.
Rectified flow, a widely recognized solution, improves generation speed by straightening the ODE path.
We propose Rectified Diffusion, which generalizes the design space and application scope of rectification to encompass the broader category of diffusion models.
arXiv Detail & Related papers (2024-10-09T17:43:38Z) - Target-Driven Distillation: Consistency Distillation with Target Timestep Selection and Decoupled Guidance [17.826285840875556]
We introduce Target-Driven Distillation (TDD) to accelerate generative tasks of diffusion models.
TDD adopts a delicate selection strategy for target timesteps, increasing training efficiency.
It can be equipped with non-equidistant sampling and x0 clipping, enabling more flexible and accurate image sampling, as sketched below.
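A hypothetical sketch of these ingredients, with an illustrative power-law timestep schedule and a `teacher_step` callable assumed for the distillation target; none of this is the authors' implementation.

```python
import torch
import torch.nn.functional as F

def tdd_timesteps(n, rho=3.0):
    # Non-equidistant schedule in [0, 1], denser near the clean-data end (illustrative).
    return torch.linspace(0, 1, n + 1) ** rho

def tdd_loss(student_x0, teacher_step, x1, timesteps):
    # Convention here: t = 0 is clean data, t = 1 is pure noise.
    noise = torch.randn_like(x1)
    i = torch.randint(0, len(timesteps) - 1, (1,)).item()
    t, t_tgt = timesteps[i + 1], timesteps[i]      # distill from t toward a selected target t_tgt < t
    xt = (1 - t) * x1 + t * noise
    with torch.no_grad():
        x_tgt = teacher_step(xt, t, t_tgt)             # teacher advances xt to the target timestep
        x0_ref = student_x0(x_tgt, t_tgt).clamp(-1, 1)  # x0 clipping to the valid pixel range
    return F.mse_loss(student_x0(xt, t), x0_ref)
```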
arXiv Detail & Related papers (2024-09-02T16:01:38Z) - Improving Consistency Models with Generator-Augmented Flows [16.049476783301724]
Consistency models imitate the multi-step sampling of score-based diffusion in a single forward pass of a neural network. They can be learned in two ways: consistency distillation and consistency training. We propose a novel flow that transports noisy data towards their corresponding outputs derived from a consistency model.
arXiv Detail & Related papers (2024-06-13T20:22:38Z) - Flow Map Matching [15.520853806024943]
Flow map matching is an algorithm that learns the two-time flow map of an underlying ordinary differential equation.
We show that flow map matching leads to high-quality samples with significantly reduced sampling cost compared to diffusion or interpolant methods.
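A rough sketch of objectives one could use to train a two-time flow map `X(x, s, t)`: local agreement with a teacher velocity field plus the semigroup (composition) property. The finite-difference derivative and the specific losses are illustrative shortcuts, not the paper's exact objectives.

```python
import torch
import torch.nn.functional as F

def flow_map_loss(flow_map, teacher_v, x, eps=1e-3):
    # Sample three ordered times s < u < t in [0, 1].
    s, u, t = torch.sort(torch.rand(3, device=x.device)).values

    # Local consistency: the map's time derivative at t = s should equal the
    # velocity field; approximated here with a finite difference.
    v_fd = (flow_map(x, s, s + eps) - x) / eps
    l_local = F.mse_loss(v_fd, teacher_v(x, s))

    # Semigroup property: composing s -> u and u -> t matches s -> t directly.
    xu = flow_map(x, s, u)
    l_comp = F.mse_loss(flow_map(xu, u, t), flow_map(x, s, t))
    return l_local + l_comp
```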
arXiv Detail & Related papers (2024-06-11T17:41:26Z) - Optimal Flow Matching: Learning Straight Trajectories in Just One Step [89.37027530300617]
We develop and theoretically justify the novel Optimal Flow Matching (OFM) approach.
It allows recovering the straight OT displacement for the quadratic transport in just one FM step.
The main idea of our approach is the employment of vector fields for FM which are parameterized by convex functions.
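A minimal sketch of the convex parameterization under the Brenier-map reading: a tiny input-convex network (ICNN) as the potential, with the transport map as its gradient, so the induced trajectories x_t = (1-t)x + t*T(x) are straight lines. The architecture is illustrative, not the authors'.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyICNN(nn.Module):
    # Convex in x: affine layer, then a convex non-decreasing activation,
    # then a non-negative combination, which preserves convexity.
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.Wx = nn.Linear(dim, hidden)
        self.Wz = nn.Linear(hidden, 1, bias=False)

    def forward(self, x):
        z = F.softplus(self.Wx(x))
        return F.linear(z, self.Wz.weight.clamp(min=0)).squeeze(-1)

def transport_map(psi, x):
    # For quadratic cost, the OT map is the gradient of a convex potential (Brenier).
    x = x.detach().requires_grad_(True)
    return torch.autograd.grad(psi(x).sum(), x, create_graph=True)[0]
```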
arXiv Detail & Related papers (2024-03-19T19:44:54Z) - Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient method to solve the probability flow differential equations, to accelerate the sampling process of diffusion models.
Compared to other fast sampling methods that have a sequential nature, we are the first to propose a parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z) - Haar Wavelet based Block Autoregressive Flows for Trajectories [129.37479472754083]
Prediction of trajectories such as that of pedestrians is crucial to the performance of autonomous agents.
We introduce a novel Haar wavelet based block autoregressive model leveraging split couplings.
We illustrate the advantages of our approach for generating diverse and accurate trajectories on two real-world datasets.
arXiv Detail & Related papers (2020-09-21T13:57:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.