OAT-FM: Optimal Acceleration Transport for Improved Flow Matching
- URL: http://arxiv.org/abs/2509.24936v1
- Date: Mon, 29 Sep 2025 15:36:27 GMT
- Title: OAT-FM: Optimal Acceleration Transport for Improved Flow Matching
- Authors: Angxiao Yue, Anqi Dong, Hongteng Xu
- Abstract summary: Flow Matching (FM) aims to learn velocity fields from noise to data. We bridge FM and the recent theory of Optimal Acceleration Transport (OAT), developing an improved FM method called OAT-FM and exploring its benefits in both theory and practice.
- Score: 31.693019737039496
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As a powerful technique in generative modeling, Flow Matching (FM) aims to learn velocity fields from noise to data, and is often explained and implemented as solving Optimal Transport (OT) problems. In this study, we bridge FM and the recent theory of Optimal Acceleration Transport (OAT), developing an improved FM method called OAT-FM and exploring its benefits in both theory and practice. In particular, we demonstrate that the straightening objective hidden in existing OT-based FM methods is mathematically equivalent to minimizing the physical action associated with acceleration as defined by OAT. Accordingly, instead of enforcing constant velocity, OAT-FM optimizes acceleration transport in the product space of samples and velocities, whose objective corresponds to a necessary and sufficient condition for flow straightness. An efficient algorithm is designed to achieve OAT-FM with low complexity. OAT-FM motivates a new two-phase FM paradigm: given a generative model trained by an arbitrary FM method, whose velocity estimates are already relatively reliable, we can fine-tune and improve it via OAT-FM. This paradigm eliminates the risk of data distribution drift and the need to generate a large number of noise-data pairs, and it consistently improves model performance in various generative tasks. Code is available at: https://github.com/AngxiaoYue/OAT-FM
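To make the idea concrete, below is a minimal sketch, not the authors' implementation: a standard linear-interpolant FM regression loss plus a finite-difference acceleration penalty as an illustrative stand-in for the OAT objective (a straight flow has zero acceleration along its trajectory). The network, the loss name, and the penalty form are assumptions for illustration only.

```python
# Illustrative sketch only (not the authors' OAT-FM objective or algorithm):
# a standard linear-interpolant FM loss plus a finite-difference acceleration
# penalty, since straight flows have zero acceleration along the trajectory.
import torch
import torch.nn as nn

class VelocityNet(nn.Module):
    def __init__(self, dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

def oat_style_loss(model, x0, x1, lam=0.1, dt=1e-2):
    t = torch.rand(x0.shape[0], 1)
    xt = (1 - t) * x0 + t * x1                      # linear interpolant
    v = model(xt, t)
    fm = ((v - (x1 - x0)) ** 2).mean()              # standard FM regression
    # Acceleration proxy along the model's own flow:
    # a(t) ~ (v(x_{t+dt}, t+dt) - v(x_t, t)) / dt; straight flows give a = 0.
    v_next = model(xt + dt * v.detach(), torch.clamp(t + dt, max=1.0))
    accel = (((v_next - v) / dt) ** 2).mean()
    return fm + lam * accel

model = VelocityNet()
x0 = torch.randn(256, 2)                            # noise samples
x1 = torch.randn(256, 2) + 3.0                      # stand-in "data" samples
oat_style_loss(model, x0, x1).backward()
```

In the two-phase paradigm described above, a loss of this shape would be applied as a fine-tuning stage on a model already trained with an ordinary FM objective.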
Related papers
- Unlocking the Duality between Flow and Field Matching [86.34409966628323]
Conditional Flow Matching (CFM) unifies conventional generative paradigms such as diffusion models and flow matching. Interaction Field Matching (IFM) is a newer framework that generalizes Electrostatic Field Matching (EFM), rooted in Poisson Flow Generative Models (PFGM). We show that they coincide for a natural subclass of IFM that we call forward-only IFM.
arXiv Detail & Related papers (2026-02-02T16:04:01Z)
- Diff2Flow: Training Flow Matching Models via Diffusion Model Alignment [22.661660797545164]
Diffusion models have revolutionized generative tasks through high-fidelity outputs, yet flow matching (FM) offers faster inference and empirical performance gains. This work addresses the critical challenge of efficiently transferring knowledge from pre-trained diffusion models to flow matching. We propose Diff2Flow, a novel framework that systematically bridges the diffusion and FM paradigms by rescaling timesteps, aligning interpolants, and deriving FM-compatible velocity fields from diffusion predictions.
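The general recipe behind the last step can be sketched: for an interpolant x_t = a(t)·x0 + s(t)·ε, the exact path velocity is dx_t/dt = a'(t)·x0 + s'(t)·ε, so a diffusion ε-prediction can be converted into a velocity estimate. This is a hedged sketch of that conversion, not Diff2Flow's actual timestep rescaling or interpolant alignment; the coefficients and names are illustrative.

```python
# Hedged sketch: for an interpolant x_t = a(t)*x0 + s(t)*eps, the exact path
# velocity is dx_t/dt = a'(t)*x0 + s'(t)*eps, so a diffusion eps-prediction
# can be turned into a velocity estimate.  Diff2Flow's concrete timestep
# rescaling and interpolant alignment are not reproduced here.
import torch

def eps_to_velocity(eps_hat, x_t, a, s, da, ds):
    """a, s: interpolant coefficients at time t; da, ds: their derivatives."""
    x0_hat = (x_t - s * eps_hat) / a        # invert x_t = a*x0 + s*eps
    return da * x0_hat + ds * eps_hat       # v_hat = a'(t)*x0_hat + s'(t)*eps_hat

# Example with a rectified-flow-style linear interpolant a(t)=1-t, s(t)=t:
t = 0.3
x_t = torch.randn(4, 2)
eps_hat = torch.randn(4, 2)                 # stand-in for a diffusion model output
v_hat = eps_to_velocity(eps_hat, x_t, a=1 - t, s=t, da=-1.0, ds=1.0)
```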
arXiv Detail & Related papers (2025-06-02T20:05:05Z)
- Shallow Flow Matching for Coarse-to-Fine Text-to-Speech Synthesis [31.221799170851142]
Shallow Flow Matching (SFM) is a novel mechanism that enhances flow matching (FM)-based text-to-speech (TTS) models. We show that SFM yields consistent gains in speech naturalness across both objective and subjective evaluations.
arXiv Detail & Related papers (2025-05-18T04:15:08Z)
- Local Flow Matching Generative Models [19.859984725284896]
Flow Matching (FM) is a simulation-free method for learning a continuous and invertible flow to interpolate between two distributions. We introduce a stepwise FM model called Local Flow Matching (LFM), which consecutively learns a sequence of FM sub-models. We empirically demonstrate improved training efficiency and competitive generative performance of LFM compared to FM.
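A toy version of the stepwise idea, under the assumption of a simple linear reference interpolation, might look like the sketch below. How LFM actually defines its intermediate distributions and couples consecutive sub-models is more refined than this; all names here are illustrative.

```python
# Hedged sketch of stepwise flow matching: split [0, 1] into K segments and
# train one velocity sub-model per segment along a simple linear reference
# interpolation.  How LFM actually defines the intermediate distributions and
# couples consecutive sub-models is more refined than this toy version.
import torch
import torch.nn as nn

def make_net(dim=2):
    return nn.Sequential(nn.Linear(dim + 1, 64), nn.SiLU(), nn.Linear(64, dim))

K, dim = 4, 2
knots = torch.linspace(0, 1, K + 1)
sub_models = [make_net(dim) for _ in range(K)]

x0 = torch.randn(512, dim)                            # noise
x1 = torch.randn(512, dim) + 3.0                      # toy "data"
for k, net in enumerate(sub_models):
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    y0 = (1 - knots[k]) * x0 + knots[k] * x1          # segment start samples
    y1 = (1 - knots[k + 1]) * x0 + knots[k + 1] * x1  # segment end samples
    for _ in range(200):
        s = torch.rand(x0.shape[0], 1)                # local time within segment k
        ys = (1 - s) * y0 + s * y1
        loss = ((net(torch.cat([ys, s], dim=-1)) - (y1 - y0)) ** 2).mean()
        opt.zero_grad(); loss.backward(); opt.step()
```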
arXiv Detail & Related papers (2024-10-03T14:53:10Z)
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
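One way to read "self-consistency in the velocity field" is that any two points on the same path should imply the same straight-line endpoint x_t + (1 - t)·v(x_t, t). The sketch below illustrates only that constraint; Consistency-FM's actual multi-segment objective and training schedule are richer, and `model` here is any velocity network v(x, t), e.g. the VelocityNet sketched earlier.

```python
# Hedged sketch of a velocity self-consistency penalty: two points on the same
# path should imply the same straight-line endpoint x_t + (1 - t) * v(x_t, t).
# Consistency-FM's actual multi-segment objective and schedule are richer.
import torch

def velocity_consistency_loss(model, x0, x1, dt=0.05):
    """`model(x, t)` is any velocity network, e.g. the VelocityNet above."""
    t = torch.rand(x0.shape[0], 1) * (1 - dt)        # keep t + dt <= 1
    xt = (1 - t) * x0 + t * x1
    xs = (1 - (t + dt)) * x0 + (t + dt) * x1         # later point, same path
    end_t = xt + (1 - t) * model(xt, t)              # implied endpoint at time t
    end_s = xs + (1 - (t + dt)) * model(xs, t + dt)  # implied endpoint at t + dt
    return ((end_t - end_s.detach()) ** 2).mean()    # stop-grad on the target
```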
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- FedPFT: Federated Proxy Fine-Tuning of Foundation Models [55.58899993272904]
Adapting Foundation Models (FMs) for downstream tasks through Federated Learning (FL) emerges as a promising strategy for protecting data privacy and valuable FMs.
Existing methods fine-tune FMs by allocating sub-FMs to clients in FL, leading to suboptimal performance due to insufficient tuning and inevitable error accumulation in the gradients.
We propose Federated Proxy Fine-Tuning (FedPFT), a novel method that enhances FM adaptation to downstream tasks through FL via two key modules.
arXiv Detail & Related papers (2024-04-17T16:30:06Z)
- Optimal Flow Matching: Learning Straight Trajectories in Just One Step [89.37027530300617]
We develop and theoretically justify the novel Optimal Flow Matching (OFM) approach.
It allows recovering the straight OT displacement for the quadratic transport in just one FM step.
The main idea of our approach is to employ vector fields for FM that are parameterized by convex functions.
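The parameterization idea can be sketched as follows: a convex potential psi induces a Brenier-style map T(x) = ∇psi(x), and the straight flow x_t = x + t·(T(x) - x) reaches T(x) in a single step. This is a hedged toy, assuming a simple hand-rolled convex potential rather than a proper input-convex network, and it does not reproduce OFM's training objective.

```python
# Hedged sketch of the parameterization idea: a convex potential psi induces a
# Brenier-style map T(x) = grad psi(x), and the straight flow
# x_t = x + t * (T(x) - x) reaches T(x) in a single step.  A proper input-convex
# network and OFM's actual training objective are not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvexPotential(nn.Module):
    """psi(x) = 0.5*||x||^2 + sum_j softplus(w_j . x + b_j); each softplus of an
    affine map is convex, so psi is a strongly convex potential."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.W = nn.Parameter(0.1 * torch.randn(hidden, dim))
        self.b = nn.Parameter(torch.zeros(hidden))

    def forward(self, x):
        return 0.5 * (x ** 2).sum(-1) + F.softplus(x @ self.W.T + self.b).sum(-1)

psi = ConvexPotential()
x = torch.randn(8, 2, requires_grad=True)
T_x = torch.autograd.grad(psi(x).sum(), x, create_graph=True)[0]  # grad psi(x)
velocity = T_x - x               # constant velocity of the straight path
x1_in_one_step = x + velocity    # one "FM step" applies the map grad psi
```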
arXiv Detail & Related papers (2024-03-19T19:44:54Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
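The minibatch-OT pairing can be sketched in a few lines: within each batch, re-match noise and data samples by solving a small optimal assignment under squared Euclidean cost, then run the usual CFM regression on the matched pairs. This is an independent sketch of the idea, not the authors' released implementation.

```python
# Hedged sketch of the minibatch-OT pairing behind OT-CFM: within each batch,
# re-match noise and data samples by solving a small optimal assignment under
# squared Euclidean cost, then run the usual CFM regression on matched pairs.
import torch
from scipy.optimize import linear_sum_assignment

def ot_pair(x0, x1):
    cost = torch.cdist(x0, x1) ** 2              # squared Euclidean cost matrix
    rows, cols = linear_sum_assignment(cost.numpy())
    return x0[rows], x1[cols]                    # OT-matched minibatch pairs

x0 = torch.randn(128, 2)                         # noise batch
x1 = torch.randn(128, 2) + 3.0                   # data batch
x0, x1 = ot_pair(x0, x1)
t = torch.rand(128, 1)
xt = (1 - t) * x0 + t * x1                       # straight paths between matched pairs
v_target = x1 - x0                               # regression target for the velocity net
```

Matched pairs make the conditional paths closer to the OT displacement, which is why the resulting flows are simpler and faster at inference.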
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
- Mixed Variable Bayesian Optimization with Frequency Modulated Kernels [96.78099706164747]
We propose the frequency modulated (FM) kernel, which flexibly models dependencies among different types of variables.
BO-FM outperforms competitors including Regularized Evolution (RE) and BOHB.
arXiv Detail & Related papers (2021-02-25T11:28:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.