Unlocking the Duality between Flow and Field Matching
- URL: http://arxiv.org/abs/2602.02261v1
- Date: Mon, 02 Feb 2026 16:04:01 GMT
- Title: Unlocking the Duality between Flow and Field Matching
- Authors: Daniil Shlenskii, Alexander Varlamov, Nazar Buzun, Alexander Korotin
- Abstract summary: Conditional Flow Matching (CFM) unifies conventional generative paradigms such as diffusion models and flow matching. Interaction Field Matching (IFM) is a newer framework that generalizes Electrostatic Field Matching (EFM), rooted in Poisson Flow Generative Models (PFGM). We show that they coincide for a natural subclass of IFM that we call forward-only IFM.
- Score: 86.34409966628323
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Conditional Flow Matching (CFM) unifies conventional generative paradigms such as diffusion models and flow matching. Interaction Field Matching (IFM) is a newer framework that generalizes Electrostatic Field Matching (EFM) rooted in Poisson Flow Generative Models (PFGM). While both frameworks define generative dynamics, they start from different objects: CFM specifies a conditional probability path in data space, whereas IFM specifies a physics-inspired interaction field in an augmented data space. This raises a basic question: are CFM and IFM genuinely different, or are they two descriptions of the same underlying dynamics? We show that they coincide for a natural subclass of IFM that we call forward-only IFM. Specifically, we construct a bijection between CFM and forward-only IFM. We further show that general IFM is strictly more expressive: it includes EFM and other interaction fields that cannot be realized within the standard CFM formulation. Finally, we highlight how this duality can benefit both frameworks: it provides a probabilistic interpretation of forward-only IFM and yields novel, IFM-driven techniques for CFM.
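The CFM side of the duality starts from a conditional probability path in data space. A minimal sketch of that construction, using the straight-line path common in the flow-matching literature (an illustrative reconstruction, not the authors' code; all names and shapes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_targets(x0, x1, t):
    """Straight-line conditional probability path (a common CFM choice):
    x_t = (1 - t) * x0 + t * x1, with conditional velocity u_t = x1 - x0."""
    xt = (1.0 - t)[:, None] * x0 + t[:, None] * x1
    ut = x1 - x0
    return xt, ut

# toy minibatch: x0 ~ source (noise), x1 ~ target (data)
x0 = rng.standard_normal((128, 2))
x1 = rng.standard_normal((128, 2)) + 3.0
t = rng.uniform(size=128)

xt, ut = cfm_targets(x0, x1, t)

def cfm_loss(v_pred, ut):
    """CFM regression objective: mean squared error between a candidate
    velocity field evaluated at (x_t, t) and the conditional velocity."""
    return float(np.mean(np.sum((v_pred - ut) ** 2, axis=-1)))

# the conditional velocity itself attains zero loss by construction
print(cfm_loss(ut, ut))  # 0.0
```

A learned velocity field would replace `v_pred` with a network evaluated at `(xt, t)`; the forward-only IFM construction in the paper instead specifies an interaction field in an augmented space, which this sketch does not cover.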
Related papers
- OAT-FM: Optimal Acceleration Transport for Improved Flow Matching [31.693019737039496]
Flow Matching (FM) aims to learn velocity fields from noise to data. We bridge FM and the recent theory of Optimal Acceleration Transport (OAT). We develop an improved FM method called OAT-FM and explore its benefits in both theory and practice.
arXiv Detail & Related papers (2025-09-29T15:36:27Z)
- Hierarchical Federated Foundation Models over Wireless Networks for Multi-Modal Multi-Task Intelligence: Integration of Edge Learning with D2D/P2P-Enabled Fog Learning Architectures [58.72593025539547]
In this paper, we unveil an unexplored variation of M3T FFMs by proposing hierarchical federated foundation models (HF-FMs). HF-FMs strategically align with the modular structure of M3T FMs, comprising modality encoders, prompts, mixture-of-experts (MoEs), adapters, and task heads. To demonstrate their potential, we prototype HF-FMs in a wireless network setting and release the open-source code for the development of HF-FMs.
arXiv Detail & Related papers (2025-09-03T20:23:19Z)
- Shallow Flow Matching for Coarse-to-Fine Text-to-Speech Synthesis [31.221799170851142]
Shallow Flow Matching (SFM) is a novel mechanism that enhances flow matching (FM)-based text-to-speech (TTS) models. We show that SFM yields consistent gains in speech naturalness across both objective and subjective evaluations.
arXiv Detail & Related papers (2025-05-18T04:15:08Z)
- DFM: Interpolant-free Dual Flow Matching [0.8192907805418583]
We propose an interpolant-free dual flow matching (DFM) approach without explicit assumptions about the modeled vector field.
Experiments on SMAP unsupervised anomaly detection show advantages of DFM compared to a CNF trained with either maximum-likelihood or FM objectives.
arXiv Detail & Related papers (2024-10-11T20:46:04Z)
- Local Flow Matching Generative Models [19.859984725284896]
Flow Matching (FM) is a simulation-free method for learning a continuous and invertible flow to interpolate between two distributions. We introduce a stepwise FM model called Local Flow Matching (LFM), which consecutively learns a sequence of FM sub-models. We empirically demonstrate improved training efficiency and competitive generative performance of LFM compared to FM.
arXiv Detail & Related papers (2024-10-03T14:53:10Z)
- FedPFT: Federated Proxy Fine-Tuning of Foundation Models [55.58899993272904]
Adapting Foundation Models (FMs) for downstream tasks through Federated Learning (FL) emerges as a promising strategy for protecting data privacy and valuable FMs.
Existing methods fine-tune FMs by allocating sub-FMs to clients in FL, leading to suboptimal performance due to insufficient tuning and inevitable error accumulation in gradients.
We propose Federated Proxy Fine-Tuning (FedPFT), a novel method that enhances FM adaptation to downstream tasks through FL via two key modules.
arXiv Detail & Related papers (2024-04-17T16:30:06Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
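The minibatch pairing behind OT-CFM can be illustrated with a small sketch (an illustrative reconstruction using SciPy's assignment solver, not the paper's implementation; all variable names are hypothetical):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
x0 = rng.standard_normal((64, 2))        # noise minibatch
x1 = rng.standard_normal((64, 2)) + 3.0  # data minibatch

# pairwise squared-Euclidean transport cost between the two minibatches
cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(axis=-1)

# for equal-size minibatches, the exact OT plan is a permutation,
# so it can be found with the linear assignment solver
rows, cols = linear_sum_assignment(cost)
x1_matched = x1[cols]  # data samples re-paired with their OT-matched noise

# the OT pairing never costs more than the naive index-order pairing
assert cost[rows, cols].sum() <= np.trace(cost)
```

Training the CFM objective on the matched pairs `(x0, x1_matched)` instead of random pairs is what straightens the learned flows and, per the abstract, stabilizes training and speeds up inference.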
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
- Boosting Factorization Machines via Saliency-Guided Mixup [125.15872106335692]
We present MixFM, inspired by Mixup, to generate auxiliary training data that boosts Factorization Machines (FMs).
We also put forward a novel Factorization Machine powered by Saliency-guided Mixup (denoted SMFM).
arXiv Detail & Related papers (2022-06-17T09:49:00Z)
- Mixed Variable Bayesian Optimization with Frequency Modulated Kernels [96.78099706164747]
We propose the frequency modulated (FM) kernel, which flexibly models dependencies among different types of variables.
BO-FM outperforms competitors including Regularized Evolution (RE) and BOHB.
arXiv Detail & Related papers (2021-02-25T11:28:46Z)
- $FM^2$: Field-matrixed Factorization Machines for Recommender Systems [9.461169933697379]
We propose a novel approach to model the field information effectively and efficiently.
The proposed approach is a direct improvement of FwFM and is named Field-matrixed Factorization Machines (FmFM).
arXiv Detail & Related papers (2021-02-20T00:03:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.