CAR-Flow: Condition-Aware Reparameterization Aligns Source and Target for Better Flow Matching
- URL: http://arxiv.org/abs/2509.19300v2
- Date: Thu, 23 Oct 2025 20:16:25 GMT
- Title: CAR-Flow: Condition-Aware Reparameterization Aligns Source and Target for Better Flow Matching
- Authors: Chen Chen, Pengsheng Guo, Liangchen Song, Jiasen Lu, Rui Qian, Xinze Wang, Tsu-Jui Fu, Wei Liu, Yinfei Yang, Alex Schwing
- Abstract summary: Conditional generative modeling aims to learn a conditional data distribution from samples containing data-condition pairs. We propose Condition-Aware Reparameterization for Flow Matching (CAR-Flow) to ease the demand on the model. CAR-Flow shortens the probability path the model must learn, leading to faster training in practice.
- Score: 32.53595655791657
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conditional generative modeling aims to learn a conditional data distribution from samples containing data-condition pairs. For this, diffusion and flow-based methods have attained compelling results. These methods use a learned (flow) model to transport initial standard Gaussian noise, which ignores the condition, to the conditional data distribution. The model is hence required to learn both mass transport and condition injection. To ease the demand on the model, we propose Condition-Aware Reparameterization for Flow Matching (CAR-Flow) -- a lightweight, learned shift that conditions the source, the target, or both distributions. By relocating these distributions, CAR-Flow shortens the probability path the model must learn, leading to faster training in practice. On low-dimensional synthetic data, we visualize and quantify the effects of CAR-Flow. On higher-dimensional natural image data (ImageNet-256), equipping SiT-XL/2 with CAR-Flow reduces FID from 2.07 to 1.68, while introducing less than 0.6% additional parameters.
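To make the mechanism concrete, here is a minimal PyTorch sketch of condition-aware shifts inside a flow-matching loss. It illustrates the idea described in the abstract, not the authors' implementation: the class-embedding module `CondShift`, the function names, and the way the target shift enters the interpolant are all assumptions.

```python
import torch
import torch.nn as nn

class CondShift(nn.Module):
    """Hypothetical condition-aware shift: one learned offset per class,
    initialized to zero so training starts as plain flow matching."""
    def __init__(self, num_classes: int, dim: int):
        super().__init__()
        self.shift = nn.Embedding(num_classes, dim)
        nn.init.zeros_(self.shift.weight)

    def forward(self, c: torch.Tensor) -> torch.Tensor:
        return self.shift(c)

def car_flow_style_loss(model, mu_src, mu_tgt, x1, c):
    """Flow-matching regression on a straight path between the *shifted*
    source and the *shifted* target, shortening what the model must learn."""
    eps = torch.randn_like(x1)          # standard Gaussian source sample
    t = torch.rand(x1.size(0), 1)       # uniform time in [0, 1]
    x0 = eps + mu_src(c)                # condition-aware source
    xT = x1 + mu_tgt(c)                 # condition-aware target
    xt = (1 - t) * x0 + t * xT          # linear interpolant
    v_target = xT - x0                  # constant velocity along the path
    return ((model(xt, t, c) - v_target) ** 2).mean()
```

Under this parameterization, sampling would presumably integrate the learned ODE starting from `eps + mu_src(c)` and subtract `mu_tgt(c)` at the end to land on the data distribution; the paper's exact treatment of the target shift may differ.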
Related papers
- Flow Matching Neural Processes [2.3020018305241337]
We introduce a new NP model based on flow matching, a generative modeling paradigm that has demonstrated strong performance on various data modalities. Compared to previous NP models, our model is simple to implement and can be used to sample from conditional distributions using an ODE solver. We show that our model outperforms previous state-of-the-art neural process methods on various benchmarks, including synthetic 1D Gaussian process data, 2D images, and real-world weather data.
arXiv Detail & Related papers (2025-12-29T20:37:29Z)
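The "sample from conditional distributions using an ODE solver" step mentioned above is a generic flow-matching routine. A minimal sketch with a plain Euler integrator; the `velocity` and `context` interfaces are assumptions, not this paper's API:

```python
import torch

@torch.no_grad()
def sample_with_ode_solver(velocity, x0, context, num_steps: int = 50):
    """Euler integration of dx/dt = v(x, t, context) from t=0 to t=1.
    `velocity` is any trained flow-matching network; `context` stands in
    for the neural-process conditioning set (interface is assumed)."""
    x = x0.clone()
    dt = 1.0 / num_steps
    for i in range(num_steps):
        t = torch.full((x.size(0), 1), i * dt)
        x = x + dt * velocity(x, t, context)
    return x
```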
- PairFlow: Closed-Form Source-Target Coupling for Few-Step Generation in Discrete Flow Models [24.911180095603658]
PairFlow is a lightweight preprocessing step for training Discrete Flow Models (DFMs). DFMs suffer from slow sampling due to their iterative nature. PairFlow matches or even surpasses the performance of two-stage training involving finetuning.
arXiv Detail & Related papers (2025-12-23T05:31:56Z)
- Joint Distillation for Fast Likelihood Evaluation and Sampling in Flow-based Models [100.28111930893188]
Some of today's best generative models still require hundreds to thousands of neural function evaluations (NFEs) to compute a single likelihood. We present fast flow joint distillation (F2D2), a framework that simultaneously reduces the number of NFEs required for both sampling and likelihood evaluation by two orders of magnitude. F2D2 is modular, compatible with existing flow-based few-step sampling models, and requires only an additional divergence prediction head.
arXiv Detail & Related papers (2025-12-02T10:48:20Z)
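The "divergence prediction head" in the F2D2 summary relates to likelihood evaluation via the instantaneous change-of-variables formula, where log-density accumulates the negative divergence of the velocity field along the ODE path. A hedged sketch of how a *predicted* divergence could replace the usual Hutchinson trace estimator; all interfaces and the few-step Euler scheme are assumptions, not the paper's algorithm:

```python
import math
import torch

@torch.no_grad()
def fast_log_likelihood(velocity, div_head, x1, num_steps: int = 4):
    """Integrate the flow backwards from data x1 to the Gaussian prior,
    accumulating the divergence predicted by `div_head` (a hypothetical
    network trained to output div v(x, t) directly, so no extra backprop
    or trace estimation is needed)."""
    x = x1.clone()
    div_int = torch.zeros(x1.size(0))
    dt = 1.0 / num_steps
    for i in range(num_steps):
        t = torch.full((x.size(0), 1), 1.0 - i * dt)
        div_int = div_int + dt * div_head(x, t).squeeze(-1)
        x = x - dt * velocity(x, t)   # reverse-time Euler step
    # change of variables: log p1(x1) = log p0(x0) - integral of div v dt
    d = x1.size(1)
    log_prior = -0.5 * (x ** 2).sum(dim=1) - 0.5 * d * math.log(2 * math.pi)
    return log_prior - div_int
```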
- Contrastive Flow Matching [61.60002028726023]
We introduce Contrastive Flow Matching, an extension of the flow matching objective that explicitly enforces uniqueness across all conditional flows. Our approach adds a contrastive objective that maximizes dissimilarity between predicted flows from arbitrary sample pairs. We find that training models with Contrastive Flow Matching (1) improves training speed by a factor of up to 9x, (2) requires up to 5x fewer denoising steps, and (3) lowers FID by up to 8.9 compared to training the same models with standard flow matching.
arXiv Detail & Related papers (2025-06-05T17:59:58Z)
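The contrastive term described in this abstract is easy to state in code. A minimal sketch, assuming a simple batch-roll to form the "arbitrary sample pairs" and a hypothetical weight `lam`:

```python
import torch

def contrastive_fm_loss(v_pred, v_target, lam: float = 0.05):
    """Usual flow-matching regression plus a repulsive term that pushes
    each prediction away from the target of a mismatched batch element
    (negatives formed by rolling the batch; `lam` is a hypothetical weight)."""
    positive = ((v_pred - v_target) ** 2).mean()
    v_negative = torch.roll(v_target, shifts=1, dims=0)  # mismatched pair
    negative = ((v_pred - v_negative) ** 2).mean()
    return positive - lam * negative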
- FlowTS: Time Series Generation via Rectified Flow [67.41208519939626]
FlowTS is an ODE-based model that leverages rectified flow with straight-line transport in probability space. In the unconditional setting, FlowTS achieves state-of-the-art performance, with context FID scores of 0.019 and 0.011 on the Stock and ETTh datasets. In the conditional setting, it achieves superior performance on solar forecasting.
arXiv Detail & Related papers (2024-11-12T03:03:23Z)
- Flow Matching with Gaussian Process Priors for Probabilistic Time Series Forecasting [43.951394031702016]
We introduce TSFlow, a conditional flow matching (CFM) model for time series combining Gaussian processes, optimal transport paths, and data-dependent prior distributions. We show that both conditionally and unconditionally trained models achieve competitive results across multiple forecasting benchmarks.
arXiv Detail & Related papers (2024-10-03T22:12:50Z)
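A data-dependent prior of the kind this summary mentions can be as simple as replacing the flow's white-noise source with Gaussian-process draws, so the source already has time-series-like smoothness. A sketch under that reading; the RBF kernel and lengthscale are illustrative assumptions, not TSFlow's exact design:

```python
import numpy as np

def gp_prior_samples(n_series: int, length: int, lengthscale: float = 10.0):
    """Draw source samples for a flow from a GP with an RBF kernel
    instead of i.i.d. Gaussian noise (kernel choice is an assumption)."""
    t = np.arange(length)[:, None].astype(float)
    K = np.exp(-0.5 * ((t - t.T) / lengthscale) ** 2)  # RBF kernel matrix
    K += 1e-6 * np.eye(length)                         # jitter for stability
    L = np.linalg.cholesky(K)
    return (L @ np.random.randn(length, n_series)).T   # (n_series, length)
```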
- Local Flow Matching Generative Models [19.859984725284896]
Flow Matching (FM) is a simulation-free method for learning a continuous and invertible flow to interpolate between two distributions. We introduce a stepwise FM model called Local Flow Matching (LFM), which consecutively learns a sequence of FM sub-models. We empirically demonstrate improved training efficiency and competitive generative performance of LFM compared to FM.
arXiv Detail & Related papers (2024-10-03T14:53:10Z)
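Sampling from a stepwise model like LFM amounts to composing the sub-models' ODE solves in sequence. A minimal sketch, with all interfaces assumed:

```python
import torch

@torch.no_grad()
def sample_local_fm(sub_models, x, steps_per_model: int = 10):
    """Each sub-model is a flow trained to bridge two nearby
    distributions; sampling chains their Euler solves, starting from
    the noise-side sub-model (interfaces are assumptions)."""
    dt = 1.0 / steps_per_model
    for velocity in sub_models:
        for i in range(steps_per_model):
            t = torch.full((x.size(0), 1), i * dt)
            x = x + dt * velocity(x, t)
    return x
```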
- PaddingFlow: Improving Normalizing Flows with Padding-Dimensional Noise [4.762593660623934]
We propose PaddingFlow, a novel dequantization method, which improves normalizing flows with padding-dimensional noise.
We validate our method on the main benchmarks of unconditional density estimation.
The results show that PaddingFlow performs better across all experiments reported in the paper.
arXiv Detail & Related papers (2024-03-13T03:28:39Z)
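The padding-dimensional noise idea reads, in its simplest form, as appending noise channels to each sample before it enters the flow. A sketch under that reading; the pad width and noise scale are illustrative, and PaddingFlow's exact noise design may differ:

```python
import torch

def pad_with_noise(x, pad_dims: int = 2, sigma: float = 1.0):
    """Append extra dimensions of Gaussian noise so the normalizing flow
    models a smoothed, higher-dimensional density (parameters are
    hypothetical choices for illustration)."""
    noise = sigma * torch.randn(x.size(0), pad_dims)
    return torch.cat([x, noise], dim=1)  # flow is trained on this input
```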
- Debias Coarsely, Sample Conditionally: Statistical Downscaling through Optimal Transport and Probabilistic Diffusion Models [15.623456909553786]
We introduce a two-stage probabilistic framework for statistical downscaling using unpaired data.
We demonstrate the utility of the proposed approach on one- and two-dimensional fluid flow problems.
arXiv Detail & Related papers (2023-05-24T23:40:23Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
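The minibatch optimal-transport coupling behind OT-CFM can be sketched with an exact assignment solver: within each batch, noise and data samples are re-paired to minimize total squared distance before the usual flow-matching loss is computed, which straightens the training paths. A minimal sketch using scipy; entropic/Sinkhorn couplings are a common alternative:

```python
import torch
from scipy.optimize import linear_sum_assignment

def ot_pair_minibatch(x0, x1):
    """Re-pair noise samples x0 with data samples x1 by solving an exact
    assignment on pairwise squared distances (Hungarian algorithm)."""
    cost = torch.cdist(x0, x1) ** 2                  # pairwise sq. distances
    row, col = linear_sum_assignment(cost.numpy())   # optimal permutation
    return x0[row], x1[col]
```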
- TrafficFlowGAN: Physics-informed Flow based Generative Adversarial Network for Uncertainty Quantification [4.215251065887861]
We propose TrafficFlowGAN, a physics-informed flow-based generative adversarial network (GAN) for uncertainty quantification (UQ) of dynamical systems.
This flow model is trained to maximize the data likelihood and to generate synthetic data that can fool a convolutional discriminator.
To the best of our knowledge, we are the first to propose an integration of flows, GANs, and physics-informed deep learning (PIDL) for UQ problems.
arXiv Detail & Related papers (2022-06-19T03:35:12Z)
- AutoFlow: Learning a Better Training Set for Optical Flow [62.40293188964933]
AutoFlow is a method to render training data for optical flow.
AutoFlow achieves state-of-the-art accuracy in pre-training both PWC-Net and RAFT.
arXiv Detail & Related papers (2021-04-29T17:55:23Z)
- DeFlow: Learning Complex Image Degradations from Unpaired Data with Conditional Flows [145.83812019515818]
We propose DeFlow, a method for learning image degradations from unpaired data.
We model the degradation process in the latent space of a shared flow-decoder network.
We validate our DeFlow formulation on the task of joint image restoration and super-resolution.
arXiv Detail & Related papers (2021-01-14T18:58:01Z)
- Autoregressive Score Matching [113.4502004812927]
We propose autoregressive conditional score models (AR-CSM), where we parameterize the joint distribution in terms of the derivatives of univariate log-conditionals (scores).
For AR-CSM models, the divergence between data and model distributions can be computed and optimized efficiently, requiring no expensive sampling or adversarial training.
We show with extensive experimental results that it can be applied to density estimation on synthetic data, image generation, image denoising, and training latent variable models with implicit encoders.
arXiv Detail & Related papers (2020-10-24T07:01:24Z)
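The AR-CSM parameterization factorizes the joint score into univariate conditional scores, one per dimension. A sketch of the decomposition; the per-dimension network interface is an assumption:

```python
import torch

def joint_score_from_conditionals(cond_score, x):
    """Assemble the joint score vector from univariate conditional scores
    s_i(x) = d/dx_i log p(x_i | x_<i); each hypothetical head `cond_score`
    sees only the preceding coordinates (autoregressive structure)."""
    scores = []
    for i in range(x.size(1)):
        scores.append(cond_score(x[:, : i + 1], index=i))
    return torch.stack(scores, dim=1)  # (batch, dim) score vector
```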
This list is automatically generated from the titles and abstracts of the papers on this site.