Modular MeanFlow: Towards Stable and Scalable One-Step Generative Modeling
- URL: http://arxiv.org/abs/2508.17426v1
- Date: Sun, 24 Aug 2025 16:00:08 GMT
- Title: Modular MeanFlow: Towards Stable and Scalable One-Step Generative Modeling
- Authors: Haochen You, Baojing Liu, Hongyang He
- Abstract summary: One-step generative modeling seeks to generate high-quality data samples in a single function evaluation. In this work, we introduce Modular MeanFlow, a flexible and theoretically grounded approach for learning time-averaged velocity fields.
- Score: 0.07646713951724012
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: One-step generative modeling seeks to generate high-quality data samples in a single function evaluation, significantly improving efficiency over traditional diffusion or flow-based models. In this work, we introduce Modular MeanFlow (MMF), a flexible and theoretically grounded approach for learning time-averaged velocity fields. Our method derives a family of loss functions based on a differential identity linking instantaneous and average velocities, and incorporates a gradient modulation mechanism that enables stable training without sacrificing expressiveness. We further propose a curriculum-style warmup schedule to smoothly transition from coarse supervision to fully differentiable training. The MMF formulation unifies and generalizes existing consistency-based and flow-matching methods, while avoiding expensive higher-order derivatives. Empirical results across image synthesis and trajectory modeling tasks demonstrate that MMF achieves competitive sample quality, robust convergence, and strong generalization, particularly under low-data or out-of-distribution settings.
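The differential identity the abstract refers to links the average velocity u(z_t, r, t) over an interval [r, t] to the instantaneous velocity v(z_t, t). The following is a minimal PyTorch sketch of how such a loss with gradient modulation could look; the network, the modulation rule, and all names here are assumptions read off the abstract, not the authors' released implementation:

```python
import torch
import torch.nn as nn

class AvgVelocityNet(nn.Module):
    """Toy network predicting the time-averaged velocity u(z, r, t)."""
    def __init__(self, dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 2, hidden), nn.SiLU(), nn.Linear(hidden, dim)
        )

    def forward(self, z, r, t):
        # z: (B, dim); r, t: (B,) interval endpoints with r <= t
        return self.net(torch.cat([z, r[:, None], t[:, None]], dim=-1))

def mmf_loss(model, z, v, r, t, lam=0.5):
    """Loss sketch built on the identity
        u(z_t, r, t) = v(z_t, t) - (t - r) * d/dt u(z_t, r, t),
    where d/dt is the total derivative along the flow, i.e. a JVP with
    tangents (dz/dt, dr/dt, dt/dt) = (v, 0, 1).
    """
    u, dudt = torch.autograd.functional.jvp(
        model,
        (z, r, t),
        (v, torch.zeros_like(r), torch.ones_like(t)),
        create_graph=True,
    )
    # Gradient modulation (assumed form): lam = 0 gives a MeanFlow-style
    # stop-gradient target, lam = 1 a fully differentiable one; a warmup
    # schedule would anneal lam upward during training.
    dudt_mod = lam * dudt + (1.0 - lam) * dudt.detach()
    target = v - (t - r)[:, None] * dudt_mod
    return ((u - target) ** 2).mean()
```

With this reading, the curriculum-style warmup from the abstract corresponds to scheduling `lam` from 0 (coarse, stop-gradient supervision) toward 1 (fully differentiable training), and no higher-order derivatives are needed beyond the single JVP.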
Related papers
- Plug, Play, and Fortify: A Low-Cost Module for Robust Multimodal Image Understanding Models [6.350443894942629]
Multimodal Weight Allocation Module (MWAM) is a plug-and-play component that dynamically re-balances the contribution of each branch during training. MWAM delivers consistent performance gains across a wide range of tasks and modality combinations.
arXiv Detail & Related papers (2026-02-26T05:51:41Z)
- SMoFi: Step-wise Momentum Fusion for Split Federated Learning on Heterogeneous Data [11.41105795202393]
Split Federated Learning uses rich computing resources at a central server to train model partitions. Data heterogeneity across silos presents a major challenge undermining the convergence speed and accuracy of the global model. This paper introduces Step-wise Momentum Fusion (SMoFi), an effective and lightweight framework that counteracts gradient divergence.
arXiv Detail & Related papers (2025-11-13T00:21:05Z)
- ScoreNF: Score-based Normalizing Flows for Sampling Unnormalized Distributions [5.204468049641428]
We propose ScoreNF, a score-based learning framework built on the Normalizing Flow architecture. We show that ScoreNF maintains high performance even with small training ensembles. We also present a method for assessing mode-covering and mode-collapse behaviours.
arXiv Detail & Related papers (2025-10-24T10:43:19Z)
- A-FloPS: Accelerating Diffusion Sampling with Adaptive Flow Path Sampler [21.134678093577193]
A-FloPS is a principled, training-free framework for flow-based generative models. We show that A-FloPS consistently outperforms state-of-the-art training-free samplers in both sample quality and efficiency. With as few as 5 function evaluations, A-FloPS achieves substantially lower FID and generates sharper, more coherent images.
arXiv Detail & Related papers (2025-08-22T13:28:16Z)
- Efficient Federated Learning with Timely Update Dissemination [54.668309196009204]
Federated Learning (FL) has emerged as a compelling methodology for the management of distributed data. We propose an efficient FL approach that capitalizes on additional downlink bandwidth resources to ensure timely update dissemination.
arXiv Detail & Related papers (2025-07-08T14:34:32Z)
- Improving Progressive Generation with Decomposable Flow Matching [50.63174319509629]
Decomposable Flow Matching (DFM) is a simple and effective framework for the progressive generation of visual media. On ImageNet-1k 512px, DFM achieves a 35.2% improvement in FDD score over the base architecture and 26.4% over the best-performing baseline.
arXiv Detail & Related papers (2025-06-24T17:58:02Z)
- Mean Flows for One-step Generative Modeling [64.4997821467102]
We propose a principled and effective framework for one-step generative modeling. A well-defined identity between average and instantaneous velocities is derived and used to guide neural network training. Our method, termed the MeanFlow model, is self-contained and requires no pre-training, distillation, or curriculum learning.
arXiv Detail & Related papers (2025-05-19T17:59:42Z)
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- Flow map matching with stochastic interpolants: A mathematical framework for consistency models [15.520853806024943]
Flow Map Matching is a principled framework for learning the two-time flow map of an underlying generative model. We show that FMM unifies and extends a broad class of existing approaches for fast sampling.
arXiv Detail & Related papers (2024-06-11T17:41:26Z)
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models to plan generation in the offline reinforcement learning setting, achieving a speedup compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
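For context on the flow-matching objective that MMF and several of the papers above build on, the basic CFM regression with an independent noise-data coupling can be sketched as follows (a minimal illustration under standard assumptions, not the code of any paper listed here):

```python
import torch
import torch.nn as nn

def cfm_loss(v_net, x1):
    """Basic CFM regression with an independent noise-data coupling.
    The straight conditional path x_t = (1 - t) * x0 + t * x1 has constant
    velocity x1 - x0, which the network regresses onto.
    """
    x0 = torch.randn_like(x1)          # noise endpoint, paired independently
    t = torch.rand(x1.shape[0], 1)     # uniform times in [0, 1)
    xt = (1 - t) * x0 + t * x1         # point on the conditional path
    target = x1 - x0                   # conditional (straight-line) velocity
    pred = v_net(torch.cat([xt, t], dim=-1))
    return ((pred - target) ** 2).mean()
```

Per the OT-CFM abstract, the variant would first re-pair the rows of `x0` and `x1` with a minibatch optimal-transport coupling before forming `xt`, which straightens the marginal flow; the regression itself is unchanged. This stable regression target is also the starting point that consistency-based one-step methods such as MeanFlow and MMF modify.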
This list is automatically generated from the titles and abstracts of the papers on this site.