SoFlow: Solution Flow Models for One-Step Generative Modeling
- URL: http://arxiv.org/abs/2512.15657v1
- Date: Wed, 17 Dec 2025 18:10:17 GMT
- Title: SoFlow: Solution Flow Models for One-Step Generative Modeling
- Authors: Tianze Luo, Haotian Yuan, Zhuang Liu
- Abstract summary: Solution Flow Models (SoFlow) is a framework for one-step generation from scratch. The Flow Matching loss allows our models to provide estimated velocity fields during training. Our models achieve better FID-50K scores than MeanFlow models on the ImageNet 256x256 dataset.
- Score: 10.054000663262618
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The multi-step denoising process in diffusion and Flow Matching models causes major efficiency issues, which motivates research on few-step generation. We present Solution Flow Models (SoFlow), a framework for one-step generation from scratch. By analyzing the relationship between the velocity function and the solution function of the velocity ordinary differential equation (ODE), we propose a Flow Matching loss and a solution consistency loss to train our models. The Flow Matching loss allows our models to provide estimated velocity fields for Classifier-Free Guidance (CFG) during training, which improves generation performance. Notably, our consistency loss does not require the calculation of the Jacobian-vector product (JVP), a common requirement in recent works that is not well-optimized in deep learning frameworks like PyTorch. Experimental results indicate that, when trained from scratch using the same Diffusion Transformer (DiT) architecture and an equal number of training epochs, our models achieve better FID-50K scores than MeanFlow models on the ImageNet 256x256 dataset.
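The abstract gives no implementation details, but the two losses can be pictured with a short sketch. The following PyTorch fragment is a hypothetical reading, not the authors' code: `model(x, t, s)` is assumed to be the learned solution map of the velocity ODE (transporting the state at time t to time s), the path is the standard linear Flow Matching interpolant, and the velocity is read off by a finite difference so that no JVP is needed.

```python
# Hypothetical sketch of a SoFlow-style training step -- NOT the authors' code.
# Assumptions: model(x, t, s) is the learned solution map of the velocity ODE
# (it transports the state x at time t to time s); the path is the standard
# linear Flow Matching interpolant x_t = (1 - t) * x0 + t * noise with
# conditional velocity v = noise - x0 (t = 0 is data, t = 1 is noise).
import torch

def soflow_style_losses(model, x0, noise, eps=1e-3):
    b = x0.shape[0]
    t = eps + (1 - eps) * torch.rand(b, 1, 1, 1, device=x0.device)  # current time
    s = torch.rand(b, 1, 1, 1, device=x0.device) * t                # jump target, s < t
    x_t = (1 - t) * x0 + t * noise
    v_target = noise - x0

    # Flow Matching loss: read the instantaneous velocity off the solution map
    # with a finite difference -- no Jacobian-vector product is required.
    v_pred = (x_t - model(x_t, t, t - eps)) / eps
    fm_loss = ((v_pred - v_target) ** 2).mean()

    # Solution consistency loss: jumping from t directly to s should agree with
    # first taking a small step along the target velocity and then jumping;
    # the second branch is detached, as in consistency training.
    x_prev = x_t - eps * v_target                                   # Euler step toward data
    sc_loss = ((model(x_t, t, s) - model(x_prev, t - eps, s).detach()) ** 2).mean()

    return fm_loss + sc_loss
```

Whether SoFlow uses a finite difference, a separate velocity head, or another construction is not stated in the abstract; the sketch only shows how a JVP-free pair of losses can fit together.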
Related papers
- FlowConsist: Make Your Flow Consistent with Real Trajectory [99.22869983378062]
We argue that current fast-flow training paradigms suffer from two fundamental issues: in particular, conditional velocities constructed from randomly paired noise-data samples introduce systematic trajectory drift. We propose FlowConsist, a training framework designed to enforce trajectory consistency in fast flows.
arXiv Detail & Related papers (2026-02-06T03:24:23Z)
- TwinFlow: Realizing One-step Generation on Large Models with Self-adversarial Flows [25.487712175353035]
We propose TwinFlow, a framework for training 1-step generative models. Our method achieves a GenEval score of 0.83 at 1 NFE on text-to-image tasks, and matches the performance of the original 100-NFE model on the GenEval and DPG-Bench benchmarks.
arXiv Detail & Related papers (2025-12-03T07:45:46Z)
- Decoupled MeanFlow: Turning Flow Models into Flow Maps for Accelerated Sampling [68.76215229126886]
We introduce Decoupled MeanFlow, a simple decoding strategy that converts flow models into flow map models without architectural modifications. Our method conditions the final blocks of diffusion transformers on the subsequent timestep, allowing pretrained flow models to be directly repurposed as flow maps. On ImageNet 256x256 and 512x512, our models attain 1-step FIDs of 2.16 and 2.12, respectively, surpassing prior art by a large margin.
arXiv Detail & Related papers (2025-10-28T14:43:48Z)
- GLASS Flows: Transition Sampling for Alignment of Flow and Diffusion Models [42.15046750300825]
We introduce GLASS Flows, a new sampling paradigm that simulates a "flow matching model" to sample Markov transitions. On large-scale text-to-image models, we show that GLASS Flows eliminate the trade-off between evolution and efficiency.
arXiv Detail & Related papers (2025-09-29T17:58:36Z)
- Mean Flows for One-step Generative Modeling [64.4997821467102]
We propose a principled and effective framework for one-step generative modeling. A well-defined identity between average and instantaneous velocities is derived and used to guide neural network training. Our method, termed the MeanFlow model, is self-contained and requires no pre-training, distillation, or curriculum learning.
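For reference, the identity in question relates the average velocity u over an interval [r, t] to the instantaneous velocity v of the underlying ODE (notation follows the MeanFlow paper up to minor adaptation):

$$u(z_t, r, t) = \frac{1}{t-r}\int_r^t v(z_\tau, \tau)\,d\tau \quad\Longrightarrow\quad u(z_t, r, t) = v(z_t, t) - (t - r)\,\frac{d}{dt}\,u(z_t, r, t),$$

obtained by differentiating $(t-r)\,u(z_t, r, t)$ with respect to t along the trajectory.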
arXiv Detail & Related papers (2025-05-19T17:59:42Z)
- FlowTS: Time Series Generation via Rectified Flow [67.41208519939626]
FlowTS is an ODE-based model that leverages rectified flow with straight-line transport in probability space. In the unconditional setting, FlowTS achieves state-of-the-art performance, with context FID scores of 0.019 and 0.011 on the Stock and ETTh datasets. In the conditional setting, it achieves superior performance on solar forecasting.
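As a reminder of what "straight-line transport" buys: rectified flow trains on linear interpolants whose conditional velocity is constant, so sampling reduces to a few Euler steps. A generic sketch follows (illustrative only, not the FlowTS implementation; shapes and names are assumptions):

```python
# Generic rectified-flow loss and Euler sampler for time series
# (illustrative; not the FlowTS code).
import torch

def rf_loss(model, x1):                        # x1: real series, shape (B, L, D)
    x0 = torch.randn_like(x1)                  # noise endpoint of the line
    t = torch.rand(x1.shape[0], 1, 1, device=x1.device)
    xt = (1 - t) * x0 + t * x1                 # straight-line interpolant
    return ((model(xt, t) - (x1 - x0)) ** 2).mean()  # constant target velocity

@torch.no_grad()
def sample(model, shape, steps=8, device="cpu"):
    x = torch.randn(shape, device=device)      # start at noise (t = 0)
    ts = torch.linspace(0.0, 1.0, steps + 1, device=device)
    for i in range(steps):                     # straight paths tolerate few steps
        t = ts[i].expand(shape[0], 1, 1)
        x = x + (ts[i + 1] - ts[i]) * model(x, t)
    return x
```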
arXiv Detail & Related papers (2024-11-12T03:03:23Z)
- Truncated Consistency Models [57.50243901368328]
Training consistency models requires learning to map all intermediate points along PF ODE trajectories to their corresponding endpoints. We empirically find that this training paradigm limits the one-step generation performance of consistency models. We propose a new parameterization of the consistency function and a two-stage training procedure that prevents truncated-time training from collapsing to a trivial solution.
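For context, a consistency function f maps every point x_t on a probability-flow (PF) ODE trajectory to the trajectory's data endpoint, f(x_t, t) = x_ε for all t, and the standard way to satisfy the boundary condition f(x, ε) = x is the skip parameterization of Song et al. (2023):

$$f_\theta(x, t) = c_{\mathrm{skip}}(t)\,x + c_{\mathrm{out}}(t)\,F_\theta(x, t), \qquad c_{\mathrm{skip}}(\epsilon) = 1,\; c_{\mathrm{out}}(\epsilon) = 0.$$

The paper above proposes a different parameterization; the formula is shown only as the baseline it departs from.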
arXiv Detail & Related papers (2024-10-18T22:38:08Z)
- Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve sample quality in conditional image generation and zero-shot text-to-speech synthesis. Notably, we are the first to apply flow models to plan generation in the offline reinforcement learning setting, with a speedup in computation compared to diffusion models.
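Guidance for flows mirrors classifier-free guidance for diffusion: the conditional and unconditional velocity fields are blended at sampling time with a weight w (this is the standard formulation; the paper's exact notation may differ):

$$\tilde{v}_t(x \mid c) = v_t(x) + w\,\bigl(v_t(x \mid c) - v_t(x)\bigr).$$

This is also the mechanism the SoFlow abstract above invokes when it uses estimated velocity fields for CFG during training.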
arXiv Detail & Related papers (2023-11-22T15:07:59Z)
- Diffusion-Model-Assisted Supervised Learning of Generative Models for Density Estimation [10.793646707711442]
We present a framework for training generative models for density estimation.
We use the score-based diffusion model to generate labeled data.
Once the labeled data are generated, we can train a simple fully connected neural network to learn the generative model in a supervised manner.
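The two-stage recipe can be sketched in a few lines. Everything below is a hypothetical illustration, not the paper's code: `diffusion_sampler` stands in for a pretrained score-based model that maps latent noise to samples, and the MLP is fit by plain regression on the resulting (noise, sample) pairs.

```python
# Hypothetical two-stage sketch of the recipe above -- not the paper's code.
# Stage 1: a pretrained score-based diffusion sampler pairs latent noise z
# with generated samples x, producing a labeled set {(z, x)}.
# Stage 2: a small fully connected network is fit to the pairs by regression.
import torch
import torch.nn as nn

def make_labeled_data(diffusion_sampler, n, dim, device="cpu"):
    z = torch.randn(n, dim, device=device)     # supervised inputs
    x = diffusion_sampler(z)                   # supervised targets
    return z, x

def fit_generator(z, x, hidden=256, epochs=200, lr=1e-3):
    net = nn.Sequential(
        nn.Linear(z.shape[1], hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, x.shape[1]),
    )
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = ((net(z) - x) ** 2).mean()      # plain MSE on the labeled pairs
        loss.backward()
        opt.step()
    return net
```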
arXiv Detail & Related papers (2023-10-22T23:56:19Z)