Categorical Flow Maps
- URL: http://arxiv.org/abs/2602.12233v1
- Date: Thu, 12 Feb 2026 18:10:46 GMT
- Title: Categorical Flow Maps
- Authors: Daan Roos, Oscar Davis, Floor Eijkelboom, Michael Bronstein, Max Welling, İsmail İlkan Ceylan, Luca Ambrogioni, Jan-Willem van de Meent
- Abstract summary: Categorical Flow Maps is a flow-matching method for accelerated few-step generation of categorical data via self-distillation. We achieve state-of-the-art few-step results on images, molecular graphs, and text, with strong performance even in single-step generation.
- Score: 42.309126712129384
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce Categorical Flow Maps, a flow-matching method for accelerated few-step generation of categorical data via self-distillation. Building on recent variational formulations of flow matching and the broader trend towards accelerated inference in diffusion and flow-based models, we define a flow map towards the simplex that transports probability mass toward a predicted endpoint, yielding a parametrisation that naturally constrains model predictions. Since our trajectories are continuous rather than discrete, Categorical Flow Maps can be trained with existing distillation techniques, as well as a new objective based on endpoint consistency. This continuous formulation also automatically unlocks test-time inference: we can directly reuse existing guidance and reweighting techniques in the categorical setting to steer sampling toward downstream objectives. Empirically, we achieve state-of-the-art few-step results on images, molecular graphs, and text, with strong performance even in single-step generation.
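The abstract's flow map toward the simplex can be pictured with a toy sketch. Everything below — the function names, the uniform initialisation, and the linear time schedule — is an illustrative assumption, not the paper's actual parametrisation:

```python
import numpy as np

def simplex_interpolate(x0, x1, t):
    """Convex combination of two points on the probability simplex;
    the result is again a valid distribution for t in [0, 1]."""
    return (1.0 - t) * x0 + t * x1

def few_step_sample(predict_endpoint, K, n_steps=4):
    """Hypothetical few-step sampler: start from the uniform
    distribution over K categories and repeatedly transport mass
    along the flow map toward the model's predicted endpoint."""
    x = np.full(K, 1.0 / K)
    times = np.linspace(0.0, 1.0, n_steps + 1)
    for s, t in zip(times[:-1], times[1:]):
        x1_hat = predict_endpoint(x, s)      # predicted simplex endpoint
        x = simplex_interpolate(x, x1_hat, (t - s) / (1.0 - s))
    return x

# Toy "model" that always predicts class 2 of K = 4.
predict = lambda x, s: np.eye(4)[2]
out = few_step_sample(predict, K=4, n_steps=4)
```

Because every intermediate state is a convex combination of simplex points, the trajectory is continuous rather than discrete, which is what makes existing (continuous) distillation and guidance techniques applicable in this setting.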
Related papers
- FlowConsist: Make Your Flow Consistent with Real Trajectory [99.22869983378062]
We argue that current fast-flow training paradigms suffer from two fundamental issues; in particular, conditional velocities constructed from randomly paired noise-data samples introduce systematic trajectory drift. We propose FlowConsist, a training framework designed to enforce trajectory consistency in fast flows.
arXiv Detail & Related papers (2026-02-06T03:24:23Z) - Test-time scaling of diffusions with flow maps [68.79792714591564]
A common recipe for improving diffusion models at test time is to introduce the gradient of the reward into the dynamics of the diffusion itself. We propose a simple solution by working directly with a flow map. By exploiting a relationship between the flow map and the velocity field governing the instantaneous transport, we construct an algorithm, Flow Map Trajectory Tilting (FMTT), which provably performs better ascent on the reward than standard test-time methods.
arXiv Detail & Related papers (2025-11-27T18:44:12Z) - FlowDrive: moderated flow matching with data balancing for trajectory planning [5.553127690929986]
FlowDrive is a flow-matching trajectory planner that learns a conditional rectified flow to map noise directly to trajectory distributions. FlowDrive achieves state-of-the-art results among learning-based planners and approaches methods with rule-based refinements.
arXiv Detail & Related papers (2025-09-26T06:49:22Z) - Align Your Flow: Scaling Continuous-Time Flow Map Distillation [63.927438959502226]
Flow maps connect any two noise levels in a single step and remain effective across all step counts. We extensively validate our flow map models, called Align Your Flow, on challenging image generation benchmarks. We show text-to-image flow map models that outperform all existing non-adversarially trained few-step samplers in text-conditioned synthesis.
arXiv Detail & Related papers (2025-06-17T15:06:07Z) - FlowMo: Variance-Based Flow Guidance for Coherent Motion in Video Generation [51.110607281391154]
FlowMo is a training-free guidance method for enhancing motion coherence in text-to-video models. It estimates motion coherence by measuring the patch-wise variance across the temporal dimension and guides the model to reduce this variance dynamically during sampling.
arXiv Detail & Related papers (2025-06-01T19:55:33Z) - How to build a consistency model: Learning flow maps via self-distillation [18.299322342860517]
Flow-based generative models achieve state-of-the-art sample quality, but require the expensive solution of a differential equation at inference time. These models lack a unified description that clearly explains how to learn them efficiently in practice. We present a systematic algorithmic framework for directly learning the flow map associated with a flow or diffusion model.
arXiv Detail & Related papers (2025-05-24T18:50:50Z) - SCoT: Unifying Consistency Models and Rectified Flows via Straight-Consistent Trajectories [31.60548236936739]
We propose a Straight Consistent Trajectory (SCoT) model for pre-trained diffusion models. SCoT enjoys the benefits of both approaches for fast sampling, producing trajectories with consistent and straight properties simultaneously.
arXiv Detail & Related papers (2025-02-24T08:57:19Z) - Flow Matching: Markov Kernels, Stochastic Processes and Transport Plans [1.9766522384767222]
We show how flow matching techniques can be used to solve inverse problems. We also briefly address continuous normalizing flows and score-matching techniques.
arXiv Detail & Related papers (2025-01-28T10:28:17Z) - Flow map matching with stochastic interpolants: A mathematical framework for consistency models [15.520853806024943]
Flow Map Matching is a principled framework for learning the two-time flow map of an underlying generative model. We show that FMM unifies and extends a broad class of existing approaches for fast sampling.
arXiv Detail & Related papers (2024-06-11T17:41:26Z) - Optimal Flow Matching: Learning Straight Trajectories in Just One Step [89.37027530300617]
We develop and theoretically justify the novel Optimal Flow Matching (OFM) approach.
It allows recovering the straight OT displacement for the quadratic transport in just one FM step.
The main idea of our approach is to employ vector fields for FM that are parameterized by convex functions.
arXiv Detail & Related papers (2024-03-19T19:44:54Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improve sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement-learning setting, with a speedup compared to diffusion models.
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow [16.41460104376002]
We introduce subset flows, a class of flows that can transform finite volumes and allow exact computation of likelihoods for discrete data.
We identify ordinal discrete autoregressive models, including WaveNets, PixelCNNs and Transformers, as single-layer flows.
We demonstrate state-of-the-art results on CIFAR-10 for flow models trained with dequantization.
arXiv Detail & Related papers (2020-02-06T22:58:51Z)
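Several of the entries above (flow map matching, consistency models learned via self-distillation, conditional flow matching) share a small core of machinery. The following is a minimal sketch under assumed, hypothetical interfaces — `v_theta`, `f_theta`, and `f_target` stand in for learned networks and are not the APIs of any of the cited papers:

```python
import numpy as np

def cfm_loss(v_theta, x0, x1, t):
    """Conditional flow-matching loss for the linear interpolant
    x_t = (1 - t) * x0 + t * x1, whose conditional velocity is x1 - x0."""
    x_t = (1.0 - t) * x0 + t * x1
    return np.mean((v_theta(x_t, t) - (x1 - x0)) ** 2)

def consistency_loss(f_theta, f_target, velocity, x_t, t, t_next):
    """Self-distillation objective sketch: the online flow map f_theta
    evaluated at time t should agree with a frozen target map evaluated
    after one Euler step of the probability-flow ODE (real frameworks
    use EMA targets and higher-order solvers)."""
    x_next = x_t + (t_next - t) * velocity(x_t, t)
    return np.mean((f_theta(x_t, t) - f_target(x_next, t_next)) ** 2)

# Sanity check on a toy constant-velocity flow: the exact endpoint map
# x -> x + (1 - t) * c makes the consistency loss vanish.
c = 2.0
vel = lambda x, t: np.full_like(x, c)
endpoint = lambda x, t: x + (1.0 - t) * c
x = np.array([0.5, -1.0])
loss = consistency_loss(endpoint, endpoint, vel, x, 0.3, 0.4)
```

The check illustrates the defining property these papers exploit: a correct flow map is invariant to where along the trajectory it is queried, so mismatches between the two evaluations provide a training signal without simulating the full ODE.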
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.