Unfolding Generative Flows with Koopman Operators: Fast and Interpretable Sampling
- URL: http://arxiv.org/abs/2506.22304v2
- Date: Tue, 21 Oct 2025 20:27:13 GMT
- Title: Unfolding Generative Flows with Koopman Operators: Fast and Interpretable Sampling
- Authors: Erkan Turan, Aristotelis Siozopoulos, Louis Martinez, Julien Gaubil, Emery Pierson, Maks Ovsjanikov
- Abstract summary: Continuous Normalizing Flows (CNFs) enable elegant generative modeling but remain bottlenecked by slow sampling. Recent approaches such as Rectified Flow and OT-CFM accelerate sampling by straightening trajectories, yet the learned dynamics remain nonlinear black boxes. We propose globally linearizing flow dynamics via Koopman theory.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Continuous Normalizing Flows (CNFs) enable elegant generative modeling but remain bottlenecked by slow sampling: producing a single sample requires solving a nonlinear ODE with hundreds of function evaluations. Recent approaches such as Rectified Flow and OT-CFM accelerate sampling by straightening trajectories, yet the learned dynamics remain nonlinear black boxes, limiting both efficiency and interpretability. We propose a fundamentally different perspective: globally linearizing flow dynamics via Koopman theory. By lifting Conditional Flow Matching (CFM) into a higher-dimensional Koopman space, we represent its evolution with a single linear operator. This yields two key benefits. First, sampling becomes one-step and parallelizable, computed in closed form via the matrix exponential. Second, the Koopman operator provides a spectral blueprint of generation, enabling novel interpretability through its eigenvalues and modes. We derive a practical, simulation-free training objective that enforces infinitesimal consistency with the teacher's dynamics and show that this alignment preserves fidelity along the full generative path, distinguishing our method from boundary-only distillation. Empirically, our approach achieves competitive sample quality with dramatic speedups, while uniquely enabling spectral analysis of generative flows.
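The closed-form, one-step sampling described in the abstract can be sketched numerically. The snippet below is a minimal illustrative sketch, not the paper's implementation: the encoder and decoder are hypothetical linear stand-ins for learned networks, and the Koopman generator is a random toy matrix. It only shows the mechanism of lifting a sample, evolving it linearly via the matrix exponential, and decoding.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative toy setup (all learned components replaced by stand-ins).
rng = np.random.default_rng(0)
d, k = 2, 8                        # data dimension, lifted (Koopman) dimension
W_enc = rng.normal(size=(k, d))    # stand-in for a learned encoder g(x)
W_dec = np.linalg.pinv(W_enc)      # stand-in for a learned decoder
A = 0.1 * rng.normal(size=(k, k))  # stand-in for the learned Koopman generator

def sample_one_step(x0, t=1.0):
    """Push a source sample x0 through the flow in closed form.

    Lift to Koopman space, evolve linearly with the matrix exponential
    exp(t*A), then decode back to data space -- no ODE solver needed.
    """
    z0 = W_enc @ x0           # lift: z0 = g(x0)
    zt = expm(t * A) @ z0     # linear evolution in Koopman space
    return W_dec @ zt         # decode back to data coordinates

x0 = rng.normal(size=d)       # draw from the source (noise) distribution
x1 = sample_one_step(x0)      # one-step sample via the matrix exponential
```

Because the evolution is a single linear map, many samples can be propagated in parallel with one matrix product, and the eigendecomposition of `A` gives the spectral view of generation the abstract refers to.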
Related papers
- Is Flow Matching Just Trajectory Replay for Sequential Data? [46.770624059457724]
Flow matching (FM) is increasingly used for time-series generation. It is not well understood whether it learns a general dynamical structure or simply performs an effective "trajectory replay". We show that the implied sampler is an ODE whose dynamics constitute a nonparametric, memory-augmented continuous-time dynamical system.
arXiv Detail & Related papers (2026-02-09T06:48:45Z) - Euphonium: Steering Video Flow Matching via Process Reward Gradient Guided Stochastic Dynamics [49.242224984144904]
We propose Euphonium, a novel framework that steers generation via process reward gradient guided dynamics. Our key insight is to formulate the sampling process as a theoretically principled algorithm that explicitly incorporates the gradient of a Process Reward Model. We derive a distillation objective that internalizes the guidance signal into the flow network, eliminating inference-time dependency on the reward model.
arXiv Detail & Related papers (2026-02-04T08:59:57Z) - FALCON: Few-step Accurate Likelihoods for Continuous Flows [78.37361800856583]
We propose Few-step Accurate Likelihoods for Continuous Flows (FALCON), which allows for few-step sampling with a likelihood accurate enough for importance-sampling applications. We show FALCON outperforms state-of-the-art normalizing flow models for molecular Boltzmann sampling and is two orders of magnitude faster than the equivalently performing CNF model.
arXiv Detail & Related papers (2025-12-10T18:47:25Z) - Sequence Modeling with Spectral Mean Flows [18.38715347739777]
A key question in sequence modeling with neural networks is how to represent and learn highly nonlinear and probabilistic state dynamics. We propose a new approach to sequence modeling based on an operator-theoretic view of a hidden Markov model (HMM). A generative process is then defined as a maximum mean discrepancy (MMD) gradient flow in the space of sequences.
arXiv Detail & Related papers (2025-10-17T06:56:57Z) - Hierarchical Koopman Diffusion: Fast Generation with Interpretable Diffusion Trajectory [30.327899232038863]
Hierarchical Koopman Diffusion is a novel framework that achieves both one-step sampling and interpretable generative trajectories. Our framework bridges the gap between fast sampling and interpretability in diffusion models, paving the way for explainable image synthesis in generative modeling.
arXiv Detail & Related papers (2025-10-14T07:17:35Z) - One-Step Offline Distillation of Diffusion-based Models via Koopman Modeling [26.913398550088477]
We introduce the Koopman Distillation Model (KDM), a novel offline distillation approach grounded in Koopman theory. KDM encodes noisy inputs into an embedded space where a learned linear operator propagates them forward, followed by a decoder that reconstructs clean samples. KDM achieves highly competitive performance across standard offline distillation benchmarks.
arXiv Detail & Related papers (2025-05-19T16:59:47Z) - FlowDAS: A Stochastic Interpolant-based Framework for Data Assimilation [15.64941169350615]
Data assimilation (DA) integrates observations with a dynamical model to estimate states of PDE-governed systems. FlowDAS is a generative DA framework that uses stochastic interpolants to learn state transition dynamics. We show that FlowDAS surpasses model-driven methods, neural operators, and score-based baselines in accuracy and physical plausibility.
arXiv Detail & Related papers (2025-01-13T05:03:41Z) - Local Flow Matching Generative Models [19.859984725284896]
Local Flow Matching (LFM) is a computational framework for density estimation based on flow-based generative models. LFM employs a simulation-free scheme and incrementally learns a sequence of Flow Matching sub-models. We demonstrate the improved training efficiency and competitive generative performance of LFM compared to FM.
arXiv Detail & Related papers (2024-10-03T14:53:10Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences. We train the model using maximum likelihood estimation with Markov chain Monte Carlo. Experimental results on oscillating systems, videos, and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
arXiv Detail & Related papers (2024-07-02T16:15:37Z) - Koopman-Based Surrogate Modelling of Turbulent Rayleigh-Bénard Convection [4.248022697109535]
We use a Koopman-inspired architecture called the Linear Recurrent Autoencoder Network (LRAN) for learning reduced-order dynamics in convection flows.
A traditional fluid dynamics method, Kernel Dynamic Mode Decomposition (KDMD), is used as a baseline for comparison with the LRAN.
We obtained more accurate predictions with the LRAN than with KDMD in the most turbulent setting.
arXiv Detail & Related papers (2024-05-10T12:15:02Z) - Variational Flow Models: Flowing in Your Style [32.913511518425864]
We transform the probability flow of a "linear" process into a straight constant-speed (SC) flow, reminiscent of Rectified Flow.
This transformation facilitates fast sampling along the original probability flow via the Euler method without training a new model of the SC flow.
We can easily integrate high-order numerical solvers into the transformed SC flow, further enhancing the sampling accuracy and efficiency.
arXiv Detail & Related papers (2024-02-05T12:58:29Z) - Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase-space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z) - DiffuSeq-v2: Bridging Discrete and Continuous Text Spaces for Accelerated Seq2Seq Diffusion Models [58.450152413700586]
We introduce a soft absorbing state that facilitates the diffusion model in learning to reconstruct discrete mutations based on the underlying Gaussian space.
We employ state-of-the-art ODE solvers within the continuous space to expedite the sampling process.
Our proposed method effectively accelerates the training convergence by 4x and generates samples of similar quality 800x faster.
arXiv Detail & Related papers (2023-10-09T15:29:10Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers [90.45898746733397]
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs gradient ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current gradient.
arXiv Detail & Related papers (2023-03-06T18:59:19Z) - Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference.
arXiv Detail & Related papers (2023-02-01T14:47:17Z) - Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z) - Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z) - Towards extraction of orthogonal and parsimonious non-linear modes from turbulent flows [0.0]
We propose a deep probabilistic-neural-network architecture for learning a minimal and near-orthogonal set of non-linear modes.
Our approach is based on β-variational autoencoders (β-VAEs) and convolutional neural networks (CNNs).
arXiv Detail & Related papers (2021-09-03T13:38:51Z) - Deep Learning Enhanced Dynamic Mode Decomposition [0.0]
We use convolutional autoencoder networks to simultaneously find optimal families of observables.
We also generate both accurate embeddings of the flow into a space of observables and immersions of the observables back into flow coordinates.
This network results in a global transformation of the flow and affords future state prediction via EDMD and the decoder network.
arXiv Detail & Related papers (2021-08-10T03:54:23Z) - Discrete Denoising Flows [87.44537620217673]
We introduce a new discrete flow-based model for categorical random variables: Discrete Denoising Flows (DDFs).
In contrast with other discrete flow-based models, our model can be locally trained without introducing gradient bias.
We show that DDFs outperform Discrete Flows on modeling a toy example, binary MNIST and Cityscapes segmentation maps, measured in log-likelihood.
arXiv Detail & Related papers (2021-07-24T14:47:22Z) - Flow-based Spatio-Temporal Structured Prediction of Motion Dynamics [21.24885597341643]
Conditional Normalizing Flows (CNFs) are flexible generative models capable of representing complicated distributions with high dimensionality and interdimensional correlations.
We propose MotionFlow as a novel approach that autoregressively normalizes the output on the temporal input features.
We apply our method to different tasks, including motion prediction, time-series forecasting, and binary segmentation.
arXiv Detail & Related papers (2021-04-09T14:30:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.