An Eulerian Perspective on Straight-Line Sampling
- URL: http://arxiv.org/abs/2510.11657v1
- Date: Mon, 13 Oct 2025 17:33:58 GMT
- Title: An Eulerian Perspective on Straight-Line Sampling
- Authors: Panos Tsimpos, Youssef Marzouk
- Abstract summary: We study dynamic measure transport for generative modeling, specifically, flows induced by processes that bridge a specified source and target distribution. We ask *which processes produce straight-line flows* -- i.e., flows whose pointwise acceleration vanishes and thus are exactly integrable with a first-order method? We provide a concise PDE characterization of straightness as a balance between conditional acceleration and the divergence of a weighted covariance (Reynolds) tensor.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study dynamic measure transport for generative modeling: specifically, flows induced by stochastic processes that bridge a specified source and target distribution. The conditional expectation of the process' velocity defines an ODE whose flow map achieves the desired transport. We ask \emph{which processes produce straight-line flows} -- i.e., flows whose pointwise acceleration vanishes and thus are exactly integrable with a first-order method? We provide a concise PDE characterization of straightness as a balance between conditional acceleration and the divergence of a weighted covariance (Reynolds) tensor. Using this lens, we fully characterize affine-in-time interpolants and show that straightness occurs exactly under deterministic endpoint couplings. We also derive necessary conditions that constrain flow geometry for general processes, offering broad guidance for designing transports that are easier to integrate.
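The abstract's central claim can be checked on a toy example. Below is a minimal numpy sketch (my own illustration, not code from the paper): under a deterministic endpoint coupling X1 = T(X0) with the affine interpolant X_t = (1-t)*X0 + t*X1, every path is a straight line, so a single forward-Euler step of the induced ODE already lands exactly on the target. The map T and the closed-form velocity below are assumptions chosen so that v(x,t) = E[X1 - X0 | X_t = x] can be written down by hand.

```python
import numpy as np

# Toy setup (illustrative, not from the paper): deterministic coupling
# X1 = T(X0) with T(x) = 2x + 1 and interpolant X_t = (1-t) X0 + t X1.
# Then X_t = (1+t) X0 + t, the path velocity X1 - X0 = X0 + 1 is constant
# along each path, and conditioning on X_t = x gives the marginal field
#   v(x, t) = (x + 1) / (1 + t).

def T(x):
    return 2.0 * x + 1.0

def velocity(x, t):
    # Closed form for this specific deterministic coupling; a stochastic
    # (e.g. independent) coupling would generally bend the paths instead.
    return (x + 1.0) / (1.0 + t)

rng = np.random.default_rng(0)
x0 = rng.standard_normal(5)

# One forward-Euler step over the whole interval [0, 1]:
x1_one_step = x0 + 1.0 * velocity(x0, 0.0)

# Fine Euler integration for comparison:
x, n = x0.copy(), 1000
dt = 1.0 / n
for k in range(n):
    x = x + dt * velocity(x, k * dt)

# Because the paths are straight, both agree with T(x0) up to
# floating-point rounding -- no discretization error to reduce.
print(np.max(np.abs(x1_one_step - T(x0))))
print(np.max(np.abs(x - T(x0))))
```

With a non-deterministic coupling (say, independent X0 and X1), the conditional-expectation velocity averages over many endpoints, the acceleration term in the paper's PDE balance no longer vanishes, and first-order integration incurs genuine error.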
Related papers
- Flow Matching is Adaptive to Manifold Structures [32.55405572762157]
Flow matching is a simulation-based alternative to diffusion-based generative modeling.
We show how flow matching adapts to data geometry and circumvents the curse of dimensionality.
arXiv Detail & Related papers (2026-02-25T23:52:32Z)
- Entropy-Controlled Flow Matching [0.08460698440162889]
We propose a constrained variational principle over continuity-equation paths enforcing a global entropy-rate budget d/dt H(mu_t) >= -lambda.
We obtain certificate-style mode-coverage and density-floor guarantees under Lipschitz assumptions, and construct near-optimal counterexamples for unconstrained flow matching.
arXiv Detail & Related papers (2026-02-25T06:07:01Z)
- FlowConsist: Make Your Flow Consistent with Real Trajectory [99.22869983378062]
We argue that current fast-flow training paradigms suffer from two fundamental issues.
Conditional velocities constructed from randomly paired noise-data samples introduce systematic trajectory drift.
We propose FlowConsist, a training framework designed to enforce trajectory consistency in fast flows.
arXiv Detail & Related papers (2026-02-06T03:24:23Z)
- Riemannian Flow Matching for Disentangled Graph Domain Adaptation [51.98961391065951]
Graph Domain Adaptation (GDA) typically uses adversarial learning to align graph embeddings in Euclidean space.
DisRFM is a geometry-aware GDA framework that unifies embedding and flow-based transport.
arXiv Detail & Related papers (2026-01-31T11:05:35Z)
- On the Relation between Rectified Flows and Optimal Transport [6.493334597338973]
Rectified flow matching aims to straighten the learned transport paths, yielding more direct flows between distributions.
Recent claims suggest that rectified flows, when constrained such that the learned velocity field is a gradient, can yield solutions to optimal transport problems.
We present several counterexamples that invalidate earlier equivalence results in the literature, and we argue that enforcing a gradient constraint on rectified flows is, in general, not a reliable method for computing optimal transport maps.
arXiv Detail & Related papers (2025-05-26T09:01:53Z)
- Flow Matching: Markov Kernels, Stochastic Processes and Transport Plans [1.9766522384767222]
We show how flow matching techniques can be used to solve inverse problems.
We also briefly address continuous normalizing flows and score matching techniques.
arXiv Detail & Related papers (2025-01-28T10:28:17Z)
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- Optimal Flow Matching: Learning Straight Trajectories in Just One Step [89.37027530300617]
We develop and theoretically justify the novel Optimal Flow Matching (OFM) approach.
It recovers the straight OT displacement for the quadratic transport cost in just one FM step.
The main idea of our approach is to employ vector fields for FM that are parameterized by convex functions.
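The "vector fields parameterized by convex functions" idea has a simple one-dimensional illustration. The sketch below (my own construction, not the OFM code) builds a tiny convex potential Psi(x) = 0.5*c*x^2 + a*softplus(w*x + b) with a, c >= 0; its derivative is then a monotone map, the 1-D analogue of the gradient-of-convex structure that Brenier's theorem associates with quadratic-cost optimal transport.

```python
import numpy as np

# Illustrative 1-D convex-potential parameterization (an assumption for
# demonstration, not the paper's architecture):
#   Psi(x) = 0.5*c*x^2 + a*softplus(w*x + b),  with a >= 0 and c >= 0.
# Psi is convex, so v = Psi' is monotone non-decreasing.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def make_velocity(a, w, b, c):
    assert a >= 0.0 and c >= 0.0, "non-negativity keeps Psi convex"
    def v(x):
        # v(x) = dPsi/dx = c*x + a*w*sigmoid(w*x + b)
        return c * x + a * w * sigmoid(w * x + b)
    return v

v = make_velocity(a=1.0, w=2.0, b=0.0, c=0.5)
xs = np.linspace(-3.0, 3.0, 100)
vals = v(xs)
# Monotonicity check: the derivative of a convex function never decreases.
assert np.all(np.diff(vals) >= 0.0)
```

In practice one would use an input-convex neural network for Psi rather than this two-parameter family, but the constraint being enforced is the same: the learned map is the gradient of a convex function.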
arXiv Detail & Related papers (2024-03-19T19:44:54Z)
- Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow [32.459587479351846]
We present rectified flow, a surprisingly simple approach to learning (neural) ordinary differential equation (ODE) models.
We show that rectified flow performs superbly on image generation, image-to-image translation, and domain adaptation.
arXiv Detail & Related papers (2022-09-07T08:59:55Z)
- Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z)
- A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation.
Compared with existing schemes, the Wasserstein gradient flow is a smoother and near-optimal numerical scheme for approximating real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.