Physics-Informed Design of Input Convex Neural Networks for Consistency Optimal Transport Flow Matching
- URL: http://arxiv.org/abs/2511.06042v1
- Date: Sat, 08 Nov 2025 15:30:55 GMT
- Title: Physics-Informed Design of Input Convex Neural Networks for Consistency Optimal Transport Flow Matching
- Authors: Fanghui Song, Zhongjian Wang, Jiebao Sun
- Abstract summary: A physics-informed design of partially input-convex neural networks (PICNN) plays a central role in constructing the flow field that emulates the displacement interpolation. During the prediction stage, our approach supports both one-step (Brenier-map) and multi-step ODE sampling from the same learned potential, leveraging the straightness of the OT flow.
- Score: 1.3709465727733763
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We propose a consistency model based on the optimal-transport flow. A physics-informed design of partially input-convex neural networks (PICNN) plays a central role in constructing the flow field that emulates the displacement interpolation. During the training stage, we couple the Hamilton-Jacobi (HJ) residual in the OT formulation with the original flow matching loss function. Our approach avoids inner optimization subproblems that are present in previous one-step OFM approaches. During the prediction stage, our approach supports both one-step (Brenier-map) and multi-step ODE sampling from the same learned potential, leveraging the straightness of the OT flow. We validate scalability and performance on standard OT benchmarks.
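To fix ideas, here is a minimal sketch of the two ingredients the abstract names: a potential network convex in $x$ (a toy stand-in for the paper's PICNN) and a training loss that couples the flow-matching term with the Hamilton-Jacobi residual $\partial_t\Phi + \tfrac12\|\nabla_x\Phi\|^2 = 0$. The architecture details, couplings, and loss weighting below are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyPICNN(nn.Module):
    """Potential Phi(t, x), convex in x for every t: a minimal PICNN-style sketch."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.ctx = nn.Sequential(nn.Linear(1, hidden), nn.Tanh())  # unconstrained path in t
        self.Wx0 = nn.Linear(dim, hidden)
        self.Wx1 = nn.Linear(dim, hidden)
        self.Wz = nn.Parameter(0.1 * torch.rand(hidden, hidden))   # clamped >= 0 in forward
        self.wout = nn.Parameter(0.1 * torch.rand(hidden))         # clamped >= 0 in forward

    def forward(self, t, x):
        c = self.ctx(t)
        z = F.relu(self.Wx0(x) + c)                                      # convex in x
        z = F.relu(F.linear(z, self.Wz.clamp(min=0)) + self.Wx1(x) + c)  # stays convex in x
        return (z * self.wout.clamp(min=0)).sum(-1, keepdim=True)

def hj_flow_matching_loss(phi, x0, x1, t):
    # Straight interpolant x_t = (1-t) x0 + t x1; the FM target velocity is x1 - x0.
    xt = ((1 - t) * x0 + t * x1).detach().requires_grad_(True)
    tt = t.detach().requires_grad_(True)
    p = phi(tt, xt)
    grad_x, grad_t = torch.autograd.grad(p.sum(), (xt, tt), create_graph=True)
    fm = ((grad_x - (x1 - x0)) ** 2).mean()            # flow matching on v = grad_x Phi
    hj = ((grad_t + 0.5 * (grad_x ** 2).sum(-1, keepdim=True)) ** 2).mean()  # HJ residual
    return fm + hj                                     # equal weighting is an assumption
```

With a potential of this form, one-step (Brenier-style) sampling would read $x_1 \approx x_0 + \nabla_x\Phi(0, x_0)$, since the OT velocity is constant along the straight rays.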
Related papers
- Variational Entropic Optimal Transport [67.76725267984578]
We propose Variational Entropic Optimal Transport (VarEOT) for domain translation problems. VarEOT is based on an exact variational reformulation of the log-partition $\log \mathbb{E}[\exp(\cdot)]$ as a tractable optimization over an auxiliary positive normalizer. Experiments on synthetic data and unpaired image-to-image translation demonstrate competitive or improved translation quality.
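For context, the classical variational identity behind reformulations of this kind (my reconstruction, not text from the paper) reads, for a random variable $f$,

$$\log \mathbb{E}\big[e^{f}\big] \;=\; \min_{c>0}\left\{ \frac{\mathbb{E}\big[e^{f}\big]}{c} + \log c - 1 \right\},$$

with the minimum attained at $c = \mathbb{E}[e^{f}]$; replacing the exact minimization by a learned positive normalizer makes the log-partition tractable inside a training objective.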
arXiv Detail & Related papers (2026-02-02T15:48:44Z) - HFNO: an interpretable data-driven decomposition strategy for turbulent flows [0.0]
We present a novel FNO-based architecture tailored for reduced-order modeling of turbulent fluid flows. The proposed architecture processes wavenumber bins in parallel, enabling approximation of dispersion relations and non-linear interactions. We evaluate the proposed model on a series of increasingly complex dynamical systems.
arXiv Detail & Related papers (2025-11-03T12:57:19Z) - Optimal Flow Matching: Learning Straight Trajectories in Just One Step [89.37027530300617]
We develop and theoretically justify the novel Optimal Flow Matching (OFM) approach.
It allows recovering the straight OT displacement for the quadratic transport in just one FM step.
The main idea of our approach is to employ vector fields for FM that are parameterized by convex functions.
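To make the claim concrete (standard quadratic-cost OT facts, not text from the paper): Brenier's theorem gives the OT map as the gradient of a convex potential, and the induced displacement is a straight line,

$$T = \nabla\psi,\quad \psi \text{ convex}, \qquad x_t = (1-t)\,x_0 + t\,\nabla\psi(x_0), \qquad v(x_t, t) = \nabla\psi(x_0) - x_0,$$

so a vector field parameterized through a convex $\psi$ can represent the straight OT flow exactly, and a single FM step along it lands on the target.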
arXiv Detail & Related papers (2024-03-19T19:44:54Z) - Guided Flows for Generative Modeling and Decision Making [55.42634941614435]
We show that Guided Flows significantly improves the sample quality in conditional image generation and zero-shot text-to-speech synthesis.
Notably, we are the first to apply flow models for plan generation in the offline reinforcement learning setting, showing a 10x speedup in computation compared to diffusion models.
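The guidance rule here is, to my understanding, the classifier-free combination applied to the flow's velocity field; a one-line sketch (the function name and weighting convention are assumptions):

```python
def guided_velocity(v_cond, v_uncond, w: float):
    # classifier-free-style guidance on a flow-matching vector field:
    # w = 0 recovers the unconditional flow; larger w sharpens conditioning
    return v_uncond + w * (v_cond - v_uncond)
```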
arXiv Detail & Related papers (2023-11-22T15:07:59Z) - Free-form Flows: Make Any Architecture a Normalizing Flow [8.163244519983298]
We develop a training procedure that uses an efficient estimator for the gradient of the change of variables formula.
This enables any dimension-preserving neural network to serve as a generative model through maximum likelihood training.
We achieve excellent results in molecule generation benchmarks utilizing $E(n)$-equivariant networks.
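The gradient estimator rests on trace estimates of Jacobian products; below is a sketch of the Hutchinson-style ingredient, where `g` stands for a learned approximate inverse of `f` (the paper's exact surrogate and stop-gradient placement may differ):

```python
import torch
from torch.func import jvp

def trace_JgJf(f, g, x, n_probes=1):
    """Estimate tr(J_g(f(x)) @ J_f(x)) with Rademacher probes: tr(A) ~ E_v[v^T A v]."""
    est = x.new_zeros(())
    for _ in range(n_probes):
        v = torch.randint_like(x, 0, 2) * 2 - 1      # Rademacher +/-1 probe
        z, Jf_v = jvp(f, (x,), (v,))                 # J_f(x) v via forward-mode AD
        _, JgJf_v = jvp(g, (z,), (Jf_v,))            # J_g(z) J_f(x) v
        est = est + (v * JgJf_v).sum()
    return est / n_probes
```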
arXiv Detail & Related papers (2023-10-25T13:23:08Z) - Computing high-dimensional optimal transport by flow neural networks [19.859984725284896]
This work proposes to compute the dynamic OT between two arbitrary distributions $P$ and $Q$ by optimizing a flow model. Our method learns the dynamic OT by finding an invertible flow that minimizes the transport cost. The effectiveness of the proposed model on high-dimensional data is demonstrated by strong empirical performance against OT baselines, on image-to-image translation, and on high-dimensional density-ratio estimation (DRE).
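"Dynamic OT" here refers to the Benamou-Brenier formulation (standard background, not quoted from the paper):

$$W_2^2(P,Q) \;=\; \min_{\rho,\,v}\int_0^1\!\!\int \|v(x,t)\|^2\,\rho(x,t)\,dx\,dt \quad\text{s.t.}\quad \partial_t\rho + \nabla\!\cdot\!(\rho v)=0,\;\; \rho_0 = P,\;\; \rho_1 = Q,$$

and the flow model parameterizes an invertible map whose velocity field $v$ is trained to minimize this transport cost.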
arXiv Detail & Related papers (2023-05-19T17:48:21Z) - Normalizing flow neural networks by JKO scheme [22.320632565424745]
We develop a neural ODE flow network called JKO-iFlow, inspired by the Jordan-Kinderlehrer-Otto (JKO) scheme.
The proposed method stacks residual blocks one after another, allowing efficient block-wise training.
Experiments with synthetic and real data show that the proposed JKO-iFlow network achieves competitive performance.
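For reference, the JKO step that the block-wise construction mirrors takes the standard proximal form

$$\rho_{k+1} \;=\; \operatorname*{arg\,min}_{\rho}\; F(\rho) \;+\; \frac{1}{2h}\,W_2^2(\rho,\rho_k),$$

so each residual block plays the role of one Wasserstein proximal step of size $h$, which is what allows the blocks to be trained one at a time.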
arXiv Detail & Related papers (2022-12-29T18:55:00Z) - Manifold Interpolating Optimal-Transport Flows for Trajectory Inference [64.94020639760026]
We present a method called Manifold Interpolating Optimal-Transport Flow (MIOFlow).
MIOFlow learns continuous population dynamics from static snapshot samples taken at sporadic timepoints.
We evaluate our method on simulated data with bifurcations and merges, as well as scRNA-seq data from embryoid body differentiation, and acute myeloid leukemia treatment.
arXiv Detail & Related papers (2022-06-29T22:19:03Z) - GMFlow: Learning Optical Flow via Global Matching [124.57850500778277]
We propose the GMFlow framework for optical flow estimation.
It consists of three main components: a customized Transformer for feature enhancement, a correlation and softmax layer for global feature matching, and a self-attention layer for flow propagation.
Our new framework outperforms 32-iteration RAFT on the challenging Sintel benchmark.
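A compact sketch of the "correlation and softmax" global-matching step as described (a simplification with assumed tensor shapes, not the authors' implementation):

```python
import torch

def global_matching_flow(feat1, feat2):
    # feat1, feat2: (B, C, H, W) feature maps of the two frames
    B, C, H, W = feat1.shape
    f1 = feat1.flatten(2).transpose(1, 2)              # (B, H*W, C)
    f2 = feat2.flatten(2).transpose(1, 2)              # (B, H*W, C)
    corr = f1 @ f2.transpose(1, 2) / C ** 0.5          # (B, H*W, H*W) correlation volume
    prob = corr.softmax(dim=-1)                        # matching distribution over frame 2
    ys, xs = torch.meshgrid(torch.arange(H, dtype=feat1.dtype),
                            torch.arange(W, dtype=feat1.dtype), indexing="ij")
    grid = torch.stack([xs, ys], dim=-1).reshape(1, H * W, 2)  # pixel coordinates
    matched = prob @ grid                              # expected frame-2 coordinates
    flow = (matched - grid).reshape(B, H, W, 2).permute(0, 3, 1, 2)
    return flow                                        # (B, 2, H, W) displacement field
```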
arXiv Detail & Related papers (2021-11-26T18:59:56Z) - An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where the time-dependent parameters of the main flow evolve according to a matrix flow on the orthogonal group O(d).
This nested system of two flows stabilizes training and provably solves the vanishing/exploding gradient problem.
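The stabilizing mechanism is a flow on the orthogonal group; the standard way such a flow is generated (a background fact, with the paper's exact parameterization omitted) is

$$\dot W(t) = W(t)\,A(t), \qquad A(t)^\top = -A(t) \;\Rightarrow\; \frac{d}{dt}\,W(t)^\top W(t) = 0,$$

so if $W(0) \in O(d)$ the weights stay orthogonal for all $t$, keeping singular values at 1 and preventing gradients from vanishing or exploding.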
arXiv Detail & Related papers (2020-06-19T22:05:19Z) - OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport [8.468007443062751]
A normalizing flow is an invertible mapping between an arbitrary probability distribution and a standard normal distribution.
OT-Flow tackles two critical computational challenges that limit a more widespread use of CNFs.
On five high-dimensional density estimation and generative modeling tasks, OT-Flow performs competitively to state-of-the-art CNFs.
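The maximum-likelihood objective underlying such CNFs is the change-of-variables identity (standard background):

$$\log p_X(x) \;=\; \log p_Z\big(f(x)\big) \;+\; \log\left|\det \nabla f(x)\right|,$$

where $f$ maps data to the standard normal; to my understanding, OT-Flow addresses the cost of the log-determinant and the ODE integration by adding optimal-transport-motivated regularization that encourages straight, cheap-to-integrate trajectories.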
arXiv Detail & Related papers (2020-05-29T22:31:10Z)