Flow marching for a generative PDE foundation model
- URL: http://arxiv.org/abs/2509.18611v1
- Date: Tue, 23 Sep 2025 04:00:41 GMT
- Title: Flow marching for a generative PDE foundation model
- Authors: Zituo Chen, Sili Deng
- Abstract summary: We propose Flow Marching, an algorithm that bridges neural operator learning with flow matching, motivated by an analysis of error accumulation in physical dynamical systems. We also introduce a Physics-Pretrained Variational Autoencoder (P2VAE) to embed physical trajectories into a compact latent space. We curate a corpus of ~2.5M trajectories across 12 distinct PDE families and train suites of P2VAEs and FMTs at multiple scales.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Pretraining on large-scale collections of PDE-governed spatiotemporal trajectories has recently shown promise for building generalizable models of dynamical systems. Yet most existing PDE foundation models rely on deterministic Transformer architectures, which lack generative flexibility for many science and engineering applications. We propose Flow Marching, an algorithm that bridges neural operator learning with flow matching motivated by an analysis of error accumulation in physical dynamical systems, and we build a generative PDE foundation model on top of it. By jointly sampling the noise level and the physical time step between adjacent states, the model learns a unified velocity field that transports a noisy current state toward its clean successor, reducing long-term rollout drift while enabling uncertainty-aware ensemble generations. Alongside this core algorithm, we introduce a Physics-Pretrained Variational Autoencoder (P2VAE) to embed physical states into a compact latent space, and an efficient Flow Marching Transformer (FMT) that combines a diffusion-forcing scheme with latent temporal pyramids, achieving up to 15x greater computational efficiency than full-length video diffusion models and thereby enabling large-scale pretraining at substantially reduced cost. We curate a corpus of ~2.5M trajectories across 12 distinct PDE families and train suites of P2VAEs and FMTs at multiple scales. On downstream evaluation, we benchmark on unseen Kolmogorov turbulence with few-shot adaptation, demonstrate long-term rollout stability over deterministic counterparts, and present uncertainty-stratified ensemble results, highlighting the importance of generative PDE foundation models for real-world applications.
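The core training idea described in the abstract, jointly sampling a noise level and learning a velocity field that transports a noisy current state toward its clean successor, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the model signature, the linear interpolation path, and the `sigma_max` noise scale are all assumptions for illustration.

```python
import torch

def flow_marching_step(model, x_t, x_next, sigma_max=1.0):
    """One hypothetical Flow Marching training step (sketch, not the paper's code).

    Given an adjacent-state pair (x_t, x_next) from a trajectory, sample a
    per-sample interpolation level s, noise the current state, and regress the
    model onto the constant velocity pointing from the noisy current state to
    the clean successor.
    """
    b = x_t.shape[0]
    # Sample one interpolation level per trajectory in [0, 1).
    s = torch.rand(b, 1, 1, 1)
    noise = torch.randn_like(x_t) * sigma_max
    x_noisy = x_t + noise
    # Point on the straight-line path from the noisy state to the clean successor.
    x_s = (1 - s) * x_noisy + s * x_next
    # Flow-matching target: the constant velocity along that path.
    v_target = x_next - x_noisy
    v_pred = model(x_s, s.flatten())
    return torch.mean((v_pred - v_target) ** 2)
```

At inference, integrating the learned velocity field from a noised current state would then yield samples of the next state, which is what enables the uncertainty-aware ensemble rollouts the abstract describes.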
Related papers
- Out-of-distribution transfer of PDE foundation models to material dynamics under extreme loading [86.6550968435969]
Most PDE foundation models are pretrained and fine-tuned on fluid-centric benchmarks. We benchmark out-of-distribution transfer on two discontinuity-dominated regimes in which shocks, evolving interfaces, and fracture produce highly non-smooth fields. We evaluate two open-source PDE foundation models, POSEIDON and MORPH, and compare fine-tuning from pretrained weights against training from scratch across training-set sizes to quantify sample efficiency under distribution shift.
arXiv Detail & Related papers (2026-03-04T18:19:35Z) - Latent Generative Solvers for Generalizable Long-Term Physics Simulation [12.894423121609526]
Latent Generative Solvers (LGS) is a framework that maps diverse PDE states into a shared latent physics space with a pretrained VAE. Our key mechanism is an uncertainty knob that perturbs latent inputs during training and inference, teaching the solver to correct off-manifold rollout drift. LGS matches strong deterministic neural-operator baselines on short horizons while substantially reducing rollout drift on long horizons.
arXiv Detail & Related papers (2026-02-11T15:34:52Z) - Physics-Based Machine Learning Closures and Wall Models for Hypersonic Transition-Continuum Boundary Layer Predictions [0.9320657506524149]
We develop a physics-constrained machine learning framework that augments transport models and boundary conditions. We evaluate these for two-dimensional supersonic flat-plate flows across a range of Mach and Knudsen numbers. Our results show that a trace-free anisotropic viscosity model, paired with the skewed-Gaussian distribution function wall model, achieves significantly improved accuracy.
arXiv Detail & Related papers (2025-07-11T19:40:00Z) - Elucidated Rolling Diffusion Models for Probabilistic Weather Forecasting [52.6508222408558]
We introduce Elucidated Rolling Diffusion Models (ERDM). ERDM is the first framework to unify a rolling forecast structure with the principled, performant design of Elucidated Diffusion Models (EDM). On 2D Navier-Stokes simulations and ERA5 global weather forecasting at 1.5° resolution, ERDM consistently outperforms key diffusion-based baselines.
arXiv Detail & Related papers (2025-06-24T21:44:31Z) - Flow Matching Meets PDEs: A Unified Framework for Physics-Constrained Generation [21.321570407292263]
We propose Physics-Based Flow Matching, a generative framework that embeds physical constraints, both PDE residuals and algebraic relations, into the flow matching objective. We show that our approach yields physical residuals up to 8x more accurate than FM, while clearly outperforming existing algorithms in terms of distributional accuracy.
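Embedding a PDE residual into a flow-matching objective, as this summary describes, can be illustrated with a short sketch. This is a hedged toy version, not the paper's method: the loss shape, the `residual_fn` operator, and the weight `lam` are assumptions for illustration.

```python
import torch

def physics_flow_matching_loss(v_pred, v_target, x_gen, residual_fn, lam=0.1):
    """Toy physics-constrained flow-matching loss (illustrative sketch).

    Combines the standard flow-matching velocity regression with a penalty on
    the PDE residual of a generated sample x_gen. residual_fn is an assumed
    user-supplied operator returning the pointwise PDE residual.
    """
    fm_term = torch.mean((v_pred - v_target) ** 2)       # velocity regression
    physics_term = torch.mean(residual_fn(x_gen) ** 2)   # PDE residual penalty
    return fm_term + lam * physics_term
```

The weight `lam` trades distributional accuracy against physical consistency; in practice it would be tuned per PDE family.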
arXiv Detail & Related papers (2025-06-10T09:13:37Z) - Generative Latent Neural PDE Solver using Flow Matching [8.397730500554047]
We propose a latent diffusion model for PDE simulation that embeds the PDE state in a lower-dimensional latent space.<n>Our framework uses an autoencoder to map different types of meshes onto a unified structured latent grid, capturing complex geometries.<n> Numerical experiments show that the proposed model outperforms several deterministic baselines in both accuracy and long-term stability.
arXiv Detail & Related papers (2025-03-28T16:44:28Z) - DPOT: Auto-Regressive Denoising Operator Transformer for Large-Scale PDE Pre-Training [87.90342423839876]
We present a new auto-regressive denoising pre-training strategy, which allows for more stable and efficient pre-training on PDE data.
We train our PDE foundation model with up to 0.5B parameters on 10+ PDE datasets with more than 100k trajectories.
arXiv Detail & Related papers (2024-03-06T08:38:34Z) - Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z) - PDE-Refiner: Achieving Accurate Long Rollouts with Neural PDE Solvers [40.097474800631]
Time-dependent partial differential equations (PDEs) are ubiquitous in science and engineering.
Deep neural network based surrogates have gained increased interest.
arXiv Detail & Related papers (2023-08-10T17:53:05Z) - Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z) - Haar Wavelet based Block Autoregressive Flows for Trajectories [129.37479472754083]
Prediction of trajectories such as that of pedestrians is crucial to the performance of autonomous agents.
We introduce a novel Haar wavelet based block autoregressive model leveraging split couplings.
We illustrate the advantages of our approach for generating diverse and accurate trajectories on two real-world datasets.
arXiv Detail & Related papers (2020-09-21T13:57:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.