PI-MFM: Physics-informed multimodal foundation model for solving partial differential equations
- URL: http://arxiv.org/abs/2512.23056v1
- Date: Sun, 28 Dec 2025 19:43:57 GMT
- Title: PI-MFM: Physics-informed multimodal foundation model for solving partial differential equations
- Authors: Min Zhu, Jingmin Sun, Zecheng Zhang, Hayden Schaeffer, Lu Lu
- Abstract summary: We propose a physics-informed multimodal foundation model (PI-MFM) framework that directly enforces governing equations during pretraining and adaptation. PI-MFM takes symbolic representations of PDEs as input and automatically assembles PDE residual losses from the input expression. On a benchmark of 13 parametric one-dimensional time-dependent PDE families, PI-MFM consistently outperforms purely data-driven counterparts.
- Score: 6.876642270107136
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Partial differential equations (PDEs) govern a wide range of physical systems, and recent multimodal foundation models have shown promise for learning PDE solution operators across diverse equation families. However, existing multi-operator learning approaches are data-hungry and neglect physics during training. Here, we propose a physics-informed multimodal foundation model (PI-MFM) framework that directly enforces governing equations during pretraining and adaptation. PI-MFM takes symbolic representations of PDEs as input and automatically assembles PDE residual losses from the input expression via vectorized derivative computation. These designs enable any PDE-encoding multimodal foundation model to be trained or adapted with unified physics-informed objectives across equation families. On a benchmark of 13 parametric one-dimensional time-dependent PDE families, PI-MFM consistently outperforms purely data-driven counterparts, especially with sparse labeled spatiotemporal points, partially observed time domains, or few labeled function pairs. Physics losses further improve robustness against noise, and simple strategies such as resampling collocation points substantially improve accuracy. We also analyze the accuracy, precision, and computational cost of automatic differentiation and finite differences for derivative computation within PI-MFM. Finally, we demonstrate zero-shot physics-informed fine-tuning on unseen PDE families: starting from a physics-informed pretrained model, adapting using only PDE residuals and initial/boundary conditions, without any labeled solution data, rapidly reduces test errors to around 1% and clearly outperforms physics-only training from scratch. These results show that PI-MFM provides a practical and scalable path toward data-efficient, transferable PDE solvers.
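The core mechanism, assembling a differentiable PDE residual loss directly from a symbolic equation, can be illustrated with a short sketch. The code below is a minimal stand-in, not the authors' implementation: the derivative tokens (`u_t`, `u_x`, `u_xx`), the helper names, and the use of Python's `eval` as a toy expression parser are all illustrative assumptions standing in for the paper's more general vectorized derivative computation.

```python
# Minimal sketch (NOT the PI-MFM code) of building a PDE residual loss from a
# symbolic expression string via automatic differentiation in PyTorch.
import torch

def derivatives(model, x, t):
    """Evaluate u(x, t) and the derivative tokens a residual may reference."""
    x = x.detach().requires_grad_(True)
    t = t.detach().requires_grad_(True)
    u = model(torch.stack([x, t], dim=-1)).squeeze(-1)
    # create_graph=True keeps the graph so the loss stays differentiable
    # with respect to the model parameters.
    (u_x,) = torch.autograd.grad(u.sum(), x, create_graph=True)
    (u_t,) = torch.autograd.grad(u.sum(), t, create_graph=True)
    (u_xx,) = torch.autograd.grad(u_x.sum(), x, create_graph=True)
    return {"u": u, "u_t": u_t, "u_x": u_x, "u_xx": u_xx}

def pde_residual_loss(model, expr, x, t, params):
    """Mean-squared residual of e.g. 'u_t + u*u_x - nu*u_xx' at (x, t)."""
    env = derivatives(model, x, t)
    env.update(params)                                 # e.g. {"nu": 0.01}
    residual = eval(expr, {"__builtins__": {}}, env)   # toy parser stand-in
    return (residual ** 2).mean()

torch.manual_seed(0)
model = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1)
)
# Collocation points; redrawing them every optimization step corresponds to
# the resampling strategy the abstract reports as improving accuracy.
x, t = torch.rand(256), torch.rand(256)
loss = pde_residual_loss(model, "u_t + u*u_x - nu*u_xx", x, t, {"nu": 0.01})
loss.backward()
```

In a physics-only adaptation setting, as in the zero-shot fine-tuning experiments described above, this residual term would be combined with initial/boundary-condition losses and minimized without any labeled solution data.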
Related papers
- Soft Partition-based KAPI-ELM for Multi-Scale PDEs [0.0]
This work introduces a soft partition-based Kernel-Adaptive Physics-Informed Extreme Learning Machine. A signed-distance-based weighting stabilizes least-squares learning on irregular frequencies. Although demonstrated on steady linear PDEs, the results show that soft-partition kernel adaptation provides a fast, architecture-free approach for multiscale PDEs.
arXiv Detail & Related papers (2026-01-13T16:43:38Z)
- Towards a Foundation Model for Partial Differential Equations Across Physics Domains [1.7115425267046014]
We present PDE-FM, a modular foundation model for physics-informed machine learning. It unifies spatial, spectral, and temporal reasoning across heterogeneous partial differential equation (PDE) systems. PDE-FM is pretrained once on diverse PDE datasets and can be transferred to new physical regimes without architectural or data-specific modifications.
arXiv Detail & Related papers (2025-11-26T19:36:15Z)
- SPUS: A Lightweight and Parameter-Efficient Foundation Model for PDEs [40.11476265839176]
We introduce SPUS, a compact and efficient foundation model (FM) designed as a unified neural operator for solving a wide range of partial differential equations (PDEs). SPUS is pretrained on a diverse set of fluid dynamics PDEs and evaluated across 6 challenging unseen downstream PDEs spanning various physical systems. Experimental results demonstrate that SPUS, using a residual U-Net-based architecture, achieves state-of-the-art generalization on these downstream tasks.
arXiv Detail & Related papers (2025-10-01T18:54:59Z)
- Flow Matching Meets PDEs: A Unified Framework for Physics-Constrained Generation [21.321570407292263]
We propose Physics-Based Flow Matching, a generative framework that embeds physical constraints, both PDE residuals and algebraic relations, into the flow matching objective. We show that our approach yields physical residuals up to $8\times$ more accurate than FM, while clearly outperforming existing algorithms in terms of distributional accuracy.
arXiv Detail & Related papers (2025-06-10T09:13:37Z)
- Enabling Automatic Differentiation with Mollified Graph Neural Operators [73.52999622724101]
We propose the mollified graph neural operator ($m$GNO), the first method to leverage automatic differentiation and compute exact gradients on arbitrary geometries. For a PDE example on regular grids, $m$GNO paired with autograd reduced the L2 relative data error by 20x compared to finite differences. It can also solve PDEs on unstructured point clouds seamlessly, using physics losses only, at resolutions vastly lower than those needed for finite differences to be accurate enough.
arXiv Detail & Related papers (2025-04-11T06:16:30Z)
- Paving the way for scientific foundation models: enhancing generalization and robustness in PDEs with constraint-aware pre-training [49.8035317670223]
A scientific foundation model (SciFM) is emerging as a promising tool for learning transferable representations across diverse domains. We propose incorporating PDE residuals into pre-training, either as the sole learning signal or in combination with data loss, to compensate for limited or infeasible training data. Our results show that pre-training with PDE constraints significantly enhances generalization, outperforming models trained solely on solution data.
arXiv Detail & Related papers (2025-03-24T19:12:39Z)
- MultiPDENet: PDE-embedded Learning with Multi-time-stepping for Accelerated Flow Simulation [48.41289705783405]
We propose a PDE-embedded network with multiscale time stepping (MultiPDENet). In particular, we design a convolutional filter based on the structure of finite differences, with a small number of parameters to optimize. A Physics Block with a 4th-order Runge-Kutta integrator at the fine time scale embeds the structure of PDEs to guide the prediction.
arXiv Detail & Related papers (2025-01-27T12:15:51Z)
- Unisolver: PDE-Conditional Transformers Towards Universal Neural PDE Solvers [53.79279286773326]
We present Unisolver, a novel Transformer model trained on diverse data and conditioned on diverse PDEs. Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks, showing impressive performance and generalizability.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- A Physics-driven GraphSAGE Method for Physical Process Simulations Described by Partial Differential Equations [2.1217718037013635]
A physics-driven GraphSAGE approach is presented to solve problems governed by PDEs on irregular domains.
A distance-related edge feature and a feature mapping strategy are devised to aid training and convergence.
A robust PDE surrogate model is established for heat conduction problems parameterized by a Gaussian-singularity random-field source.
arXiv Detail & Related papers (2024-03-13T14:25:15Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Physics-constrained Unsupervised Learning of Partial Differential Equations using Meshes [1.066048003460524]
Graph neural networks show promise in accurately representing irregularly meshed objects and learning their dynamics.
In this work, we represent meshes naturally as graphs, process these using Graph Networks, and formulate our physics-based loss to provide an unsupervised learning framework for partial differential equations (PDEs).
Our framework will enable the application of PDE solvers in interactive settings, such as model-based control of soft-body deformations.
arXiv Detail & Related papers (2022-03-30T19:22:56Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
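The PI-MFM abstract also weighs automatic differentiation against finite differences for computing residual derivatives, a trade-off the $m$GNO entry above probes as well. The toy comparison below is a hedged illustration written for this summary, not code from either paper: it contrasts the two approaches on a known 1D function, where AD recovers the derivative to floating-point precision while central differences carry an O(h^2) truncation error.

```python
# Illustrative comparison (not from any paper above) of automatic
# differentiation vs. central finite differences for computing du/dx.
import torch

x = torch.linspace(0.0, 1.0, 101, requires_grad=True)
u = torch.sin(2 * torch.pi * x)

# Automatic differentiation: exact derivative of the computational graph.
(u_x_ad,) = torch.autograd.grad(u.sum(), x)

# Central finite differences on interior points: O(h^2) truncation error.
h = x[1] - x[0]
u_x_fd = (u[2:] - u[:-2]) / (2 * h)

exact = 2 * torch.pi * torch.cos(2 * torch.pi * x)
print("AD max error:", (u_x_ad - exact).abs().max().item())        # ~1e-6
print("FD max error:", (u_x_fd - exact[1:-1]).abs().max().item())  # ~4e-3
```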