Towards Generalizable PDE Dynamics Forecasting via Physics-Guided Invariant Learning
- URL: http://arxiv.org/abs/2509.24332v1
- Date: Mon, 29 Sep 2025 06:30:01 GMT
- Title: Towards Generalizable PDE Dynamics Forecasting via Physics-Guided Invariant Learning
- Authors: Siyang Li, Yize Chen, Yan Guo, Ming Huang, Hui Xiong
- Abstract summary: We propose a physics-guided invariant learning method termed iMOOE. iMOOE features an Invariance-aligned Mixture Of Operator Expert architecture and a frequency-enriched invariant learning objective. Experiments validate iMOOE's superior in-distribution performance and zero-shot generalization capabilities on diverse OOD forecasting scenarios.
- Score: 16.66374733932547
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Advanced deep learning-based approaches have been actively applied to forecast the spatiotemporal physical dynamics governed by partial differential equations (PDEs), a critical step in tackling many science and engineering problems. Since real-world physical conditions such as PDE system parameters vary unpredictably, generalizing to unseen out-of-distribution (OOD) forecasting scenarios from limited training data is of great importance. To overcome this barrier, existing methods focus on discovering domain-generalizable representations across various PDE dynamics trajectories. However, their zero-shot OOD generalization capability remains deficient, since extra test-time samples are still required for domain-specific adaptation. This is because the fundamental physical invariances of PDE dynamical systems have yet to be investigated or integrated. To this end, we first explicitly define a two-fold PDE invariance principle, which states that ingredient operators and their composition relationships remain invariant across different domains and throughout PDE system evolution. Next, to capture this two-fold PDE invariance, we propose a physics-guided invariant learning method termed iMOOE, featuring an Invariance-aligned Mixture Of Operator Expert architecture and a frequency-enriched invariant learning objective. Extensive experiments across simulated benchmarks and real-world applications validate iMOOE's superior in-distribution performance and zero-shot generalization capabilities on diverse OOD forecasting scenarios.
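The abstract describes iMOOE only at a high level. The gist of a mixture-of-operator-experts step can be sketched as follows (a minimal illustration with linear experts and a softmax gate; the names, shapes, and gating details here are assumptions for illustration, not iMOOE's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_step(field, experts, gate_w):
    """One forecasting step of a mixture-of-operator-experts layer
    (illustrative): each expert is a shared linear 'ingredient operator',
    and a softmax gate mixes their outputs per sample. In the paper's
    framing, the experts would be reused across domains while the
    composition (gating) adapts."""
    logits = field @ gate_w                                # (batch, n_experts)
    gate = np.exp(logits - logits.max(axis=-1, keepdims=True))
    gate /= gate.sum(axis=-1, keepdims=True)               # softmax over experts
    outputs = np.stack([field @ w for w in experts], -1)   # (batch, d, n_experts)
    return (outputs * gate[:, None, :]).sum(axis=-1)       # (batch, d)

d, n_experts, batch = 8, 3, 4
experts = [rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(n_experts)]
gate_w = rng.normal(size=(d, n_experts))
u = rng.normal(size=(batch, d))
u_next = moe_step(u, experts, gate_w)
print(u_next.shape)  # (4, 8)
```

Because the gate is a convex combination, each output field stays in the span of the shared operators' outputs, which is one way to encode the "invariant operators, varying composition" principle.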
Related papers
- Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge [8.269904705399474]
Recent advances in machine learning have enabled neural operators to serve as powerful surrogates for modeling the evolution of physical systems. We propose a multiphysics training framework that jointly learns from both the original PDEs and their simplified basic forms. Our framework enhances data efficiency, reduces predictive errors, and improves out-of-distribution (OOD) generalization.
arXiv Detail & Related papers (2026-02-16T20:45:10Z) - Out-of-distribution generalization of deep-learning surrogates for 2D PDE-generated dynamics in the small-data regime [1.9116784879310027]
We study autoregressive deep-learning surrogates for two-dimensional PDE dynamics on periodic domains. In small-data periodic 2D PDE settings, convolutional architectures with inductive biases aligned to locality remain strong contenders for accurate and moderately out-of-distribution-robust surrogate modeling.
arXiv Detail & Related papers (2026-01-13T10:20:59Z) - Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations [65.80144621950981]
We build on Wiener chaos expansions (WCE) to design neural operator (NO) architectures for SPDEs and SDEs. We show that WCE-based neural operators provide a practical and scalable way to learn SDE/SPDE solution operators.
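The first Wiener chaos is linear in the driving Gaussians, so the core representational idea can be illustrated with the Karhunen-Loeve expansion of Brownian motion on [0, 1]: the random path is encoded by deterministic basis functions times i.i.d. standard Gaussians, and a WCE-based operator would act on those coefficients rather than on raw noise paths. This snippet is a hedged sketch of that encoding, not the paper's architecture:

```python
import numpy as np

def bm_freqs(K):
    """Frequencies (k - 1/2) * pi of the Karhunen-Loeve basis of Brownian
    motion on [0, 1]; the k-th basis function is
    sqrt(2) * sin((k - 1/2) * pi * t) / ((k - 1/2) * pi)."""
    k = np.arange(1, K + 1)
    return (k - 0.5) * np.pi

def bm_sample(t, xi):
    """Truncated first-chaos representation: B(t) ~ sum_k xi_k * phi_k(t)
    with xi_k i.i.d. standard Gaussians."""
    freqs = bm_freqs(len(xi))
    basis = np.sqrt(2) * np.sin(np.outer(t, freqs)) / freqs
    return basis @ xi

# With K terms, the truncated variance at t = 0.5 should approach
# Var[B(0.5)] = 0.5 (analytic check, no sampling noise).
K = 500
freqs = bm_freqs(K)
var = np.sum(2 * np.sin(freqs * 0.5) ** 2 / freqs ** 2)
print(abs(var - 0.5) < 1e-3)  # True
```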
arXiv Detail & Related papers (2026-01-03T00:59:25Z) - Governing Equation Discovery from Data Based on Differential Invariants [52.2614860099811]
We propose a pipeline for governing equation discovery based on differential invariants. Specifically, we compute the set of differential invariants corresponding to the infinitesimal generators of the symmetry group. Taking DI-SINDy as an example, we demonstrate that its success rate and accuracy in PDE discovery surpass those of other symmetry-informed governing equation discovery methods.
arXiv Detail & Related papers (2025-05-24T17:19:02Z) - Paving the way for scientific foundation models: enhancing generalization and robustness in PDEs with constraint-aware pre-training [49.8035317670223]
A scientific foundation model (SciFM) is emerging as a promising tool for learning transferable representations across diverse domains. We propose incorporating PDE residuals into pre-training, either as the sole learning signal or in combination with data loss, to compensate for limited or infeasible training data. Our results show that pre-training with PDE constraints significantly enhances generalization, outperforming models trained solely on solution data.
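The constraint-aware objective described above combines a data-fit term with a PDE residual term. A minimal sketch for the 1D heat equation with an explicit-Euler finite-difference residual (the equation, the discretization, and the weight `lam` are illustrative assumptions, not the paper's exact setup):

```python
import numpy as np

def heat_residual(u, dx, dt, nu=0.1):
    """Finite-difference residual of the 1D heat equation u_t = nu * u_xx,
    evaluated on a (time, space) grid of predicted solution values."""
    u_t = (u[1:, 1:-1] - u[:-1, 1:-1]) / dt
    u_xx = (u[:-1, 2:] - 2 * u[:-1, 1:-1] + u[:-1, :-2]) / dx ** 2
    return u_t - nu * u_xx

def pretrain_loss(u_pred, u_data, dx, dt, lam=1.0):
    """Combined objective: data mismatch plus PDE-residual penalty,
    in the spirit of constraint-aware pre-training (lam is a free choice)."""
    data_loss = np.mean((u_pred - u_data) ** 2)
    res = heat_residual(u_pred, dx, dt)
    return data_loss + lam * np.mean(res ** 2)

# Sanity check: the exact decaying-sine solution u = exp(-nu*t) * sin(x)
# satisfies the heat equation, so both loss terms are near zero.
nx, nt, nu, dt = 64, 32, 0.1, 1e-4
x = np.linspace(0, np.pi, nx)
t = np.arange(nt) * dt
u_exact = np.exp(-nu * t)[:, None] * np.sin(x)[None, :]
loss = pretrain_loss(u_exact, u_exact, x[1] - x[0], dt)
print(loss < 1e-4)  # True
```

Setting `lam` to a large value recovers residual-only pre-training, the "sole learning signal" variant mentioned in the summary.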
arXiv Detail & Related papers (2025-03-24T19:12:39Z) - Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z) - Physics-informed Discretization-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to various discrete representations of PDE parameters and irregular domain shapes.
Inspired by deep operator neural networks, our model involves a discretization-independent learning of parameter embedding repeatedly.
Numerical results demonstrate the accuracy and efficiency of the proposed method.
arXiv Detail & Related papers (2024-04-21T12:41:30Z) - Deciphering and integrating invariants for neural operator learning with various physical mechanisms [22.508244510177683]
We propose Physical Invariant Attention Neural Operator (PIANO) to decipher and integrate the physical invariants (PI) for operator learning from the PDE series with various physical mechanisms.
Compared to existing techniques, PIANO can reduce the relative error by 13.6%-82.2% on PDE forecasting tasks across varying coefficients, forces, or boundary conditions.
arXiv Detail & Related papers (2023-11-24T09:03:52Z) - PDE+: Enhancing Generalization via PDE with Adaptive Distributional Diffusion [66.95761172711073]
Generalization of neural networks is a central challenge in machine learning.
We propose to enhance it directly through the underlying function of neural networks, rather than focusing on adjusting input data.
We put this theoretical framework into practice as $\textbf{PDE+}$ ($\textbf{PDE}$ with $\textbf{A}$daptive $\textbf{D}$istributional $\textbf{D}$iffusion).
arXiv Detail & Related papers (2023-05-25T08:23:26Z) - Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z) - Long-time integration of parametric evolution equations with physics-informed DeepONets [0.0]
We introduce an effective framework for learning infinite-dimensional operators that map random initial conditions to associated PDE solutions within a short time interval.
Global long-time predictions across a range of initial conditions can be then obtained by iteratively evaluating the trained model.
This introduces a new approach to temporal domain decomposition that is shown to be effective in performing accurate long-time simulations.
arXiv Detail & Related papers (2021-06-09T20:46:17Z)
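The iterative-evaluation scheme in the last entry is straightforward to sketch: a short-horizon solution operator is applied repeatedly, feeding each prediction back as the next initial condition. Here the one-step map is a toy decay operator standing in for a trained physics-informed DeepONet:

```python
import numpy as np

def rollout(step_op, u0, n_steps):
    """Long-time prediction via temporal domain decomposition
    (illustrative): each call to step_op advances the state by one
    short interval, and the outputs are chained autoregressively."""
    traj = [u0]
    u = u0
    for _ in range(n_steps):
        u = step_op(u)
        traj.append(u)
    return np.stack(traj)

# Toy stand-in for a trained short-horizon operator: exact decay map.
decay = 0.95
step = lambda u: decay * u

u0 = np.ones(16)
traj = rollout(step, u0, n_steps=50)
print(traj.shape)                               # (51, 16)
print(np.allclose(traj[-1], decay ** 50 * u0))  # True
```

With a learned operator in place of `step`, one-step errors compound over the rollout, which is why the accuracy of the short-interval model governs long-time fidelity.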
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.