PMNO: A novel physics guided multi-step neural operator predictor for partial differential equations
- URL: http://arxiv.org/abs/2506.01598v1
- Date: Mon, 02 Jun 2025 12:33:50 GMT
- Title: PMNO: A novel physics guided multi-step neural operator predictor for partial differential equations
- Authors: Jin Song, Kenji Kawaguchi, Zhenya Yan
- Abstract summary: We propose a novel physics guided multi-step neural operator (PMNO) architecture to address challenges in long-horizon prediction of complex physical systems. The PMNO framework replaces the single-step input with multi-step historical data in the forward pass and introduces an implicit time-stepping scheme during backpropagation. We demonstrate the superior predictive performance of the PMNO predictor across a diverse range of physical systems.
- Score: 23.04840527974364
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural operators, which aim to approximate mappings between infinite-dimensional function spaces, have been widely applied in the simulation and prediction of physical systems. However, the limited representational capacity of network architectures, combined with their heavy reliance on large-scale data, often hinders effective training and results in poor extrapolation performance. In this paper, inspired by traditional numerical methods, we propose a novel physics guided multi-step neural operator (PMNO) architecture to address these challenges in long-horizon prediction of complex physical systems. Distinct from general operator learning methods, the PMNO framework replaces the single-step input with multi-step historical data in the forward pass and introduces an implicit time-stepping scheme based on the Backward Differentiation Formula (BDF) during backpropagation. This design not only strengthens the model's extrapolation capacity but also facilitates more efficient and stable training with fewer data samples, especially for long-term predictions. Meanwhile, a causal training strategy is employed to circumvent the need for multi-stage training and to ensure efficient end-to-end optimization. The neural operator architecture possesses resolution-invariant properties, enabling the trained model to perform fast extrapolation at arbitrary spatial resolutions. We demonstrate the superior predictive performance of the PMNO predictor across a diverse range of physical systems, including a 2D linear system, modeling over an irregular domain, complex-valued wave dynamics, and reaction-diffusion processes. Depending on the specific problem setting, various neural operator architectures, including FNO, DeepONet, and their variants, can be seamlessly integrated into the PMNO framework.
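To make the time-stepping idea concrete, a k-step backward differentiation formula (BDF-k) advances the solution via $\sum_{j=0}^{k} \alpha_j u^{n+j} = \beta\,\Delta t\,\mathcal{N}(u^{n+k})$, where $\mathcal{N}$ denotes the spatial part of the PDE. The sketch below shows how such a residual could serve as a physics-guided training signal. It is a minimal illustration, assuming a PyTorch-style model and the BDF2 coefficients; the names `bdf2_residual`, `pde_rhs`, and `training_step` are hypothetical and not the authors' code.

```python
# Minimal sketch (not the authors' implementation): a neural operator maps
# k historical snapshots to the next state, and a BDF residual couples the
# prediction to the PDE right-hand side during backpropagation.
import torch

# BDF2:  u^{n+2} - (4/3) u^{n+1} + (1/3) u^n = (2/3) dt * N(u^{n+2})
ALPHA = torch.tensor([1.0 / 3.0, -4.0 / 3.0, 1.0])  # weights for u^n, u^{n+1}, u^{n+2}
BETA = 2.0 / 3.0

def bdf2_residual(history, pred, dt, pde_rhs):
    """BDF2 residual of the predicted state.

    history: (batch, 2, *grid) tensor holding u^n and u^{n+1}
    pred:    (batch, *grid) tensor holding the predicted u^{n+2}
    pde_rhs: callable evaluating the spatial operator N(u) on the grid
    """
    states = (history[:, 0], history[:, 1], pred)
    lhs = sum(a * u for a, u in zip(ALPHA, states))
    return lhs - BETA * dt * pde_rhs(pred)

def training_step(model, history, dt, pde_rhs):
    pred = model(history)                      # multi-step input, single-step output
    res = bdf2_residual(history, pred, dt, pde_rhs)
    return (res ** 2).mean()                   # physics-guided loss term
```

Because the residual is implicit in the prediction $u^{n+2}$, minimizing it plays the role of the implicit BDF solve during backpropagation, which is consistent with the abstract's claim of more stable training for long-horizon prediction.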
Related papers
- OmniFluids: Unified Physics Pre-trained Modeling of Fluid Dynamics [25.066485418709114]
We introduce OmniFluids, a unified physics pre-trained operator learning framework. It integrates physics-only pre-training, coarse-grid operator distillation, and few-shot fine-tuning. It significantly outperforms state-of-the-art AI-driven methods in flow field reconstruction and turbulence statistics accuracy.
arXiv Detail & Related papers (2025-06-12T16:23:02Z) - Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems [49.819436680336786]
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
arXiv Detail & Related papers (2025-03-24T03:19:45Z) - High-fidelity Multiphysics Modelling for Rapid Predictions Using Physics-informed Parallel Neural Operator [17.85837423448985]
Modelling complex multiphysics systems governed by nonlinear and strongly coupled partial differential equations (PDEs) is a cornerstone in computational science and engineering. We propose a novel paradigm, the physics-informed parallel neural operator (PIPNO), a scalable and unsupervised learning framework. PIPNO efficiently captures nonlinear operator mappings across diverse physics, including geotechnical engineering, material science, electromagnetism, quantum mechanics, and fluid dynamics.
arXiv Detail & Related papers (2025-02-26T20:29:41Z) - Physics-Informed Latent Neural Operator for Real-time Predictions of Complex Physical Systems [0.0]
We propose PI-Latent-NO, a physics-informed latent neural operator framework that integrates governing physics directly into the learning process. Our architecture features two coupled DeepONets trained end-to-end: a Latent-DeepONet that learns a low-dimensional representation of the solution, and a Reconstruction-DeepONet that maps this latent representation back to the physical space.
arXiv Detail & Related papers (2025-01-14T20:38:30Z) - Koopman Theory-Inspired Method for Learning Time Advancement Operators in Unstable Flame Front Evolution [0.2812395851874055]
This study introduces Koopman-inspired Fourier Neural Operators (kFNO) and Convolutional Neural Networks (kCNN) to learn solution advancement operators for flame front instabilities. By transforming data into a high-dimensional latent space, these models achieve more accurate multi-step predictions compared to traditional methods.
arXiv Detail & Related papers (2024-12-11T14:47:19Z) - PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, the Physics-encoded Message Passing Graph Network (PhyMPGN). We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system. PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators provide a principled framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger. It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers. We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles. Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates feature vectors built from the theory of regularity structures for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2D stochastic Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)