Deep Neural ODE Operator Networks for PDEs
- URL: http://arxiv.org/abs/2510.15651v1
- Date: Fri, 17 Oct 2025 13:42:57 GMT
- Title: Deep Neural ODE Operator Networks for PDEs
- Authors: Ziqian Li, Kang Liu, Yongcun Song, Hangrui Yue, Enrique Zuazua
- Abstract summary: This paper introduces a deep neural ordinary differential equation (ODE) operator network framework, termed NODE-ONet. The framework adopts an encoder-decoder architecture comprising three core components: an encoder that spatially discretizes input functions, a neural ODE capturing latent temporal dynamics, and a decoder reconstructing solutions in physical spaces. Numerical experiments on nonlinear diffusion-reaction and Navier-Stokes equations demonstrate high accuracy, computational efficiency, and prediction capabilities beyond training time frames.
- Score: 6.699464491012708
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Operator learning has emerged as a promising paradigm for developing efficient surrogate models to solve partial differential equations (PDEs). However, existing approaches often overlook the domain knowledge inherent in the underlying PDEs and hence suffer from challenges in capturing temporal dynamics and generalization issues beyond training time frames. This paper introduces a deep neural ordinary differential equation (ODE) operator network framework, termed NODE-ONet, to alleviate these limitations. The framework adopts an encoder-decoder architecture comprising three core components: an encoder that spatially discretizes input functions, a neural ODE capturing latent temporal dynamics, and a decoder reconstructing solutions in physical spaces. Theoretically, error analysis for the encoder-decoder architecture is investigated. Computationally, we propose novel physics-encoded neural ODEs to incorporate PDE-specific physical properties. Such well-designed neural ODEs significantly reduce the framework's complexity while enhancing numerical efficiency, robustness, applicability, and generalization capacity. Numerical experiments on nonlinear diffusion-reaction and Navier-Stokes equations demonstrate high accuracy, computational efficiency, and prediction capabilities beyond training time frames. Additionally, the framework's flexibility to accommodate diverse encoders/decoders and its ability to generalize across related PDE families further underscore its potential as a scalable, physics-encoded tool for scientific machine learning.
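To make the three-component architecture concrete, here is a minimal sketch of the encoder-ODE-decoder pattern the abstract describes. The layer sizes, the fixed-step RK4 integrator, and all names are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class NODEONetSketch(nn.Module):
    """Minimal encoder -> neural-ODE -> decoder sketch (not the authors' code).

    The input function is assumed to be sampled at n_sensors spatial points;
    latent dynamics are integrated with a fixed-step RK4 scheme.
    """

    def __init__(self, n_sensors: int = 128, latent_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_sensors, 64), nn.Tanh(),
                                     nn.Linear(64, latent_dim))
        # Right-hand side f(z) of the latent ODE dz/dt = f(z).
        self.rhs = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                 nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                     nn.Linear(64, n_sensors))

    def rk4_step(self, z: torch.Tensor, dt: float) -> torch.Tensor:
        k1 = self.rhs(z)
        k2 = self.rhs(z + 0.5 * dt * k1)
        k3 = self.rhs(z + 0.5 * dt * k2)
        k4 = self.rhs(z + dt * k3)
        return z + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    def forward(self, u0: torch.Tensor, t_grid: torch.Tensor) -> torch.Tensor:
        """u0: (batch, n_sensors) initial condition; t_grid: (n_steps,) times."""
        z = self.encoder(u0)
        outputs = [self.decoder(z)]
        for i in range(1, len(t_grid)):
            z = self.rk4_step(z, float(t_grid[i] - t_grid[i - 1]))
            outputs.append(self.decoder(z))
        return torch.stack(outputs, dim=1)  # (batch, n_steps, n_sensors)

model = NODEONetSketch()
u0 = torch.randn(4, 128)
t = torch.linspace(0.0, 1.0, 11)
print(model(u0, t).shape)  # torch.Size([4, 11, 128])
```

Because time enters only through the ODE integrator, a model of this shape can be rolled out on time grids longer than those seen in training, which is the extrapolation property the abstract emphasizes.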
Related papers
- Learning Data-Efficient and Generalizable Neural Operators via Fundamental Physics Knowledge [8.269904705399474]
Recent advances in machine learning have enabled neural operators to serve as powerful surrogates for modeling the evolution of physical systems. We propose a multiphysics training framework that jointly learns from both the original PDEs and their simplified basic forms. Our framework enhances data efficiency, reduces predictive errors, and improves out-of-distribution (OOD) generalization.
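One plausible reading of the joint-training idea is a weighted sum of data losses on the original PDE and on its simplified basic form, fitted by one shared operator; the weighting scheme and all names below are assumptions, not the paper's exact formulation.

```python
import torch

def multiphysics_loss(model, batch_full, batch_basic, lam: float = 0.5):
    """Hypothetical joint objective over the original PDE and a simplified
    basic form, sharing one operator (an assumption for illustration)."""
    x_full, y_full = batch_full      # samples from the original PDE
    x_basic, y_basic = batch_basic   # samples from its simplified basic form
    loss_full = torch.mean((model(x_full) - y_full) ** 2)
    loss_basic = torch.mean((model(x_basic) - y_basic) ** 2)
    return loss_full + lam * loss_basic

model = torch.nn.Linear(8, 8)  # stand-in for a neural operator
b_full = (torch.randn(16, 8), torch.randn(16, 8))
b_basic = (torch.randn(16, 8), torch.randn(16, 8))
print(multiphysics_loss(model, b_full, b_basic))
```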
arXiv Detail & Related papers (2026-02-16T20:45:10Z)
- DGenNO: A Novel Physics-aware Neural Operator for Solving Forward and Inverse PDE Problems based on Deep, Generative Probabilistic Modeling [1.8416014644193066]
Deep Generative Neural Operator (DGenNO) is a physics-aware framework for solving PDE-based inverse problems. DGenNO enforces physics constraints without labeled data by incorporating weak-form residuals, built on compactly supported radial basis functions, as virtual observables. We show that DGenNO achieves higher accuracy across multiple benchmarks while exhibiting robustness to noise and strong generalization to out-of-distribution cases.
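As a toy illustration of the virtual-observable idea, the sketch below assembles a weak-form residual of a 1D Poisson problem against a single compactly supported (Wendland) test function; the specific equation, quadrature rule, and names are assumptions, not DGenNO's implementation.

```python
import torch

def wendland_rbf(x: torch.Tensor, center: float, radius: float) -> torch.Tensor:
    """Compactly supported Wendland C2 bump: (1 - r)_+^4 (4r + 1), r = |x - c| / radius."""
    r = (x - center).abs() / radius
    return torch.clamp(1 - r, min=0.0) ** 4 * (4 * r + 1)

def weak_residual_1d(u, f, x, center: float, radius: float) -> torch.Tensor:
    """Trapezoidal estimate of the weak residual of -u'' = f against one test
    function: R(phi) = int u' phi' dx - int f phi dx. A 1D toy version of the
    'virtual observable' idea."""
    phi = wendland_rbf(x, center, radius)
    du = torch.gradient(u, spacing=(x,))[0]
    dphi = torch.gradient(phi, spacing=(x,))[0]
    return torch.trapezoid(du * dphi - f * phi, x)

x = torch.linspace(0, 1, 201)
u = torch.sin(torch.pi * x)               # exact solution of -u'' = pi^2 sin(pi x)
f = torch.pi ** 2 * torch.sin(torch.pi * x)
print(weak_residual_1d(u, f, x, center=0.5, radius=0.3))  # close to 0
```

Driving such residuals toward zero for many test-function centers enforces the PDE without labeled solution data.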
arXiv Detail & Related papers (2025-02-10T08:28:46Z)
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z)
- DimINO: Dimension-Informed Neural Operator Learning [41.37905663176428]
DimINO is a framework inspired by dimensional analysis. It can be seamlessly integrated into existing neural operator architectures. It achieves up to a 76.3% performance gain on PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Text2PDE: Latent Diffusion Models for Accessible Physics Simulation [7.16525545814044]
We introduce several methods to apply latent diffusion models to physics simulation. We show that the proposed approach is competitive with current neural PDE solvers in both accuracy and efficiency. By introducing a scalable, accurate, and usable physics simulator, we hope to bring neural PDE solvers closer to practical use.
arXiv Detail & Related papers (2024-10-02T01:09:47Z)
- Liquid Fourier Latent Dynamics Networks for fast GPU-based numerical simulations in computational cardiology [0.0]
We propose an extension of Latent Dynamics Networks (LDNets) to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries.
LFLDNets employ a neurologically-inspired, sparse liquid neural network for temporal dynamics, relaxing the requirement of a numerical solver for time advancement and leading to superior performance in terms of parameters, accuracy, efficiency and learned trajectories.
arXiv Detail & Related papers (2024-08-19T09:14:25Z)
- DeltaPhi: Physical States Residual Learning for Neural Operators in Data-Limited PDE Solving [54.605760146540234]
DeltaPhi is a novel learning framework that transforms the PDE solving task from learning direct input-output mappings to learning the residuals between similar physical states. Extensive experiments demonstrate consistent and significant improvements across diverse physical systems.
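A hedged sketch of the residual idea: rather than mapping an input function directly to its solution, the operator predicts a correction to a retrieved similar state. The wrapper and its inputs below are assumptions, not DeltaPhi's exact design.

```python
import torch
import torch.nn as nn

class ResidualOperatorSketch(nn.Module):
    """Hypothetical residual wrapper: predict the query solution as a similar
    known solution plus a learned correction, instead of a direct mapping."""

    def __init__(self, backbone: nn.Module):
        super().__init__()
        self.backbone = backbone  # any network taking the concatenated inputs

    def forward(self, a_query, a_similar, u_similar):
        # Learn only the residual between the two similar physical states.
        features = torch.cat([a_query, a_similar, u_similar], dim=-1)
        return u_similar + self.backbone(features)

backbone = nn.Sequential(nn.Linear(3 * 64, 128), nn.GELU(), nn.Linear(128, 64))
op = ResidualOperatorSketch(backbone)
a_q, a_s, u_s = torch.randn(8, 64), torch.randn(8, 64), torch.randn(8, 64)
print(op(a_q, a_s, u_s).shape)  # torch.Size([8, 64])
```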
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
- Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
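Codomain tokenization can be sketched by treating each physical variable (channel) as one token and attending across channels, so the same weights apply regardless of how many variables a PDE system has. The shapes and layer choices below are assumptions, not CoDA-NO's architecture.

```python
import torch
import torch.nn as nn

class CodomainAttentionSketch(nn.Module):
    """Toy codomain attention: each channel of a multiphysics field becomes one
    token, so adding or removing variables changes the token count, not the
    weights (an illustrative assumption)."""

    def __init__(self, n_points: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(n_points, d_model)     # one token per channel
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.proj = nn.Linear(d_model, n_points)

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        """u: (batch, n_channels, n_points), e.g. velocity components + pressure."""
        tokens = self.embed(u)                        # (batch, n_channels, d_model)
        mixed, _ = self.attn(tokens, tokens, tokens)  # attention across channels
        return self.proj(mixed)                       # back to physical values

layer = CodomainAttentionSketch(n_points=256)
u = torch.randn(2, 3, 256)  # e.g. (u_x, u_y, pressure) on 256 grid points
print(layer(u).shape)       # torch.Size([2, 3, 256])
```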
arXiv Detail & Related papers (2024-03-19T08:56:20Z)
- Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators provide a principled AI framework for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
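The spatial part of the decomposition can be illustrated by splitting a fine 2D field into interleaved coarse-resolution subfields that a solver can process independently; the 2x2 stagger and the function names are assumptions (the paper also staggers in time).

```python
import torch

def stagger_2d(u: torch.Tensor, s: int = 2):
    """Split a fine field u of shape (batch, H, W) into s*s interleaved
    coarse-resolution subfields, each of shape (batch, H//s, W//s)."""
    return [u[:, i::s, j::s] for i in range(s) for j in range(s)]

def unstagger_2d(parts, s: int = 2):
    """Reassemble the fine field from the interleaved coarse subfields."""
    b, h, w = parts[0].shape
    u = torch.empty(b, h * s, w * s, dtype=parts[0].dtype)
    for k, (i, j) in enumerate((i, j) for i in range(s) for j in range(s)):
        u[:, i::s, j::s] = parts[k]
    return u

u = torch.randn(1, 64, 64)
assert torch.equal(unstagger_2d(stagger_2d(u)), u)  # lossless round trip
```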
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates feature vectors derived from regularity structures to model dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Solving inverse-PDE problems with physics-aware neural networks [0.0]
We propose a novel framework to find unknown fields in the context of inverse problems for partial differential equations.
We blend the high expressibility of deep neural networks as universal function estimators with the accuracy and reliability of existing numerical algorithms.
arXiv Detail & Related papers (2020-01-10T18:46:50Z)