Learning Physical Operators using Neural Operators
- URL: http://arxiv.org/abs/2602.23113v1
- Date: Thu, 26 Feb 2026 15:27:14 GMT
- Title: Learning Physical Operators using Neural Operators
- Authors: Vignesh Gopakumar, Ander Gray, Dan Giles, Lorenzo Zanisi, Matt J. Kusner, Timo Betcke, Stanislas Pamela, Marc Peter Deisenroth
- Abstract summary: We train neural operators to learn individual non-linear physical operators while approximating linear operators with fixed finite-difference convolutions. We formulate the modelling task as a neural ordinary differential equation (ODE) where these learned operators constitute the right-hand side. Our approach achieves better convergence and superior performance when generalising to unseen physics.
- Score: 10.57578521926415
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Neural operators have emerged as promising surrogate models for solving partial differential equations (PDEs), but struggle to generalise beyond training distributions and are often constrained to a fixed temporal discretisation. This work introduces a physics-informed training framework that addresses these limitations by decomposing PDEs using operator splitting methods, training separate neural operators to learn individual non-linear physical operators while approximating linear operators with fixed finite-difference convolutions. This modular mixture-of-experts architecture enables generalisation to novel physical regimes by explicitly encoding the underlying operator structure. We formulate the modelling task as a neural ordinary differential equation (ODE) where these learned operators constitute the right-hand side, enabling continuous-in-time predictions through standard ODE solvers and implicitly enforcing PDE constraints. Demonstrated on incompressible and compressible Navier-Stokes equations, our approach achieves better convergence and superior performance when generalising to unseen physics. The method remains parameter-efficient, enabling temporal extrapolation beyond training horizons, and provides interpretable components whose behaviour can be verified against known physics.
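The splitting idea in the abstract can be sketched numerically. For 1D viscous Burgers, u_t = nu*u_xx - u*u_x, the linear diffusion term is a fixed finite-difference convolution, while the non-linear advection term is the piece a neural operator would be trained to represent; summing the two gives the right-hand side of a neural ODE integrated by any standard solver. The sketch below is a minimal stand-in, not the paper's implementation: the "learned" operator is replaced by its analytic stencil, and grid size, viscosity, and step count are illustrative.

```python
import numpy as np

def linear_op(u, dx, nu=0.01):
    # fixed finite-difference convolution for the linear operator nu * u_xx
    # (second-order central stencil, periodic boundary)
    return nu * (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2

def nonlinear_op(u, dx):
    # placeholder for the trained neural operator N_theta(u) approximating
    # the non-linear advection term -u * u_x
    return -u * (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)

def rhs(u, dx):
    # neural-ODE right-hand side: the sum of the split operators
    return linear_op(u, dx) + nonlinear_op(u, dx)

def integrate(u0, dx, dt, steps):
    # any standard ODE solver works; forward Euler keeps the sketch short
    u = u0.copy()
    for _ in range(steps):
        u = u + dt * rhs(u, dx)
    return u

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
dx = x[1] - x[0]
u = integrate(np.sin(x), dx, dt=1e-3, steps=100)
```

Because the solver only ever queries `rhs`, swapping in a finer time step or a higher-order integrator needs no retraining, which is the continuous-in-time property the abstract highlights.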
Related papers
- Test-time Generalization for Physics through Neural Operator Splitting [12.59844119790293]
We introduce a neural operator splitting strategy that searches over compositions of training operators to approximate unseen dynamics. Our approach achieves state-of-the-art zero-shot generalization results, while being able to recover the underlying PDE parameters.
arXiv Detail & Related papers (2026-01-31T20:08:57Z) - CompNO: A Novel Foundation Model approach for solving Partial Differential Equations [0.0]
Partial differential equations govern a wide range of physical phenomena, but their numerical solution remains computationally demanding. Recent Scientific Foundation Models (SFMs) aim to alleviate this cost by learning universal surrogates from large collections of simulated systems. We introduce Compositional Neural Operators (CompNO), a compositional neural operator framework for parametric PDEs.
arXiv Detail & Related papers (2026-01-12T10:04:48Z) - Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations [65.80144621950981]
We build on Wiener chaos expansions (WCE) to design neural operator (NO) architectures for SPDEs and SDEs. We show that WCE-based neural operators provide a practical and scalable way to learn SDE/SPDE solution operators.
arXiv Detail & Related papers (2026-01-03T00:59:25Z) - Fourier Neural Operators Explained: A Practical Perspective [75.12291469255794]
The Fourier Neural Operator (FNO) has become the most influential and widely adopted neural operator architecture due to its elegant spectral formulation. This guide aims to establish a clear and reliable framework for applying FNOs effectively across diverse scientific and engineering fields.
arXiv Detail & Related papers (2025-12-01T08:56:21Z) - Accelerating PDE Solvers with Equation-Recast Neural Operator Preconditioning [9.178290601589365]
Minimal-Data Parametric Neural Operator Preconditioning (MD-PNOP) is a new paradigm for accelerating parametric PDE solvers. It recasts the residual from parameter deviation as an additional source term, so that trained neural operators can be used to refine the solution in an offline fashion. It consistently achieves a 50% reduction in computational time while maintaining full-order fidelity for fixed-source, single-group eigenvalue, and multigroup coupled eigenvalue problems.
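The residual-recasting idea can be illustrated on a plain linear system. To solve (A + dA)u = f with a solver built only for A, move the parameter deviation to the right-hand side and iterate u_{k+1} = A^{-1}(f - dA u_k), reusing the base solver unchanged. In the sketch below the matrices are synthetic and a direct solve stands in for the trained neural operator; it is an illustration of the recasting trick, not of MD-PNOP's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = np.eye(n) * 4.0 + rng.standard_normal((n, n)) * 0.1   # base operator
dA = rng.standard_normal((n, n)) * 0.05                   # parameter deviation
f = rng.standard_normal(n)

# fixed-point iteration: the base solve (trained operator in MD-PNOP,
# a direct solve here) is applied to f minus the recast source term
u = np.zeros(n)
for _ in range(30):
    u = np.linalg.solve(A, f - dA @ u)

exact = np.linalg.solve(A + dA, f)
err = np.linalg.norm(u - exact) / np.linalg.norm(exact)
```

The iteration converges whenever the deviation is small relative to the base operator (spectral radius of A^{-1} dA below one), which matches the "parameter deviation" setting the abstract describes.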
arXiv Detail & Related papers (2025-09-01T12:14:58Z) - DimINO: Dimension-Informed Neural Operator Learning [41.37905663176428]
DimINO is a framework inspired by dimensional analysis. It can be seamlessly integrated into existing neural operator architectures. It achieves up to a 76.3% performance gain on PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - PICL: Physics Informed Contrastive Learning for Partial Differential Equations [7.136205674624813]
We develop a novel contrastive pretraining framework that improves neural operator generalization across multiple governing equations simultaneously.
A combination of physics-informed system evolution and latent-space model output is anchored to the input data and used in our distance function.
We find that physics-informed contrastive pretraining improves accuracy for the Fourier Neural Operator in fixed-future and autoregressive rollout tasks for the 1D and 2D Heat, Burgers', and linear advection equations.
arXiv Detail & Related papers (2024-01-29T17:32:22Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
An AI framework known as Neural Operators presents a principled approach for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers. We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles. Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
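The probabilistic representation can be sketched for the 1D heat equation u_t = nu*u_xx, whose Feynman-Kac solution u(x, t) = E[g(x + sqrt(2*nu*t) * Z)], Z ~ N(0, 1), is estimated from an ensemble of random particles. This is a minimal illustration of the particle viewpoint, not the paper's solver; the initial condition, viscosity, and particle count below are illustrative.

```python
import numpy as np

def mc_heat(g, x, t, nu, n_particles=200_000, seed=0):
    # estimate u(x, t) by averaging the initial data over random
    # particle endpoints x + sqrt(2*nu*t) * Z
    z = np.random.default_rng(seed).standard_normal(n_particles)
    return g(x + np.sqrt(2.0 * nu * t) * z).mean()

g = np.sin                               # initial condition u(x, 0) = sin(x)
nu, t, x = 0.5, 1.0, 1.0
estimate = mc_heat(g, x, t, nu)
exact = np.exp(-nu * t) * np.sin(x)      # analytic heat-kernel solution
```

The Monte Carlo error shrinks as 1/sqrt(n_particles), independent of spatial dimension, which is what makes this representation attractive as a training signal.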
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Koopman neural operator as a mesh-free solver of non-linear partial differential equations [15.410070455154138]
We propose the Koopman neural operator (KNO), a new neural operator, to overcome these challenges.
By approximating the Koopman operator, an infinite-dimensional operator governing all possible observations of the dynamic system, we can equivalently learn the solution of a non-linear PDE family.
The KNO exhibits notable advantages compared with previous state-of-the-art models.
arXiv Detail & Related papers (2023-01-24T14:10:15Z) - Generalized Neural Closure Models with Interpretability [28.269731698116257]
We develop a novel and versatile methodology of unified neural partial delay differential equations.
We augment existing/low-fidelity dynamical models directly in their partial differential equation (PDE) forms with both Markovian and non-Markovian neural network (NN) closure parameterizations.
We demonstrate the new generalized neural closure models (gnCMs) framework using four sets of experiments based on advecting nonlinear waves, shocks, and ocean acidification models.
arXiv Detail & Related papers (2023-01-15T21:57:43Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
arXiv Detail & Related papers (2021-08-19T03:56:49Z) - STEER: Simple Temporal Regularization For Neural ODEs [80.80350769936383]
We propose a new regularization technique: randomly sampling the end time of the ODE during training.
The proposed regularization is simple to implement, has negligible overhead and is effective across a wide variety of tasks.
We show through experiments on normalizing flows, time series models and image recognition that the proposed regularization can significantly decrease training time and even improve performance over baseline models.
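The regularisation amounts to a one-line change in the training loop: draw the ODE end time from a window around the nominal horizon instead of fixing it. In this sketch, the horizon T and half-width b are illustrative placeholders, not values from the paper.

```python
import numpy as np

def sample_end_time(T, b, rng):
    # STEER-style regularisation: integrate to a random end time
    # drawn uniformly from (T - b, T + b) at each training step
    return rng.uniform(T - b, T + b)

rng = np.random.default_rng(0)
T, b = 1.0, 0.25
end_times = [sample_end_time(T, b, rng) for _ in range(1000)]
```

Because the sampled times average out to T, the model still targets the nominal horizon while never overfitting to one exact integration length.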
arXiv Detail & Related papers (2020-06-18T17:44:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.