Neural Operators Meet Energy-based Theory: Operator Learning for
Hamiltonian and Dissipative PDEs
- URL: http://arxiv.org/abs/2402.09018v1
- Date: Wed, 14 Feb 2024 08:50:14 GMT
- Title: Neural Operators Meet Energy-based Theory: Operator Learning for
Hamiltonian and Dissipative PDEs
- Authors: Yusuke Tanaka, Takaharu Yaguchi, Tomoharu Iwata, Naonori Ueda
- Abstract summary: This paper proposes Energy-consistent Neural Operators (ENOs) for learning solution operators of partial differential equations.
ENOs follow the energy conservation or dissipation law from observed solution trajectories.
We introduce a novel penalty function inspired by the energy-based theory of physics for training, in which the energy functional is modeled by another DNN.
- Score: 35.70739067374375
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Operator learning has received significant attention in recent years,
with the aim of learning a mapping between function spaces. Prior works have
proposed deep neural networks (DNNs) for learning such a mapping, enabling the
learning of solution operators of partial differential equations (PDEs).
However, these works still struggle to learn dynamics that obey the laws of
physics. This paper proposes Energy-consistent Neural Operators (ENOs), a
general framework for learning solution operators of PDEs that follows the
energy conservation or dissipation law from observed solution trajectories. We
introduce a novel penalty function inspired by the energy-based theory of
physics for training, in which the energy functional is modeled by another DNN,
allowing one to bias the outputs of the DNN-based solution operators to ensure
energetic consistency without explicit PDEs. Experiments on multiple physical
systems show that ENO outperforms existing DNN models in predicting solutions
from data, especially in super-resolution settings.
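The energy-consistency penalty described above can be illustrated with a minimal sketch. Here the learned energy functional is replaced by the fixed quadratic energy of a harmonic oscillator (in ENO it is modeled by a second DNN), and the penalty measures how far a predicted trajectory departs from exact conservation (Hamiltonian case) or from monotone decay (dissipative case). All names are illustrative, not from the paper.

```python
import math

def energy(state):
    # Stand-in for the learned energy functional E_theta; here the exact
    # energy of a unit-mass harmonic oscillator, E = (q^2 + p^2) / 2.
    q, p = state
    return 0.5 * (q * q + p * p)

def conservation_penalty(trajectory):
    # Hamiltonian case: energy must stay constant, so penalize any change
    # between consecutive predicted states.
    energies = [energy(s) for s in trajectory]
    return sum((e1 - e0) ** 2 for e0, e1 in zip(energies, energies[1:]))

def dissipation_penalty(trajectory):
    # Dissipative case: energy may only decrease, so penalize increases.
    energies = [energy(s) for s in trajectory]
    return sum(max(e1 - e0, 0.0) ** 2 for e0, e1 in zip(energies, energies[1:]))

# An exact rotation in phase space conserves energy: zero penalty.
exact = [(math.cos(t), -math.sin(t)) for t in (0.0, 0.1, 0.2)]
# A trajectory drifting outward gains energy: positive penalty.
drifting = [(1.0, 0.0), (1.05, 0.0), (1.1, 0.0)]
```

During training, such a penalty would be added to the data-fitting loss so that the DNN-based solution operator is biased toward energetically consistent outputs.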
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- DeltaPhi: Learning Physical Trajectory Residual for PDE Solving [54.13671100638092]
We propose and formulate Physical Trajectory Residual Learning (DeltaPhi).
We learn the surrogate model for the residual operator mapping based on existing neural operator networks.
We conclude that, compared to direct learning, physical residual learning is preferred for PDE solving.
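The residual idea can be sketched as follows. This is a hypothetical simplification with illustrative names, not the DeltaPhi implementation: instead of fitting the solution operator directly, a model is trained on the difference between the target trajectory and an auxiliary reference trajectory, and the reference is added back at prediction time.

```python
def make_residual_targets(inputs, targets, reference_solver):
    # Turn (input, target) pairs into (input, target - reference) pairs,
    # so a model can be trained on the residual alone.
    return [
        (x, [t - r for t, r in zip(y, reference_solver(x))])
        for x, y in zip(inputs, targets)
    ]

def predict(residual_model, reference_solver, x):
    # Final prediction = reference trajectory + learned residual.
    return [r + d for r, d in zip(reference_solver(x), residual_model(x))]
```

The residual is typically smaller and smoother than the full solution, which is why learning it can be easier than direct operator learning.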
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
- Physics informed WNO [0.0]
We propose a physics-informed Wavelet Operator (WNO) for learning the solution operators of families of parametric partial differential equations (PDEs) without labeled training data.
The efficacy of the framework is validated and illustrated with four nonlinear systems relevant to various fields of engineering and science.
arXiv Detail & Related papers (2023-02-12T14:31:50Z)
- Koopman neural operator as a mesh-free solver of non-linear partial differential equations [15.410070455154138]
We propose the Koopman neural operator (KNO), a new neural operator, to overcome these challenges.
By approximating the Koopman operator, an infinite-dimensional operator governing all possible observations of the dynamic system, we can equivalently learn the solution of a non-linear PDE family.
The KNO exhibits notable advantages compared with previous state-of-the-art models.
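The Koopman principle behind KNO can be shown on a toy system. This is a deliberately simple illustration with a hand-picked lift, whereas KNO learns both the lift and the operator with networks: for the dynamics x_{n+1} = a * x_n, the observables (x, x^2) evolve under a linear (diagonal) Koopman matrix.

```python
A = 0.5  # dynamics parameter of the toy system

def step(x):
    # One step of the dynamics in state space: x -> A * x.
    return A * x

def lift(x):
    # A hand-picked dictionary of observables; KNO learns such a lift.
    return (x, x * x)

def koopman_step(g):
    # Linear evolution of the observables: the matrix diag(A, A^2).
    return (A * g[0], A * A * g[1])
```

Advancing the lifted state linearly and lifting the advanced state give the same result, which is the commutation property that lets a linear operator stand in for nonlinear dynamics.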
arXiv Detail & Related papers (2023-01-24T14:10:15Z)
- An unsupervised latent/output physics-informed convolutional-LSTM network for solving partial differential equations using peridynamic differential operator [0.0]
An unsupervised convolutional neural network (NN) architecture with nonlocal interactions is proposed for solving partial differential equations (PDEs).
The peridynamic differential operator (PDDO) is employed as a convolutional filter for evaluating derivatives of the field variable.
The NN captures the time dynamics in a smaller latent space through encoder-decoder layers, with a Convolutional Long Short-Term Memory (ConvLSTM) layer between them.
arXiv Detail & Related papers (2022-10-21T18:09:23Z)
- Generic bounds on the approximation error for physics-informed (and) operator learning [7.6146285961466]
We propose a framework for deriving rigorous bounds on the approximation error for physics-informed neural networks (PINNs) and operator learning architectures such as DeepONets and FNOs.
These bounds guarantee that PINNs and (physics-informed) DeepONets or FNOs will efficiently approximate the underlying solution or solution operator of generic partial differential equations (PDEs).
arXiv Detail & Related papers (2022-05-23T15:40:33Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
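The core construction can be sketched as a discretized integral-kernel layer. This is a minimal illustration with a fixed kernel (a real neural operator parameterizes the kernel and stacks such layers with nonlinearities): the layer maps a function u sampled on a grid to (Ku)(x) = ∫ k(x, y) u(y) dy, and because k is defined on continuous coordinates, the same layer can be evaluated on grids of any resolution.

```python
import math

def kernel(x, y):
    # Fixed smooth kernel standing in for a learned k_theta(x, y).
    return math.exp(-abs(x - y))

def kernel_layer(u_samples, grid):
    # (K u)(x_i) ≈ sum_j k(x_i, y_j) * u(y_j) * dy on a uniform grid.
    dy = grid[1] - grid[0]
    return [
        sum(kernel(x, y) * u for y, u in zip(grid, u_samples)) * dy
        for x in grid
    ]

# The same layer evaluated at two resolutions approximates the same operator.
coarse_grid = [i / 10 for i in range(11)]
fine_grid = [i / 20 for i in range(21)]
coarse_out = kernel_layer([1.0] * 11, coarse_grid)
fine_out = kernel_layer([1.0] * 21, fine_grid)
```

This discretization independence is what allows neural operators to be trained at one resolution and evaluated at another.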
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
NDO is pre-trained on a class of symbolic functions, and it learns the mapping between the trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can closely approximate the ground-truth derivatives by properly tuning the complexity of the function library.
arXiv Detail & Related papers (2021-06-08T08:04:47Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.