Frame invariance and scalability of neural operators for partial
differential equations
- URL: http://arxiv.org/abs/2112.14769v1
- Date: Tue, 28 Dec 2021 02:36:19 GMT
- Title: Frame invariance and scalability of neural operators for partial
differential equations
- Authors: Muhammad I. Zafar, Jiequn Han, Xu-Hui Zhou and Heng Xiao
- Abstract summary: Partial differential equations (PDEs) play a dominant role in the mathematical modeling of many complex dynamical processes.
After training, neural operators can provide PDE solutions significantly faster than traditional PDE solvers.
- Score: 5.872676314924041
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Partial differential equations (PDEs) play a dominant role in the
mathematical modeling of many complex dynamical processes. Solving these PDEs
often requires prohibitively high computational costs, especially when multiple
evaluations must be made for different parameters or conditions. After
training, neural operators can provide PDE solutions significantly faster than
traditional PDE solvers. In this work, the invariance properties and computational
complexity of two neural operators are examined for a transport PDE of a scalar
quantity. The neural operator based on the graph kernel network (GKN) operates on
graph-structured data to incorporate nonlocal dependencies. Here we propose a
modified formulation of GKN to achieve frame invariance. The vector-cloud neural
network (VCNN) is an alternative neural operator with embedded frame invariance
that operates on point-cloud data. The GKN-based neural operator demonstrates
slightly better predictive performance than VCNN. However, GKN incurs an
excessively high computational cost that increases quadratically with the number
of discretized objects, compared to a linear increase for VCNN.
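To make the frame-invariance property discussed above concrete, the following is a minimal NumPy sketch, not the paper's GKN or VCNN implementation: a hypothetical point-cloud operator (`toy_invariant_operator`, a made-up name) that builds its features only from pairwise distances, together with a check that its prediction is unchanged when the input cloud is rigidly rotated and translated.

```python
import numpy as np

def toy_invariant_operator(points, values):
    """Hypothetical point-cloud operator: predicts a scalar at each point from
    pairwise distances and neighbor values only, so its output cannot depend
    on the orientation or origin of the coordinate frame."""
    diffs = points[:, None, :] - points[None, :, :]   # (N, N, d) relative positions
    dists = np.linalg.norm(diffs, axis=-1)            # frame-invariant features
    weights = np.exp(-dists**2)                       # toy "kernel"
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ values                           # weighted average of neighbor values

rng = np.random.default_rng(0)
points = rng.normal(size=(50, 2))   # 2-D point cloud of discretized objects
values = rng.normal(size=50)        # scalar quantity carried by each point

# Apply an arbitrary rigid rotation and translation to the same cloud.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
moved = points @ R.T + np.array([3.0, -1.5])

out_original = toy_invariant_operator(points, values)
out_moved = toy_invariant_operator(moved, values)
print(np.allclose(out_original, out_moved))   # True: the prediction is frame-invariant
```

An operator fed raw coordinates rather than relative, distance-based features would generally fail this check; the modified GKN formulation and VCNN are designed so that it holds by construction.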
Related papers
- GIT-Net: Generalized Integral Transform for Operator Learning [58.13313857603536]
This article introduces GIT-Net, a deep neural network architecture for approximating Partial Differential Equation (PDE) operators.
GIT-Net harnesses the fact that differential operators commonly used for defining PDEs can often be represented parsimoniously when expressed in specialized functional bases.
Numerical experiments demonstrate that GIT-Net is a competitive neural network operator, exhibiting small test errors and low evaluation costs across a range of PDE problems.
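As a small, generic illustration of that point (the Fourier basis used here is only a stand-in; GIT-Net's own specialized bases are constructed differently), the second-derivative operator is diagonal in the Fourier basis, so applying it to a smooth periodic function reduces to an element-wise multiplication by -k^2:

```python
import numpy as np

# d^2/dx^2 applied to e^{ikx} gives -k^2 e^{ikx}: the operator is diagonal
# (hence "parsimonious") in the Fourier basis.
n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = np.sin(3 * x)                                     # test function with known derivative

k = 2.0 * np.pi * np.fft.fftfreq(n, d=x[1] - x[0])    # angular wavenumbers
u_xx = np.fft.ifft(-(k**2) * np.fft.fft(u)).real      # apply the operator spectrally

print(np.max(np.abs(u_xx - (-9.0 * np.sin(3 * x)))))  # ~1e-12: matches -9 sin(3x)
```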
arXiv Detail & Related papers (2023-12-05T03:03:54Z)
- Convolutional Neural Operators for robust and accurate learning of PDEs [11.562748612983956]
We present novel adaptations for convolutional neural networks to process functions as inputs and outputs.
The resulting architecture is termed convolutional neural operators (CNOs).
We prove a universality theorem to show that CNOs can approximate operators arising in PDEs to desired accuracy.
arXiv Detail & Related papers (2023-02-02T15:54:45Z)
- DOSnet as a Non-Black-Box PDE Solver: When Deep Learning Meets Operator Splitting [12.655884541938656]
We develop a learning-based PDE solver, which we name the Deep Operator-Splitting Network (DOSnet).
DOSnet is constructed from the physical rules and operators governing the underlying dynamics and contains learnable parameters.
We train and validate it on several types of operator-decomposable differential equations.
arXiv Detail & Related papers (2022-12-11T18:23:56Z)
- Sparse Deep Neural Network for Nonlinear Partial Differential Equations [3.0069322256338906]
This paper is devoted to a numerical study of adaptive approximation of solutions of nonlinear partial differential equations.
We develop deep neural networks (DNNs) with sparse, multi-parameter regularization to represent functions having certain singularities.
Numerical examples confirm that solutions generated by the proposed sparse DNN (SDNN) are sparse and accurate.
arXiv Detail & Related papers (2022-07-27T03:12:16Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS), which incorporates feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on a variety of SPDEs, including the dynamic Phi^4_1 model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- GrADE: A graph based data-driven solver for time-dependent nonlinear partial differential equations [0.0]
We propose a novel framework, referred to as the Graph Attention Differential Equation (GrADE), for solving time-dependent nonlinear PDEs.
The proposed approach couples an FNN, a graph neural network, and the recently developed Neural ODE framework.
The results obtained illustrate the capability of the proposed framework in modeling PDEs and its scalability to larger domains without the need for retraining.
arXiv Detail & Related papers (2021-08-24T10:49:03Z)
- Neural Operator: Learning Maps Between Function Spaces [75.93843876663128]
We propose a generalization of neural networks to learn operators, termed neural operators, that map between infinite-dimensional function spaces.
We prove a universal approximation theorem for our proposed neural operator, showing that it can approximate any given nonlinear continuous operator.
An important application for neural operators is learning surrogate maps for the solution operators of partial differential equations.
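The graph kernel network compared in the abstract above is one instantiation of this framework; its basic building block is a kernel integral layer, v_out(x) = sigma( W v(x) + integral of kappa(x, y) v(y) dy ), approximated by an average over the sampled points. Below is a minimal, hypothetical NumPy sketch of one such discretized layer (the kernel network `kappa`, the shapes, and the random weights are placeholders, not the published architecture); the explicit double loop also shows why this construction scales quadratically with the number of discretized points, in contrast to the linear scaling reported above for VCNN.

```python
import numpy as np

rng = np.random.default_rng(0)
d, c, n = 2, 4, 64                       # spatial dim, channel width, number of points

# Hypothetical kernel network kappa(x, y): a tiny random-feature MLP on the
# concatenated coordinates, returning a (c, c) mixing matrix.
W1 = rng.normal(size=(16, 2 * d))
W2 = rng.normal(size=(c * c, 16)) / 4.0

def kappa(x, y):
    h = np.tanh(W1 @ np.concatenate([x, y]))
    return (W2 @ h).reshape(c, c)

def kernel_integral_layer(points, v, W):
    """One discretized neural-operator layer:
    v_out(x_i) = tanh( W v(x_i) + (1/N) * sum_j kappa(x_i, x_j) v(x_j) ).
    The double sum over points is why graph-kernel-style operators scale
    quadratically with the number of discretized objects."""
    n_pts = len(points)
    out = np.zeros_like(v)
    for i in range(n_pts):
        integral = sum(kappa(points[i], points[j]) @ v[j] for j in range(n_pts)) / n_pts
        out[i] = np.tanh(W @ v[i] + integral)
    return out

points = rng.normal(size=(n, d))          # discretization points (e.g. mesh nodes)
v = rng.normal(size=(n, c))               # input function sampled on those points
W = rng.normal(size=(c, c)) / np.sqrt(c)  # pointwise linear term

print(kernel_integral_layer(points, v, W).shape)   # (64, 4)
```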
arXiv Detail & Related papers (2021-08-19T03:56:49Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions, as well as state-of-the-art numerical solvers such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data into a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.