RBF-MGN: Solving spatiotemporal PDEs with Physics-informed Graph Neural Network
- URL: http://arxiv.org/abs/2212.02861v1
- Date: Tue, 6 Dec 2022 10:08:02 GMT
- Title: RBF-MGN: Solving spatiotemporal PDEs with Physics-informed Graph Neural Network
- Authors: Zixue Xiang, Wei Peng, Wen Yao
- Abstract summary: We propose a novel framework based on graph neural networks (GNNs) and radial basis function finite difference (RBF-FD).
RBF-FD is used to construct a high-precision difference format of the differential equations to guide model training.
We illustrate the generalizability, accuracy, and efficiency of the proposed algorithms on different PDE parameters.
- Score: 4.425915683879297
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) have lately received significant
attention as a representative deep learning-based technique for solving partial
differential equations (PDEs). Most fully connected network-based PINNs use
automatic differentiation to construct loss functions that suffer from slow
convergence and difficult boundary enforcement. In addition, although
convolutional neural network (CNN)-based PINNs can significantly improve
training efficiency, CNNs have difficulty in dealing with irregular geometries
with unstructured meshes. Therefore, we propose a novel framework based on
graph neural networks (GNNs) and radial basis function finite difference
(RBF-FD). We introduce GNNs into physics-informed learning to better handle
irregular domains with unstructured meshes. RBF-FD is used to construct a
high-precision difference format of the differential equations to guide model
training. Finally, we perform numerical experiments on Poisson and wave
equations on irregular domains. We illustrate the generalizability, accuracy,
and efficiency of the proposed algorithms on different PDE parameters, numbers
of collocation points, and several types of RBFs.
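To make the RBF-FD-guided loss concrete, here is a minimal NumPy sketch: Gaussian-RBF finite-difference weights for the Laplacian are precomputed on the scattered nodes, and the resulting sparse operator turns any nodal prediction (for example, a GNN output) into a discrete residual of -Δu = f that can serve as a physics-informed training loss. The stencil size `k`, the shape parameter `eps`, the absence of polynomial augmentation, and the Poisson right-hand side are illustrative assumptions, not the paper's exact settings.

```python
# Minimal sketch (assumptions: Gaussian RBF, no polynomial augmentation,
# fixed stencil size k, Poisson problem -Lap(u) = f on scattered 2D nodes).
import numpy as np

def rbf_fd_laplacian_weights(nodes, k=9, eps=3.0):
    """Return (rows, cols, vals) of a sparse operator L with (L u)_i ~ Lap(u)(x_i)."""
    rows, cols, vals = [], [], []
    for i in range(len(nodes)):
        d2 = np.sum((nodes - nodes[i]) ** 2, axis=1)
        nbr = np.argsort(d2)[:k]                       # k nearest neighbours (includes node i)
        X = nodes[nbr]
        r2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        A = np.exp(-eps**2 * r2)                       # interpolation matrix phi(|x_j - x_m|)
        r2i = np.sum((X - nodes[i]) ** 2, axis=1)
        rhs = (4 * eps**4 * r2i - 4 * eps**2) * np.exp(-eps**2 * r2i)  # Lap(phi) at x_i
        w = np.linalg.solve(A, rhs)                    # RBF-FD stencil weights
        rows += [i] * k; cols += list(nbr); vals += list(w)
    return np.array(rows), np.array(cols), np.array(vals)

def poisson_residual_loss(u_pred, f, rows, cols, vals):
    """Mean squared residual of -Lap(u) = f; u_pred could be the nodal output of a GNN."""
    lap = np.zeros_like(u_pred)
    np.add.at(lap, rows, vals * u_pred[cols])          # apply the sparse difference operator
    return np.mean((-lap - f) ** 2)

# toy usage on random nodes in the unit square
nodes = np.random.default_rng(0).random((200, 2))
rows, cols, vals = rbf_fd_laplacian_weights(nodes)
loss = poisson_residual_loss(np.zeros(200), np.ones(200), rows, cols, vals)
```

A mesh-free stencil of this kind is what allows the residual to be assembled on irregular, unstructured node sets where CNN-style convolutions do not apply.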
Related papers
- DFA-GNN: Forward Learning of Graph Neural Networks by Direct Feedback Alignment [57.62885438406724]
Graph neural networks are recognized for their strong performance across various applications.
Backpropagation (BP) has limitations that challenge its biological plausibility and affect the efficiency, scalability and parallelism of training neural networks for graph-based tasks.
We propose DFA-GNN, a novel forward learning framework tailored for GNNs with a case study of semi-supervised learning.
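As a rough illustration of the forward-learning idea behind direct feedback alignment, the sketch below performs one DFA update on a small fully connected network: the output error is projected to each hidden layer through fixed random feedback matrices instead of being backpropagated through the weights. The layer sizes, the tanh activations, and the squared-error objective are placeholder choices; DFA-GNN's graph-specific, semi-supervised formulation is not reproduced here.

```python
# Minimal sketch of one Direct Feedback Alignment (DFA) update on a small MLP
# (assumptions: tanh activations, squared error, fixed random feedback matrices).
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h, d_out, lr = 16, 32, 4, 1e-2
W1, W2, W3 = (rng.normal(0, 0.1, s) for s in [(d_in, d_h), (d_h, d_h), (d_h, d_out)])
B1 = rng.normal(0, 0.1, (d_out, d_h))   # fixed random feedback, never trained
B2 = rng.normal(0, 0.1, (d_out, d_h))

def dfa_step(x, y, W1, W2, W3):
    h1 = np.tanh(x @ W1)                # forward pass
    h2 = np.tanh(h1 @ W2)
    e = h2 @ W3 - y                     # output error
    d2 = (e @ B2) * (1 - h2**2)         # error sent directly to layer 2 via B2
    d1 = (e @ B1) * (1 - h1**2)         # ... and to layer 1 via B1 (no backprop chain)
    W3 -= lr * h2.T @ e
    W2 -= lr * h1.T @ d2
    W1 -= lr * x.T @ d1
    return W1, W2, W3

x, y = rng.normal(size=(8, d_in)), rng.normal(size=(8, d_out))
W1, W2, W3 = dfa_step(x, y, W1, W2, W3)
```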
arXiv Detail & Related papers (2024-06-04T07:24:51Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
However, PINNs can be trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs in order to improve the stability of the training process.
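For intuition, an implicit (proximal-point) SGD step can be sketched as follows: the next iterate solves theta = theta_k - lr * grad(theta), which the snippet approximates with a few inner gradient iterations. The inner solver, its step size, and the toy quadratic loss in the usage line are assumptions for illustration, not the paper's algorithm.

```python
# Minimal sketch of an implicit (proximal-point) SGD step
# (assumptions: the proximal subproblem is solved by a few inner gradient iterations).
import numpy as np

def implicit_sgd_step(theta, grad_loss, lr=1e-1, inner_steps=30, inner_lr=0.5):
    """Approximately solve theta_next = theta - lr * grad_loss(theta_next)."""
    theta_k, z = theta.copy(), theta.copy()
    for _ in range(inner_steps):
        # gradient of  L(z) + ||z - theta_k||^2 / (2 * lr)  with respect to z
        g = grad_loss(z) + (z - theta_k) / lr
        z = z - inner_lr * lr * g        # step scaled by lr so the quadratic term stays stable
    return z

# toy usage: one implicit step on L(theta) = 0.5 * ||theta||^2 (exact answer: theta / (1 + lr))
theta = implicit_sgd_step(np.ones(3), grad_loss=lambda t: t, lr=0.1)
```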
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- PhyGNNet: Solving spatiotemporal PDEs with Physics-informed Graph Neural Network [12.385926494640932]
We propose PhyGNNet for solving partial differential equations on the basis of a graph neural network.
In particular, we divide the computational domain into regular grids, define partial differential operators on the grids, and then construct a PDE loss for the network to optimize, which builds the PhyGNNet model.
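A minimal sketch of such a grid-based PDE loss is given below, assuming a Poisson problem -Δu = f discretized with the standard 5-point stencil; the grid resolution, boundary treatment, and the graph network producing `u` are outside the sketch.

```python
# Minimal sketch of a regular-grid PDE loss (assumptions: Poisson problem
# -Lap(u) = f, standard 5-point Laplacian, interior points only).
import numpy as np

def pde_loss_on_grid(u, f, h):
    """u, f: (ny, nx) nodal prediction and source on a regular grid with spacing h."""
    lap = (u[1:-1, 2:] + u[1:-1, :-2] + u[2:, 1:-1] + u[:-2, 1:-1]
           - 4.0 * u[1:-1, 1:-1]) / h**2            # discrete Laplacian on interior nodes
    return np.mean((-lap - f[1:-1, 1:-1]) ** 2)     # mean squared PDE residual

u, f = np.zeros((32, 32)), np.ones((32, 32))
loss = pde_loss_on_grid(u, f, h=1.0 / 31)
```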
arXiv Detail & Related papers (2022-08-07T13:33:34Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
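A compact sketch of the reduced-order pipeline described here: a POD basis is extracted from solution snapshots via the SVD, and a parameter-to-coefficient network (playing the "branch" role) is then combined with the basis to reconstruct a full-field approximation. The random snapshot matrix, the rank, and the linear stand-in for the branch network are purely illustrative.

```python
# Minimal sketch of a POD-based reduced-order approximation (assumptions:
# random snapshots, rank-8 basis, linear placeholder for the branch network).
import numpy as np

def pod_basis(snapshots, rank):
    """snapshots: (n_dof, n_snapshots) solution matrix; returns the leading POD modes."""
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :rank]                                  # (n_dof, rank)

def reduced_order_solution(basis, coeff_net, pde_params):
    """coeff_net: any callable mapping PDE parameters to POD coefficients."""
    return basis @ coeff_net(pde_params)                # full-field approximation

rng = np.random.default_rng(0)
Phi = pod_basis(rng.normal(size=(200, 50)), rank=8)
W = rng.normal(size=(8, 3))                             # stand-in "branch" network
u_hat = reduced_order_solution(Phi, lambda p: W @ p, np.array([1.0, 0.5, 2.0]))
```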
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
- Enforcing Continuous Physical Symmetries in Deep Learning Network for Solving Partial Differential Equations [3.6317085868198467]
We introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries of PDEs are embedded into the loss function of the PINN.
We show that SPINN performs better than PINN with fewer training points and a simpler neural network architecture.
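To illustrate how an invariant surface condition can be embedded into a PINN loss, the sketch below augments a heat-equation residual with the condition u_t + c*u_x = 0 induced by a travelling-wave (combined time and space translation) symmetry. The choice of equation, symmetry, weighting `w_sym`, and network are assumptions; SPINN's actual construction of the Lie-symmetry terms may differ.

```python
# Minimal sketch (assumptions: heat equation u_t = u_xx, travelling-wave symmetry
# giving the invariant surface condition u_t + c*u_x = 0, simple weighting).
import torch

def spinn_style_loss(net, xt, c=1.0, w_sym=1.0):
    xt = xt.clone().requires_grad_(True)                # collocation points, columns (x, t)
    u = net(xt)
    g = torch.autograd.grad(u.sum(), xt, create_graph=True)[0]
    u_x, u_t = g[:, 0:1], g[:, 1:2]
    u_xx = torch.autograd.grad(u_x.sum(), xt, create_graph=True)[0][:, 0:1]
    pde_res = u_t - u_xx                                # PDE residual
    sym_res = u_t + c * u_x                             # invariant surface condition
    return (pde_res**2).mean() + w_sym * (sym_res**2).mean()

net = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
loss = spinn_style_loss(net, torch.rand(64, 2))
loss.backward()
```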
arXiv Detail & Related papers (2022-06-19T00:44:22Z)
- Scientific Machine Learning through Physics-Informed Neural Networks: Where we are and What's next [5.956366179544257]
Physics-Informed Neural Networks (PINNs) are neural networks (NNs) that encode model equations.
PINNs are nowadays used to solve PDEs, fractional equations, and integro-differential equations.
arXiv Detail & Related papers (2022-01-14T19:05:44Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- GrADE: A graph based data-driven solver for time-dependent nonlinear partial differential equations [0.0]
We propose a novel framework, referred to as the Graph Attention Differential Equation (GrADE), for solving time-dependent nonlinear PDEs.
The proposed approach couples an FNN, a graph neural network, and the recently developed Neural ODE framework.
The results obtained illustrate the capability of the proposed framework in modeling PDEs and its scalability to larger domains without the need for retraining.
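A very small sketch of the coupling pattern described here: nodal features on a graph are evolved in time by an ODE integrator whose right-hand side is a message-passing network. The random untrained weights, the explicit Euler integrator, and the feature sizes are illustrative stand-ins for GrADE's FNN plus graph-attention plus Neural ODE stack.

```python
# Minimal sketch of a graph-network right-hand side integrated in time
# (assumptions: random untrained weights, explicit Euler integration,
# 8-dimensional nodal features; attention and the Neural ODE solver are omitted).
import numpy as np

rng = np.random.default_rng(0)
W_msg = rng.normal(0, 0.1, (8, 8))
W_upd = rng.normal(0, 0.1, (16, 8))

def graph_rhs(u, edges):
    """u: (n_nodes, 8) nodal features; edges: list of (src, dst) pairs."""
    msgs = np.zeros_like(u)
    for s, d in edges:                                  # aggregate neighbour messages
        msgs[d] += np.tanh(u[s] @ W_msg)
    return np.tanh(np.concatenate([u, msgs], axis=1) @ W_upd)   # du/dt at every node

def integrate(u0, edges, dt=0.01, steps=100):
    u = u0
    for _ in range(steps):                              # explicit Euler time stepping
        u = u + dt * graph_rhs(u, edges)
    return u

u_final = integrate(rng.normal(size=(3, 8)), edges=[(0, 1), (1, 0), (1, 2), (2, 1)])
```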
arXiv Detail & Related papers (2021-08-24T10:49:03Z)
- Physics-informed attention-based neural network for solving non-linear partial differential equations [6.103365780339364]
Physics-Informed Neural Networks (PINNs) have enabled significant improvements in modelling physical processes.
PINNs are based on simple architectures, and learn the behavior of complex physical systems by optimizing the network parameters to minimize the residual of the underlying PDE.
Here, we address the question of which network architectures are best suited to learn the complex behavior of non-linear PDEs.
arXiv Detail & Related papers (2021-05-17T14:29:08Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
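The linear-complexity, all-range interaction can be pictured with a two-level sketch: fine nodes exchange only sparse local messages, while long-range coupling is carried by a small set of coarse nodes that the fine nodes are pooled into and broadcast back from. The pooling by cluster index, the random kernels, and the single coarse level are illustrative; the paper's multipole-inspired operator is more elaborate.

```python
# Minimal two-level sketch of multipole-style message passing (assumptions:
# one coarse level given by a cluster assignment, random kernels, mean-based
# mixing on the coarse level).
import numpy as np

rng = np.random.default_rng(0)
K_fine = rng.normal(0, 0.1, (8, 8))
K_coarse = rng.normal(0, 0.1, (8, 8))

def multilevel_pass(u, fine_edges, assign):
    """u: (n, 8) features; fine_edges: local (src, dst) pairs;
    assign: (n,) coarse-cluster index of each fine node."""
    local = np.zeros_like(u)
    for s, d in fine_edges:                             # level 0: sparse local messages
        local[d] += u[s] @ K_fine
    n_coarse = int(assign.max()) + 1
    coarse = np.zeros((n_coarse, u.shape[1]))
    np.add.at(coarse, assign, u)                        # pool fine nodes to coarse nodes
    coarse = coarse @ K_coarse + coarse.mean(axis=0, keepdims=True)  # cheap long-range mixing
    return np.tanh(u + local + coarse[assign])          # broadcast coarse info back and combine

u = rng.normal(size=(4, 8))
out = multilevel_pass(u, fine_edges=[(0, 1), (1, 0), (2, 3), (3, 2)], assign=np.array([0, 0, 1, 1]))
```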
arXiv Detail & Related papers (2020-06-16T21:56:22Z)