MRF-PINN: A Multi-Receptive-Field convolutional physics-informed neural
network for solving partial differential equations
- URL: http://arxiv.org/abs/2209.03151v1
- Date: Tue, 6 Sep 2022 12:26:22 GMT
- Title: MRF-PINN: A Multi-Receptive-Field convolutional physics-informed neural
network for solving partial differential equations
- Authors: Shihong Zhang and Chi Zhang and Bosen Wang
- Abstract summary: Physics-informed neural networks (PINN) can achieve lower development and solving cost than traditional partial differential equation (PDE) solvers.
Due to the advantages of parameter sharing, spatial feature extraction, and low inference cost, convolutional neural networks (CNNs) are increasingly used in PINNs.
- Score: 6.285167805465505
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINN) can achieve lower development and
solving cost than traditional partial differential equation (PDE) solvers in
scenarios such as reconstructing the physics field and solving the inverse
problem. Due to the advantages of parameter sharing, spatial feature
extraction, and low inference cost, convolutional neural networks (CNNs) are
increasingly used in PINNs. To adapt convolutional PINNs to different
equations, researchers have to spend much time tuning critical
hyperparameters. Furthermore, the effects of finite difference accuracy, model
complexity, and mesh resolution on the prediction results of convolutional
PINNs are unclear. To fill these research gaps, this paper (1) constructs a
Multi-Receptive-Field PINN (MRF-PINN) model that adapts to different equation
types and mesh resolutions without manual tuning; (2) verifies the generality
and advantages of MRF-PINN on three typical linear PDEs (elliptic, parabolic,
hyperbolic) and the nonlinear Navier-Stokes equations; and (3) analyzes the
contribution of each receptive field to the final MRF-PINN result and tests
the influence of finite difference accuracy, model complexity (channel
number), and mesh resolution on that result. This paper shows that MRF-PINN
can adapt to completely different equation types and mesh resolutions without
any hyperparameter tuning. Further, the solving error decreases significantly
under high-order finite differences, a large channel number, and a high mesh
resolution, suggesting that MRF-PINN can become a general convolutional PINN
scheme.
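The abstract above describes the architecture only at a high level. As one plausible reading, a multi-receptive-field convolutional block can be pictured as parallel convolutions with different kernel sizes whose outputs are combined; the PyTorch sketch below follows that assumption, and the kernel sizes (1, 3, 5), channel counts, and activation are illustrative choices rather than the authors' implementation.

```python
# Hypothetical sketch of a multi-receptive-field convolutional block: parallel
# convolutions with different kernel sizes (small, medium, large receptive
# fields) whose outputs are summed. Layer sizes are illustrative only.
import torch
import torch.nn as nn

class MultiReceptiveFieldBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=k, padding=k // 2)
            for k in (1, 3, 5)   # three receptive-field scales
        ])
        self.act = nn.Tanh()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Every scale contributes to the prediction of the discretised field.
        return self.act(sum(branch(x) for branch in self.branches))

# Usage: map PDE inputs on a mesh (e.g. a source term) to the solution field.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    MultiReceptiveFieldBlock(16),
    MultiReceptiveFieldBlock(16),
    nn.Conv2d(16, 1, kernel_size=3, padding=1),
)
u_pred = model(torch.randn(1, 1, 64, 64))  # one 64x64 mesh, one channel
```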
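The abstract also highlights finite difference accuracy and mesh resolution. Convolutional PINNs commonly form the physics loss by applying fixed finite-difference stencils to the predicted field, so the stencil order sets the accuracy of the residual. The sketch below assumes a 2D Poisson problem (laplace(u) = f) and the standard second-order 5-point stencil; higher-order schemes would simply use larger kernels.

```python
# Minimal sketch of a finite-difference physics residual for laplace(u) = f,
# with the 5-point Laplacian stencil applied as a fixed (non-trainable) kernel.
import torch
import torch.nn.functional as F

def poisson_residual(u: torch.Tensor, f: torch.Tensor, h: float) -> torch.Tensor:
    """u, f: (batch, 1, H, W) fields on a uniform mesh with spacing h."""
    stencil = torch.tensor([[0., 1., 0.],
                            [1., -4., 1.],
                            [0., 1., 0.]], dtype=u.dtype).view(1, 1, 3, 3)
    lap_u = F.conv2d(u, stencil) / h ** 2       # interior points only (no padding)
    return lap_u - f[:, :, 1:-1, 1:-1]          # residual of laplace(u) - f = 0

# Training would minimise the mean squared residual plus boundary terms, e.g.
# loss = (poisson_residual(u_pred, f, h) ** 2).mean() + boundary_loss
```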
Related papers
- Functional Tensor Decompositions for Physics-Informed Neural Networks [8.66932181641177]
Physics-Informed Neural Networks (PINNs) have shown continuous and increasing promise in approximating partial differential equations (PDEs).
We propose a generalized PINN version of the classical variable separable method.
Our methodology significantly enhances the performance of PINNs, as evidenced by improved results on complex high-dimensional PDEs.
arXiv Detail & Related papers (2024-08-23T14:24:43Z)
- RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs).
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z)
- Densely Multiplied Physics Informed Neural Networks [1.8554335256160261]
Physics-informed neural networks (PINNs) have shown great potential in dealing with nonlinear partial differential equations (PDEs).
This paper improves the neural network architecture to boost the performance of PINNs.
We propose a densely multiplied PINN (DM-PINN) architecture, which multiplies the output of a hidden layer with the outputs of all the hidden layers behind it.
arXiv Detail & Related papers (2024-02-06T20:45:31Z)
- Grad-Shafranov equilibria via data-free physics informed neural networks [0.0]
We show that PINNs can accurately and effectively solve the Grad-Shafranov equation with several different boundary conditions.
We introduce a parameterized PINN framework, expanding the input space to include variables such as pressure, aspect ratio, elongation, and triangularity.
arXiv Detail & Related papers (2023-11-22T16:08:38Z)
- PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks [22.39904196850583]
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs).
We introduce a novel Transformer-based framework, termed PINNsFormer, designed to address limitations of standard PINNs.
PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-21T18:06:27Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation (a generic Monte Carlo sketch of this estimator is given after this list).
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Finite Basis Physics-Informed Neural Networks (FBPINNs): a scalable domain decomposition approach for solving differential equations [20.277873724720987]
We propose a new, scalable approach for solving large problems relating to differential equations, called Finite Basis PINNs (FBPINNs).
FBPINNs are inspired by classical finite element methods, where the solution of the differential equation is expressed as the sum of a finite set of basis functions with compact support.
In FBPINNs, neural networks are used to learn these basis functions, which are defined over small, overlapping subdomain problems (a minimal 1-D sketch of this construction is given after this list).
arXiv Detail & Related papers (2021-07-16T13:03:47Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
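As referenced in the entry on learning PINNs without stacked back-propagation, the Gaussian-smoothing and Stein's-identity idea can be sketched generically: the derivatives of the smoothed model follow from sampled forward passes alone. The code below is a standard Monte Carlo version of that identity, with hypothetical names, not the paper's exact training procedure.

```python
# Generic Gaussian-smoothing / Stein's-identity estimator: if
# u_sigma(x) = E_{d ~ N(0, sigma^2 I)}[ net(x + d) ], then the gradient and
# Hessian of u_sigma follow from sampled forward passes only (no back-prop).
import torch

def smoothed_derivatives(net, x, sigma=0.1, n_samples=4096):
    """x: (d,) input point; returns Monte Carlo estimates of grad and Hessian."""
    d = x.shape[0]
    delta = sigma * torch.randn(n_samples, d)          # Gaussian perturbations
    with torch.no_grad():
        vals = net(x + delta).reshape(n_samples, 1)    # forward passes only
    grad = (delta * vals).mean(dim=0) / sigma**2
    outer = delta.unsqueeze(2) * delta.unsqueeze(1)    # (n, d, d) outer products
    hess = ((outer - sigma**2 * torch.eye(d)) * vals.unsqueeze(2)).mean(dim=0) / sigma**4
    return grad, hess
```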
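Likewise, the finite-basis construction referenced in the FBPINNs entry can be pictured in one dimension: the solution is written as a sum of small networks, each multiplied by a smooth window with compact support on one of several overlapping subdomains. The window shape and subdomain layout below are illustrative assumptions, not the paper's exact construction.

```python
# Minimal 1-D sketch of a finite-basis model: each small network is weighted by
# a smooth bump window supported on one overlapping subdomain, and the solution
# is the sum of the windowed network outputs.
import torch
import torch.nn as nn

class FiniteBasisModel(nn.Module):
    def __init__(self, centers, width):
        super().__init__()
        self.centers = torch.tensor(centers)   # subdomain centres
        self.width = width                     # subdomain half-width (overlapping)
        self.nets = nn.ModuleList([
            nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
            for _ in centers
        ])

    def window(self, x, c):
        # Smooth bump: ~1 near the centre, decaying to 0 at |x - c| = width.
        z = (x - c) / self.width
        return torch.clamp(torch.cos(0.5 * torch.pi * z), min=0.0) ** 2

    def forward(self, x):                      # x: (n, 1)
        out = torch.zeros_like(x)
        for c, net in zip(self.centers, self.nets):
            out = out + self.window(x, c) * net(x)
        return out

model = FiniteBasisModel(centers=[0.0, 0.5, 1.0], width=0.4)
u = model(torch.linspace(0, 1, 101).reshape(-1, 1))
```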
This list is automatically generated from the titles and abstracts of the papers on this site.