Accelerating Physics-Informed Neural Network Training with Prior
Dictionaries
- URL: http://arxiv.org/abs/2004.08151v2
- Date: Fri, 29 May 2020 02:10:56 GMT
- Title: Accelerating Physics-Informed Neural Network Training with Prior
Dictionaries
- Authors: Wei Peng, Weien Zhou, Jun Zhang, Wen Yao
- Abstract summary: We propose a variant called Prior Dictionary based Physics-Informed Neural Networks (PD-PINNs).
Equipped with task-dependent dictionaries, PD-PINNs enjoy enhanced representation power, which helps them capture the features encoded in the dictionaries.
It is proved that, under certain mild conditions, the prediction error of the neural networks can be bounded by the expected loss of the PDE and the boundary conditions.
- Score: 7.035456567972667
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-Informed Neural Networks (PINNs) can be regarded as
general-purpose PDE solvers, but training PINNs on particular problems can be
slow, and there is no theoretical guarantee on the corresponding error bounds.
In this manuscript, we propose a variant called Prior Dictionary based
Physics-Informed Neural Networks (PD-PINNs). Equipped with task-dependent
dictionaries, PD-PINNs enjoy enhanced representation power on the tasks, which
helps them capture the features provided by the dictionaries, so that the
proposed networks converge faster during training. In various numerical
simulations, combining prior dictionaries significantly improves convergence
speed compared with existing PINN methods. On the theoretical side, we obtain
error bounds applicable to both PINNs and PD-PINNs for solving second-order
elliptic partial differential equations. It is proved that, under certain mild
conditions, the prediction error of the neural networks can be bounded by the
expected loss of the PDE and the boundary conditions.
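The abstract does not spell out how the dictionary enters the network, so the
following is a minimal PyTorch sketch of one plausible reading: the prediction
is the sum of a small MLP and a trainable linear combination of user-supplied
dictionary functions, trained with the usual PINN residual-plus-boundary loss
on a toy 1D Poisson problem. The sine dictionary and the problem setup are
illustrative assumptions, not the authors' exact construction.

    # Minimal PD-PINN-style sketch (assumption: dictionary terms enter the
    # output through trainable coefficients; not the paper's exact design).
    import torch
    import torch.nn as nn

    # Illustrative prior dictionary: a few sine modes on [0, 1].
    dictionary = [lambda x, k=k: torch.sin(k * torch.pi * x) for k in (1, 2, 3)]

    class PDPINN(nn.Module):
        def __init__(self, dictionary, width=32):
            super().__init__()
            self.dictionary = dictionary
            self.coeffs = nn.Parameter(torch.zeros(len(dictionary)))
            self.mlp = nn.Sequential(
                nn.Linear(1, width), nn.Tanh(),
                nn.Linear(width, width), nn.Tanh(),
                nn.Linear(width, 1),
            )

        def forward(self, x):
            prior = sum(c * phi(x) for c, phi in zip(self.coeffs, self.dictionary))
            return self.mlp(x) + prior

    # PINN loss for -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0.
    def pinn_loss(model, f):
        x = torch.rand(128, 1, requires_grad=True)
        u = model(x)
        du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
        d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
        residual = (-d2u - f(x)).pow(2).mean()
        boundary = model(torch.tensor([[0.0], [1.0]])).pow(2).mean()
        return residual + boundary

    model = PDPINN(dictionary)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    f = lambda x: torch.pi ** 2 * torch.sin(torch.pi * x)  # exact u = sin(pi x)
    for step in range(2000):
        opt.zero_grad()
        loss = pinn_loss(model, f)
        loss.backward()
        opt.step()

If the dictionary happens to contain functions close to the true solution, the
trainable coefficients can pick them up early in training, which matches the
intuition the abstract gives for the faster convergence.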
Related papers
- Element-wise Multiplication Based Deeper Physics-Informed Neural Networks [1.8554335256160261]
PINNs are a promising framework for solving partial differential equations (PDEs).
Limited expressive ability and pathology issues are found to prevent the application of PINNs to complex PDEs.
We propose the Deeper Physics-Informed Neural Network (Deeper-PINN) to resolve these issues.
arXiv Detail & Related papers (2024-06-06T15:27:52Z)
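The entry above only names its key ingredient, element-wise multiplication of
features. The sketch below shows one common way such a block can be written
(two learned projections of the same input multiplied element-wise); it is an
illustration, not necessarily the Deeper-PINN block itself.

    # Sketch of an element-wise multiplication block for a PINN backbone
    # (illustrative; the exact Deeper-PINN architecture may differ).
    import torch
    import torch.nn as nn

    class MultiplicativeBlock(nn.Module):
        """Maps features into a nonlinear space via element-wise products."""
        def __init__(self, dim_in, dim_out):
            super().__init__()
            self.proj_a = nn.Linear(dim_in, dim_out)
            self.proj_b = nn.Linear(dim_in, dim_out)

        def forward(self, x):
            # Element-wise product of two learned projections of the same input.
            return torch.tanh(self.proj_a(x)) * torch.tanh(self.proj_b(x))

    class DeepMultiplicativeNet(nn.Module):
        """Stack of multiplicative blocks followed by a linear output head."""
        def __init__(self, in_dim=2, width=64, depth=4, out_dim=1):
            super().__init__()
            dims = [in_dim] + [width] * depth
            self.blocks = nn.ModuleList(
                MultiplicativeBlock(a, b) for a, b in zip(dims, dims[1:])
            )
            self.head = nn.Linear(width, out_dim)

        def forward(self, x):
            for block in self.blocks:
                x = block(x)
            return self.head(x)

    u = DeepMultiplicativeNet()(torch.rand(16, 2))  # output shape: (16, 1)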
- PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks [22.39904196850583]
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs).
We introduce a novel Transformer-based framework, termed PINNsFormer, designed to address this limitation.
PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-21T18:06:27Z)
- RBF-MGN: Solving spatiotemporal PDEs with Physics-informed Graph Neural Network [4.425915683879297]
We propose a novel framework based on graph neural networks (GNNs) and radial basis function finite difference (RBF-FD).
RBF-FD is used to construct a high-precision difference format of the differential equations to guide model training.
We illustrate the generalizability, accuracy, and efficiency of the proposed algorithms on different PDE parameters.
arXiv Detail & Related papers (2022-12-06T10:08:02Z)
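As a rough illustration of the RBF-FD ingredient in the entry above, the NumPy
sketch below computes finite-difference-style weights for the 2D Laplacian on a
scattered stencil using Gaussian RBFs. The Gaussian kernel, the shape parameter,
and the absence of polynomial augmentation are simplifying assumptions, and the
graph-network part of RBF-MGN is not shown.

    # Illustrative RBF-FD weights for the 2D Laplacian on a scattered stencil.
    import numpy as np

    def rbf_fd_laplacian_weights(center, stencil, eps=1.0):
        """Solve A w = b, where A_ij = phi(|x_i - x_j|) and b_i is the
        Laplacian of phi(|x - x_i|) evaluated at the stencil center."""
        pts = np.asarray(stencil)                     # (n, 2), includes center
        r2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
        A = np.exp(-(eps ** 2) * r2)                  # pairwise Gaussian RBF
        rc2 = np.sum((pts - center) ** 2, axis=-1)
        # Laplacian of exp(-eps^2 r^2) in 2D: (4 eps^4 r^2 - 4 eps^2) exp(-eps^2 r^2)
        b = (4 * eps ** 4 * rc2 - 4 * eps ** 2) * np.exp(-(eps ** 2) * rc2)
        return np.linalg.solve(A, b)

    # Usage: weights w with Laplacian(u)(center) ~= w @ u(stencil points).
    stencil = np.array([[0.0, 0.0], [0.1, 0.0], [-0.1, 0.0], [0.0, 0.1], [0.0, -0.1]])
    w = rbf_fd_laplacian_weights(np.array([0.0, 0.0]), stencil)
    u = stencil[:, 0] ** 2 + stencil[:, 1] ** 2       # u = x^2 + y^2, Laplacian = 4
    print(w @ u)                                      # close to 4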
- Replacing Automatic Differentiation by Sobolev Cubatures fastens Physics Informed Neural Nets and strengthens their Approximation Power [0.6091702876917279]
We present a novel class of approximations for variational losses that is applicable to the training of physics-informed neural nets (PINNs).
The loss computation rests on an extension of Gauss-Legendre cubatures, which we term Sobolev cubatures, replacing automatic differentiation (A.D.).
arXiv Detail & Related papers (2022-11-23T11:23:08Z)
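The Sobolev cubatures in the entry above are more elaborate than what follows;
the sketch below only illustrates the underlying idea that the loss integral can
be evaluated with Gauss-Legendre nodes and weights while derivatives at those
nodes come from a polynomial differentiation matrix rather than automatic
differentiation.

    # Gauss-Legendre quadrature of a PDE residual loss, with derivatives taken
    # by a polynomial (barycentric) differentiation matrix instead of autodiff.
    import numpy as np

    def differentiation_matrix(x):
        """Polynomial differentiation matrix for arbitrary nodes x."""
        n = len(x)
        w = np.array([1.0 / np.prod(x[j] - np.delete(x, j)) for j in range(n)])
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    D[i, j] = (w[j] / w[i]) / (x[i] - x[j])
        D -= np.diag(D.sum(axis=1))       # diagonal makes each row sum to zero
        return D

    nodes, weights = np.polynomial.legendre.leggauss(16)   # nodes on [-1, 1]
    D = differentiation_matrix(nodes)

    # Toy residual of -u'' = f with u(x) = sin(pi x), f(x) = pi^2 sin(pi x);
    # in a real PINN, u would be the network evaluated at the nodes.
    u = np.sin(np.pi * nodes)
    f = np.pi ** 2 * np.sin(np.pi * nodes)
    residual = -(D @ (D @ u)) - f

    loss = np.sum(weights * residual ** 2)    # quadrature of the squared residual
    print(loss)                               # small, limited by polynomial accuracy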
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Neural Basis Functions for Accelerating Solutions to High Mach Euler Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
arXiv Detail & Related papers (2022-08-02T18:27:13Z)
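A rough sketch of the pipeline summarized in the entry above, under two
assumptions not stated there: snapshot solutions are available to build a POD
basis, and a small branch-style network maps the PDE parameter directly to
basis coefficients. It is not the paper's exact architecture.

    # POD basis from snapshots plus a small network mapping a PDE parameter to
    # basis coefficients (illustrative reduced-order sketch).
    import numpy as np
    import torch
    import torch.nn as nn

    # Toy snapshot matrix: parametric solutions u(x; mu) on a 1D grid.
    x = np.linspace(0.0, 1.0, 101)
    mus = np.linspace(1.0, 3.0, 40)
    snapshots = np.stack([mu * np.sin(mu * np.pi * x) for mu in mus])  # (40, 101)

    # POD basis = leading left singular vectors of the (transposed) snapshots.
    U, _, _ = np.linalg.svd(snapshots.T, full_matrices=False)
    basis = torch.tensor(U[:, :8], dtype=torch.float32)               # (101, 8)

    # Branch-style network: PDE parameter -> coefficients of the POD basis.
    coeff_net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 8))

    params = torch.tensor(mus[:, None], dtype=torch.float32)          # (40, 1)
    targets = torch.tensor(snapshots, dtype=torch.float32)            # (40, 101)
    opt = torch.optim.Adam(coeff_net.parameters(), lr=1e-3)
    for step in range(2000):
        opt.zero_grad()
        pred = coeff_net(params) @ basis.T          # reduced-order reconstruction
        loss = ((pred - targets) ** 2).mean()
        loss.backward()
        opt.step()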
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the Adaboost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
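A minimal sketch of the point-weighting idea from the entry above, assuming an
Adaboost-like multiplicative update driven by the per-point PDE residual; the
exact update rule and normalization here are illustrative guesses, not the
paper's scheme.

    # Residual-based weighting of PINN collocation points (illustrative
    # Adaboost-like multiplicative update; not the paper's exact rule).
    import torch

    def update_point_weights(weights, residuals, eta=0.5):
        """Increase the weight of collocation points with large PDE residuals."""
        score = residuals.detach().abs()
        score = score / (score.mean() + 1e-12)         # normalize around 1
        weights = weights * torch.exp(eta * (score - 1.0))
        return weights / weights.sum()                  # keep a distribution

    weights = torch.full((1024,), 1.0 / 1024)           # start from uniform weights
    # Inside a training loop (model, pde_residual, and x assumed to exist):
    #     r = pde_residual(model, x)                    # per-point residual, shape (1024,)
    #     weights = update_point_weights(weights, r)
    #     loss = (weights * r ** 2).sum() + boundary_loss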
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
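The entry above is concrete enough to sketch: for a Gaussian-smoothed model
f_sigma(x) = E[f(x + delta)] with delta ~ N(0, sigma^2 I), a Stein-type identity
gives the Laplacian as E[f(x + delta) * (||delta||^2 - d * sigma^2) / sigma^4],
so second derivatives can be estimated from forward evaluations alone. Below is
a plain Monte-Carlo version of that estimator, not the paper's full training
scheme.

    # Estimate the Laplacian of a Gaussian-smoothed function via a Stein-type
    # identity: only forward evaluations are needed, no back-propagation.
    import torch

    def smoothed_laplacian(f, x, sigma=0.1, samples=16384):
        """Monte-Carlo estimate of the Laplacian of f_sigma at x, where
        f_sigma(x) = E[f(x + delta)] and delta ~ N(0, sigma^2 I)."""
        d = x.numel()
        delta = sigma * torch.randn(samples, d)
        weight = (delta.pow(2).sum(dim=1) - d * sigma ** 2) / sigma ** 4
        # Subtracting f(x) leaves the expectation unchanged (the weight has
        # zero mean) but reduces the Monte-Carlo variance.
        return ((f(x + delta) - f(x)) * weight).mean()

    # Check on f(x) = ||x||^2, whose Laplacian is 2 d = 6 in three dimensions
    # (Gaussian smoothing of a quadratic does not change its Laplacian).
    f = lambda z: z.pow(2).sum(dim=-1)
    x = torch.tensor([0.3, -0.2, 0.5])
    print(smoothed_laplacian(f, x))     # roughly 6, up to Monte-Carlo noise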
- Scientific Machine Learning through Physics-Informed Neural Networks: Where we are and What's next [5.956366179544257]
Physics-Informed Neural Networks (PINNs) are neural networks (NNs) that encode model equations.
PINNs are nowadays used to solve PDEs, fractional equations, and integro-differential equations.
arXiv Detail & Related papers (2022-01-14T19:05:44Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)