A Dimension-Augmented Physics-Informed Neural Network (DaPINN) with High Level Accuracy and Efficiency
- URL: http://arxiv.org/abs/2210.13212v1
- Date: Wed, 19 Oct 2022 15:54:37 GMT
- Title: A Dimension-Augmented Physics-Informed Neural Network (DaPINN) with High Level Accuracy and Efficiency
- Authors: Weilong Guan, Kaihan Yang, Yinsheng Chen, Zhong Guan
- Abstract summary: Physics-informed neural networks (PINNs) have been widely applied in different fields.
We propose a novel dimension-augmented physics-informed neural network (DaPINN).
DaPINN simultaneously and significantly improves the accuracy and efficiency of the PINN.
- Score: 0.20391237204597357
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) have been widely applied in
different fields due to their effectiveness in solving partial differential
equations (PDEs). However, the accuracy and efficiency of PINNs need to be
considerably improved for scientific and commercial use. To address this issue,
we systematically propose a novel dimension-augmented physics-informed neural
network (DaPINN), which simultaneously and significantly improves the accuracy
and efficiency of the PINN. In the DaPINN model, we introduce inductive bias in
the neural network to enhance network generalizability by adding a special
regularization term to the loss function. Furthermore, we manipulate the
network input dimension by inserting additional sample features and
incorporating the expanded dimensionality in the loss function. Moreover, we
verify the effectiveness of power series augmentation, Fourier series
augmentation and replica augmentation in both forward and backward problems.
In most experiments, the error of DaPINN is 1$\sim$2 orders of magnitude lower
than that of PINN. The results show that the DaPINN outperforms the original
PINN in terms of both accuracy and efficiency with a reduced dependence on the
number of sample points. We also discuss the complexity of the DaPINN and its
compatibility with other methods.
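To make the dimension-augmentation idea concrete, the following is a minimal PyTorch sketch of power-series input augmentation for a plain PINN (not the authors' implementation): the 1D Poisson test problem, the network sizes and the way the extra features enter the loss are illustrative assumptions, and derivatives are simply taken with respect to the original coordinate through the augmentation. The replica augmentation and the special regularization term mentioned in the abstract are omitted.

    import torch
    import torch.nn as nn

    def augment(x):
        # power-series augmentation; a Fourier variant would append
        # sin(k*pi*x) and cos(k*pi*x) features instead
        return torch.cat([x, x**2, x**3], dim=1)

    net = nn.Sequential(nn.Linear(3, 32), nn.Tanh(),
                        nn.Linear(32, 32), nn.Tanh(),
                        nn.Linear(32, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)

    for step in range(5000):
        # collocation points for u''(x) = -pi^2 sin(pi x) on (0, 1), u(0) = u(1) = 0
        x = torch.rand(128, 1, requires_grad=True)
        u = net(augment(x))
        du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
        d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
        pde_residual = d2u + torch.pi**2 * torch.sin(torch.pi * x)

        xb = torch.tensor([[0.0], [1.0]])          # boundary points, target u = 0
        bc_residual = net(augment(xb))

        loss = pde_residual.pow(2).mean() + bc_residual.pow(2).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()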
Related papers
- Densely Multiplied Physics Informed Neural Networks [1.8554335256160261]
Physics-informed neural networks (PINNs) have shown great potential in dealing with nonlinear partial differential equations (PDEs).
This paper improves the neural network architecture to boost the performance of the PINN.
We propose a densely multiplied PINN (DM-PINN) architecture, which multiplies the output of a hidden layer with the outputs of all subsequent hidden layers.
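As a rough illustration only, one plausible reading of the densely multiplied connections is sketched below in PyTorch (the exact wiring in DM-PINN may differ; layer widths and activation are assumptions):

    import torch
    import torch.nn as nn

    class DenselyMultipliedMLP(nn.Module):
        def __init__(self, in_dim=2, width=50, depth=4, out_dim=1):
            super().__init__()
            self.inp = nn.Linear(in_dim, width)
            self.hidden = nn.ModuleList([nn.Linear(width, width) for _ in range(depth)])
            self.out = nn.Linear(width, out_dim)
            self.act = nn.Tanh()

        def forward(self, x):
            h = self.act(self.inp(x))
            prod = h                      # running product of earlier hidden outputs
            for layer in self.hidden:
                raw = self.act(layer(h))  # this hidden layer's own output
                h = raw * prod            # multiplied with the outputs of all earlier layers
                prod = prod * raw         # extend the running product
            return self.out(h)

    # u = DenselyMultipliedMLP()(torch.rand(64, 2))  # example forward pass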
arXiv Detail & Related papers (2024-02-06T20:45:31Z)
- Artificial to Spiking Neural Networks Conversion for Scientific Machine Learning [24.799635365988905]
We introduce a method to convert Physics-Informed Neural Networks (PINNs) to Spiking Neural Networks (SNNs).
SNNs are expected to have higher energy efficiency than traditional Artificial Neural Networks (ANNs).
arXiv Detail & Related papers (2023-08-31T00:21:27Z)
- Benign Overfitting in Deep Neural Networks under Lazy Training [72.28294823115502]
We show that when the data distribution is well-separated, DNNs can achieve Bayes-optimal test error for classification.
Our results indicate that interpolating with smoother functions leads to better generalization.
arXiv Detail & Related papers (2023-05-30T19:37:44Z)
- MRF-PINN: A Multi-Receptive-Field convolutional physics-informed neural network for solving partial differential equations [6.285167805465505]
Physics-informed neural networks (PINNs) can achieve lower development and solving costs than traditional partial differential equation (PDE) solvers.
Owing to their advantages in parameter sharing, spatial feature extraction and low inference cost, convolutional neural networks (CNNs) are increasingly used in PINNs.
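As an illustration of the multi-receptive-field idea (not the MRF-PINN authors' code), a block with parallel convolutions of different kernel sizes might look like the sketch below; the channel count and kernel sizes are assumptions:

    import torch
    import torch.nn as nn

    class MultiReceptiveFieldBlock(nn.Module):
        def __init__(self, channels=16, kernel_sizes=(1, 3, 5)):
            super().__init__()
            # parallel branches with different receptive fields
            self.branches = nn.ModuleList([
                nn.Conv2d(channels, channels, kernel_size=k, padding=k // 2)
                for k in kernel_sizes
            ])
            self.act = nn.Tanh()

        def forward(self, x):
            # sum the branch outputs so small- and large-scale features mix
            return self.act(sum(branch(x) for branch in self.branches))

    # y = MultiReceptiveFieldBlock()(torch.rand(1, 16, 64, 64))  # example forward pass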
arXiv Detail & Related papers (2022-09-06T12:26:22Z)
- Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
However, they often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z)
- Enforcing Continuous Physical Symmetries in Deep Learning Network for Solving Partial Differential Equations [3.6317085868198467]
We introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries of PDEs are embedded into the loss function of the PINN.
We show that SPINN performs better than PINN with fewer training points and a simpler neural network architecture.
arXiv Detail & Related papers (2022-06-19T00:44:22Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
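A minimal sketch of a residual-based point-weighting rule is given below (the paper's exact AdaBoost-inspired update may differ; the update rate eta and the normalization are assumptions):

    import torch

    def update_point_weights(weights, residuals, eta=0.5):
        # upweight collocation points with large PDE residuals, then renormalize
        with torch.no_grad():
            scores = residuals.abs() / (residuals.abs().mean() + 1e-12)
            weights = weights * torch.exp(eta * scores)
            return weights / weights.sum()

    # the weighted PDE loss for the next epoch would then be
    # loss = (weights * residuals.pow(2)).sum()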
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
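The Gaussian-smoothing idea can be sketched with a small Monte Carlo estimator (the paper's estimator, sampling scheme and variance reduction may differ): for u_sigma(x) = E[u(x + delta)] with delta ~ N(0, sigma^2 I), Stein's identity gives the Laplacian of the smoothed model without back-propagation.

    import torch

    def laplacian_without_backprop(u, x, sigma=0.1, n_samples=4096):
        # x: tensor of shape (d,); u: callable mapping (n, d) -> (n,) or (n, 1)
        # Lap u_sigma(x) = E[ (||delta||^2 - d*sigma^2) / sigma^4 * u(x + delta) ]
        d = x.shape[-1]
        delta = sigma * torch.randn(n_samples, d)
        values = u(x.unsqueeze(0) + delta).reshape(n_samples)
        weights = (delta.pow(2).sum(dim=-1) - d * sigma**2) / sigma**4
        return (weights * values).mean()

    # sanity check: u(x) = ||x||^2 has Laplacian 2*d everywhere
    # est = laplacian_without_backprop(lambda z: z.pow(2).sum(dim=-1), torch.zeros(3))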
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Physics-informed Neural Network for Nonlinear Dynamics in Fiber Optics [10.335960060544904]
A physics-informed neural network (PINN) that combines deep learning with physics is studied to solve the nonlinear Schrödinger equation and learn nonlinear dynamics in fiber optics.
The PINN is not only an effective partial differential equation solver, but also a prospective technique for advancing scientific computing and automatic modeling in fiber optics.
arXiv Detail & Related papers (2021-09-01T12:19:32Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)