A nonlocal physics-informed deep learning framework using the
peridynamic differential operator
- URL: http://arxiv.org/abs/2006.00446v1
- Date: Sun, 31 May 2020 06:26:21 GMT
- Title: A nonlocal physics-informed deep learning framework using the
peridynamic differential operator
- Authors: Ehsan Haghighat, Ali Can Bekar, Erdogan Madenci, Ruben Juanes
- Abstract summary: We develop a nonlocal PINN approach using the Peridynamic Differential Operator (PDDO)---a numerical method which incorporates long-range interactions and removes spatial derivatives in the governing equations.
Because the PDDO functions can be readily incorporated in the neural network architecture, the nonlocality does not degrade the performance of modern deep-learning algorithms.
We document the superior behavior of nonlocal PINN with respect to local PINN in both solution accuracy and parameter inference.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The Physics-Informed Neural Network (PINN) framework introduced recently
incorporates physics into deep learning, and offers a promising avenue for the
solution of partial differential equations (PDEs) as well as identification of
the equation parameters. The performance of existing PINN approaches, however,
may degrade in the presence of sharp gradients, as a result of the inability of
the network to capture the solution behavior globally. We posit that this
shortcoming may be remedied by introducing long-range (nonlocal) interactions
into the network's input, in addition to the short-range (local) space and time
variables. Following this ansatz, here we develop a nonlocal PINN approach
using the Peridynamic Differential Operator (PDDO)---a numerical method which
incorporates long-range interactions and removes spatial derivatives in the
governing equations. Because the PDDO functions can be readily incorporated in
the neural network architecture, the nonlocality does not degrade the
performance of modern deep-learning algorithms. We apply nonlocal PDDO-PINN to
the solution and identification of material parameters in solid mechanics and,
specifically, to elastoplastic deformation in a domain subjected to indentation
by a rigid punch, for which the mixed displacement--traction boundary condition
leads to localized deformation and sharp gradients in the solution. We document
the superior behavior of nonlocal PINN with respect to local PINN in both
solution accuracy and parameter inference, illustrating its potential for
simulation and discovery of partial differential equations whose solution
develops sharp gradients.
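
The key architectural move is that spatial derivatives in the PDE residual are evaluated with precomputed nonlocal PDDO sums over each point's family of neighbors rather than with automatic differentiation. Below is a minimal sketch of this idea for a 1D Poisson problem; it is not the authors' implementation, and the grid, the horizon of three grid spacings, the Gaussian influence function, the network size, and the training settings are all illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn

def pddo_weights(x, horizon):
    """Return matrices G1, G2 with u'(x_i) ~ (G1 @ u)_i and u''(x_i) ~ (G2 @ u)_i."""
    n, dx = len(x), x[1] - x[0]
    G1, G2 = np.zeros((n, n)), np.zeros((n, n))
    for i in range(n):
        fam = np.where(np.abs(x - x[i]) <= horizon + 1e-12)[0]   # family of point x_i
        xi = x[fam] - x[i]                                       # relative positions
        w = np.exp(-((2.0 * xi / horizon) ** 2))                 # Gaussian influence function
        P = np.vstack([xi ** q for q in range(3)])               # monomials 1, xi, xi^2
        A = (P * (w * dx)) @ P.T                                 # moment matrix A_pq
        b = np.diag([1.0, 1.0, 2.0])                             # right-hand side n! * delta_np
        a = np.linalg.solve(A, b)                                # PDDO coefficients
        g = a.T @ (P * w)                                        # g^p evaluated on the family
        G1[i, fam] = g[1] * dx                                   # first-derivative weights
        G2[i, fam] = g[2] * dx                                   # second-derivative weights
    return G1, G2

# Collocation grid and PDDO operators; horizon = 3 grid spacings (illustrative choice).
n_pts = 101
x = np.linspace(0.0, 1.0, n_pts)
G1, G2 = pddo_weights(x, horizon=3.0 * (x[1] - x[0]))

# Quick operator check on a known function: (d^2/dx^2) sin(pi x) = -pi^2 sin(pi x).
print("PDDO check:", np.max(np.abs(G2 @ np.sin(np.pi * x) + np.pi ** 2 * np.sin(np.pi * x))))

# Nonlocal "PINN": the residual of u''(x) = f(x) is formed by applying the fixed PDDO
# matrix G2 to the network outputs at all family points, so no automatic differentiation
# with respect to x is needed.
torch.manual_seed(0)
xt = torch.tensor(x, dtype=torch.float32).unsqueeze(-1)
f = torch.tensor(-np.pi ** 2 * np.sin(np.pi * x), dtype=torch.float32)   # manufactured source
G2t = torch.tensor(G2, dtype=torch.float32)
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    opt.zero_grad()
    u = net(xt).squeeze(-1)
    pde_residual = G2t @ u - f                                    # nonlocal PDE residual
    loss = pde_residual.pow(2).mean() + u[0] ** 2 + u[-1] ** 2    # residual + Dirichlet BCs u(0)=u(1)=0
    loss.backward()
    opt.step()

u_pred = net(xt).squeeze(-1).detach().numpy()
print("max |u_pred - sin(pi x)| =", np.max(np.abs(u_pred - np.sin(np.pi * x))))
```

Because the PDDO weight matrices are assembled once and then applied as fixed linear operators inside the loss, the nonlocal coupling adds little cost to the training loop, which is consistent with the abstract's remark that nonlocality does not degrade the performance of modern deep-learning algorithms.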
Related papers
- An efficient wavelet-based physics-informed neural networks for singularly perturbed problems [0.0]
Physics-informed neural networks (PINNs) are a class of deep learning models that incorporate physical laws expressed as differential equations.
We present an efficient wavelet-based PINNs model to solve singularly perturbed differential equations.
The architecture allows the training process to search for a solution within wavelet space, making the process faster and more accurate.
arXiv Detail & Related papers (2024-09-18T10:01:37Z)
- Grad-Shafranov equilibria via data-free physics informed neural networks [0.0]
We show that PINNs can accurately and effectively solve the Grad-Shafranov equation with several different boundary conditions.
We introduce a parameterized PINN framework, expanding the input space to include variables such as pressure, aspect ratio, elongation, and triangularity.
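
As a small illustration of the parameterized input space described above, consider the sketch below; it is not the paper's code, and the parameter set, network size, and tensor shapes are placeholder assumptions.

```python
# Illustrative sketch (assumptions mine, not the paper's code) of a "parameterized PINN":
# the network input concatenates spatial coordinates with equilibrium parameters, so one
# trained model covers a family of Grad-Shafranov solutions.
import torch
import torch.nn as nn

class ParameterizedPINN(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # inputs: (R, Z) coordinates + (pressure factor, aspect ratio, elongation, triangularity)
        self.net = nn.Sequential(
            nn.Linear(6, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),          # output: poloidal flux psi(R, Z; params)
        )

    def forward(self, coords, params):
        return self.net(torch.cat([coords, params], dim=-1))

model = ParameterizedPINN()
psi = model(torch.rand(8, 2), torch.rand(8, 4))   # batch of 8 collocation points
print(psi.shape)                                  # torch.Size([8, 1])
```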
arXiv Detail & Related papers (2023-11-22T16:08:38Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown to be effective in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ an implicit stochastic gradient descent (ISGD) method to train PINNs and improve the stability of the training process.
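
For context, the generic implicit (proximal) SGD step has the standard form below; the paper's PINN-specific variant may differ in detail.

\[
\theta_{k+1} = \theta_k - \eta\,\nabla L(\theta_{k+1}),
\qquad\text{which is the stationarity condition of}\qquad
\theta_{k+1} = \arg\min_{\theta}\Big\{L(\theta) + \tfrac{1}{2\eta}\,\lVert\theta-\theta_k\rVert^2\Big\},
\]

so the gradient is evaluated at the new iterate rather than the current one, replacing the explicit gradient step with a small proximal subproblem at each iteration.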
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- A mixed formulation for physics-informed neural networks as a potential solver for engineering problems in heterogeneous domains: comparison with finite element method [0.0]
Physics-informed neural networks (PINNs) are capable of finding the solution for a given boundary value problem.
We employ several ideas from the finite element method (FEM) to enhance the performance of existing PINNs in engineering problems.
arXiv Detail & Related papers (2022-06-27T08:18:08Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution with a Gaussian-smoothed model and show that, via Stein's identity, its second-order derivatives can be calculated efficiently without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
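
The Gaussian-smoothing identity behind this claim can be checked numerically in a few lines. The sketch below is not the paper's code: the smoothing scale, sample count, and test function are arbitrary choices, and the paper's actual estimator and variance-reduction details may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n_samples, x0 = 0.5, 200_000, 1.0
f = np.sin                                    # stand-in for a network forward pass

# If u_sigma(x) = E[f(x + d)] with d ~ N(0, sigma^2), the Gaussian Stein identity gives
# u_sigma''(x) = E[ (d^2 - sigma^2) / sigma^4 * f(x + d) ],
# so the second derivative needs only forward evaluations of f (no back-propagation).
d = rng.normal(0.0, sigma, n_samples)
stein_estimate = np.mean((d ** 2 - sigma ** 2) / sigma ** 4 * f(x0 + d))

# Closed form for this test function: E[sin(x + d)] = exp(-sigma^2/2) * sin(x),
# hence u_sigma''(x) = -exp(-sigma^2/2) * sin(x).
reference = -np.exp(-sigma ** 2 / 2) * np.sin(x0)
print(f"Stein estimate: {stein_estimate:.4f}   closed form: {reference:.4f}")
```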
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Robust Learning of Physics Informed Neural Networks [2.86989372262348]
Physics-informed Neural Networks (PINNs) have been shown to be effective in solving partial differential equations.
This paper shows that a PINN can be sensitive to errors in the training data and can overfit to them, dynamically propagating these errors over the solution domain of the PDE.
arXiv Detail & Related papers (2021-10-26T00:10:57Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- Physics-Informed Neural Network Method for Solving One-Dimensional Advection Equation Using PyTorch [0.0]
The PINN approach allows training neural networks while respecting the PDEs as a strong constraint in the optimization.
In standard small-scale circulation simulations, it is shown that the conventional approach incorporates a pseudo-diffusive effect that is almost as large as the effect of the turbulent diffusion model.
Of all the schemes tested, only the PINN approximation accurately predicted the outcome.
arXiv Detail & Related papers (2021-03-15T05:39:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.