Error analysis for physics informed neural networks (PINNs)
approximating Kolmogorov PDEs
- URL: http://arxiv.org/abs/2106.14473v1
- Date: Mon, 28 Jun 2021 08:37:56 GMT
- Title: Error analysis for physics informed neural networks (PINNs)
approximating Kolmogorov PDEs
- Authors: Tim De Ryck and Siddhartha Mishra
- Abstract summary: We derive rigorous bounds on the error incurred by PINNs in approximating the solutions of a large class of parabolic PDEs.
We construct neural networks whose PINN residual (generalization error) can be made as small as desired.
These results enable us to provide a comprehensive error analysis for PINNs in approximating Kolmogorov PDEs.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) approximate solutions of PDEs by minimizing
pointwise residuals. We derive rigorous bounds on the error incurred by PINNs
in approximating the solutions of a large class of linear parabolic PDEs,
namely Kolmogorov equations, which include the heat equation and the
Black-Scholes equation of option pricing as examples. We construct neural networks whose
PINN residual (generalization error) can be made as small as desired. We also
prove that the total $L^2$-error can be bounded by the generalization error,
which in turn is bounded in terms of the training error, provided that a
sufficient number of randomly chosen training (collocation) points is used.
Moreover, we prove that the size of the PINNs and the number of training
samples only grow polynomially with the underlying dimension, enabling PINNs to
overcome the curse of dimensionality in this context. These results enable us
to provide a comprehensive error analysis for PINNs in approximating Kolmogorov
PDEs.
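To fix ideas, the linear Kolmogorov equations covered by this analysis can be written in the standard form (stated here for orientation; $\mu$ and $\sigma$ are the drift and diffusion coefficients, and $\mu = 0$, $\sigma = \sqrt{2}\,I$ recovers the heat equation):

$$\partial_t u(t,x) = \tfrac{1}{2}\,\mathrm{Tr}\!\big(\sigma(x)\sigma(x)^\top \nabla_x^2 u(t,x)\big) + \mu(x)\cdot\nabla_x u(t,x), \qquad u(0,x) = \varphi(x).$$

The sketch below illustrates the training procedure the abstract refers to, namely minimizing pointwise PDE residuals at randomly chosen collocation points, for the simplest member of this class, the 1D heat equation $u_t = u_{xx}$. It is a minimal illustration assuming PyTorch; the architecture, sampling scheme, and optimizer settings are arbitrary choices, not the construction analyzed in the paper.

```python
# Minimal PINN sketch for the 1D heat equation u_t = u_xx on (0,1) x (0,1),
# trained by minimizing pointwise residuals at random collocation points.
# (Illustrative only: network size, sample counts, and optimizer are assumptions.)
import math
import torch

torch.manual_seed(0)

# Small fully connected tanh network u_theta(t, x).
net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)

def pde_residual(tx):
    """Pointwise PDE residual u_t - u_xx, computed with automatic differentiation."""
    tx = tx.requires_grad_(True)
    u = net(tx)
    grads = torch.autograd.grad(u.sum(), tx, create_graph=True)[0]
    u_t, u_x = grads[:, 0:1], grads[:, 1:2]
    u_xx = torch.autograd.grad(u_x.sum(), tx, create_graph=True)[0][:, 1:2]
    return u_t - u_xx

def initial_datum(x):
    # Illustrative initial condition; the exact solution is exp(-pi^2 t) sin(pi x).
    return torch.sin(math.pi * x)

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(5000):
    # Randomly chosen collocation points: interior, initial time, and spatial boundary.
    tx_int = torch.rand(256, 2)                              # (t, x) in (0,1)^2
    x0 = torch.rand(64, 1)
    tx_ic = torch.cat([torch.zeros_like(x0), x0], dim=1)     # t = 0
    tb = torch.rand(64, 1)
    tx_bc = torch.cat([torch.cat([tb, torch.zeros_like(tb)], dim=1),
                       torch.cat([tb, torch.ones_like(tb)], dim=1)])  # x = 0 and x = 1
    # Training loss: Monte Carlo estimate of the squared PINN residuals.
    loss = (pde_residual(tx_int) ** 2).mean() \
         + ((net(tx_ic) - initial_datum(x0)) ** 2).mean() \
         + (net(tx_bc) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this notation, the quantity being minimized is the training error of the abstract: the paper's bounds relate it, for sufficiently many random collocation points, to the generalization error and hence to the total $L^2$-error of the trained network.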
Related papers
- RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs)
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z) - Solving PDEs on Spheres with Physics-Informed Convolutional Neural Networks [17.69666422395703]
Physics-informed neural networks (PINNs) have been demonstrated to be efficient in solving partial differential equations (PDEs)
In this paper, we establish rigorous analysis of the physics-informed convolutional neural network (PICNN) for solving PDEs on the sphere.
arXiv Detail & Related papers (2023-08-18T14:58:23Z) - Physics-Informed Neural Network Method for Parabolic Differential
Equations with Sharply Perturbed Initial Conditions [68.8204255655161]
We develop a physics-informed neural network (PINN) model for parabolic problems with a sharply perturbed initial condition.
Localized large gradients in the advection-dispersion equation (ADE) solution make the Latin hypercube sampling of the equation's residual (common in PINN methods) highly inefficient.
We propose criteria for weights in the loss function that produce a more accurate PINN solution than those obtained with the weights selected via other methods.
arXiv Detail & Related papers (2022-08-18T05:00:24Z) - Physics-Aware Neural Networks for Boundary Layer Linear Problems [0.0]
Physics-Informed Neural Networks (PINNs) approximate the solution of general partial differential equations (PDEs) by adding the governing equations, in some form, as terms of the loss/cost function of a neural network.
This paper explores PINNs for linear PDEs whose solutions may present one or more boundary layers.
arXiv Detail & Related papers (2022-07-15T21:15:06Z) - Enforcing Continuous Physical Symmetries in Deep Learning Network for
Solving Partial Differential Equations [3.6317085868198467]
We introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries of PDEs are embedded into the loss function of the PINN.
We show that SPINN performs better than PINN with fewer training points and a simpler neural network architecture.
arXiv Detail & Related papers (2022-06-19T00:44:22Z) - Improved Training of Physics-Informed Neural Networks with Model
Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be efficiently calculated without back-propagation (a minimal sketch of this estimator is given after this list).
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that partially alleviates the large data requirements of neural PDE solvers by improving their sample complexity.
In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
The physics-informed neural operator (PINO) is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the solution operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Solving PDEs on Unknown Manifolds with Machine Learning [8.220217498103315]
This paper presents a mesh-free computational framework and machine learning theory for solving elliptic PDEs on unknown manifolds.
We show that the proposed NN solver can robustly generalize the PDE solution on new data points, with generalization errors that are almost identical to the training errors.
arXiv Detail & Related papers (2021-06-12T03:55:15Z) - Parametric Complexity Bounds for Approximating PDEs with Neural Networks [41.46028070204925]
We prove that when a PDE's coefficients are representable by small neural networks, the number of parameters required to approximate its solution scales polynomially with the input dimension $d$ and is proportional to the parameter counts of the coefficient networks.
Our proof is based on constructing a neural network that simulates gradient descent in an appropriate space and converges to the solution of the PDE.
arXiv Detail & Related papers (2021-03-03T02:42:57Z)
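As a brief aside on the "Learning Physics-Informed Neural Networks without Stacked Back-propagation" entry above: the sketch below shows the basic Stein's-identity estimator for derivatives of a Gaussian-smoothed model, the mechanism that summary alludes to. It is a minimal illustration under standard Gaussian-smoothing assumptions, not the paper's exact estimator (which adds further variance reduction); all names and hyperparameters here are hypothetical.

```python
# Stein's identity sketch: for g(x) = E[f(x + delta)] with delta ~ N(0, sigma^2 I),
# the gradient and Hessian of g can be estimated from forward evaluations of f alone,
# i.e. without building a second-order (stacked) back-propagation graph.
import torch

def smoothed_derivatives(f, x, sigma=0.1, n_samples=4096):
    """Monte Carlo estimates of grad g(x) and the Hessian of g at a point x of shape (d,)."""
    d = x.shape[0]
    delta = sigma * torch.randn(n_samples, d)            # Gaussian perturbations
    # Subtracting f(x) is a simple control variate: it leaves both expectations unchanged
    # (E[delta] = 0 and E[delta delta^T - sigma^2 I] = 0) but reduces the variance.
    fvals = (f(x + delta) - f(x)).reshape(n_samples, 1)
    grad = (delta * fvals).mean(0) / sigma**2            # E[delta f(x+delta)] / sigma^2
    outer = delta.unsqueeze(2) * delta.unsqueeze(1)      # delta delta^T, shape (n, d, d)
    hess = ((outer - sigma**2 * torch.eye(d)) * fvals.unsqueeze(2)).mean(0) / sigma**4
    return grad, hess

# Toy check: f(x) = ||x||^2 has gradient 2x and Hessian 2I, and Gaussian smoothing
# preserves both exactly, so the estimates should match up to Monte Carlo noise.
x = torch.tensor([0.5, -0.3])
grad_est, hess_est = smoothed_derivatives(lambda z: (z ** 2).sum(dim=-1), x)
```

Because only forward evaluations of `f` enter these estimators, the second-order derivatives needed for a PDE residual can be obtained without differentiating through the network twice.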
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.