Preconditioning for Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2402.00531v1
- Date: Thu, 1 Feb 2024 11:58:28 GMT
- Title: Preconditioning for Physics-Informed Neural Networks
- Authors: Songming Liu, Chang Su, Jiachen Yao, Zhongkai Hao, Hang Su, Youjia Wu,
Jun Zhu
- Abstract summary: We propose to use the condition number as a metric to diagnose and mitigate pathologies in PINNs.
We prove theorems revealing how the condition number relates to both the error control and the convergence of PINNs.
We present an algorithm that leverages preconditioning to improve the condition number.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) have shown promise in solving
various partial differential equations (PDEs). However, training pathologies
have negatively affected the convergence and prediction accuracy of PINNs,
which further limits their practical applications. In this paper, we propose to
use the condition number as a metric to diagnose and mitigate pathologies in
PINNs. Inspired by classical numerical analysis, where the condition number
measures sensitivity and stability, we highlight its pivotal role in the
training dynamics of PINNs. We prove theorems revealing how the condition
number relates to both the error control and the convergence of PINNs.
Subsequently, we present an algorithm that leverages preconditioning to improve
the condition number. Evaluations on 18 PDE problems showcase the superior
performance of our method. Notably, in 7 of these problems, our method reduces
errors by an order of magnitude. These empirical findings verify the critical
role of the condition number in PINN training.
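To make the core idea concrete, here is a minimal numerical sketch (not the paper's actual algorithm): a badly row-scaled linear system stands in for an ill-conditioned discretized PDE operator, and a simple Jacobi (diagonal) preconditioner restores a moderate condition number. All names and scales below are illustrative assumptions.

```python
import numpy as np

n = 8
rng = np.random.default_rng(0)

# A reasonably conditioned, diagonally dominant base matrix.
base = rng.standard_normal((n, n)) + n * np.eye(n)

# Row scales spanning six orders of magnitude wreck the conditioning.
scales = np.logspace(0, 6, n)
A = np.diag(scales) @ base

kappa_raw = np.linalg.cond(A)

# Jacobi preconditioner M = diag(A); applying M^-1 undoes the row scaling.
M_inv = np.diag(1.0 / np.diag(A))
kappa_pre = np.linalg.cond(M_inv @ A)

print(f"cond(A)      = {kappa_raw:.3e}")
print(f"cond(M^-1 A) = {kappa_pre:.3e}")
```

The preconditioned system has a condition number orders of magnitude smaller, which is the mechanism the paper exploits: a gradient-based solver applied to the preconditioned problem converges far more reliably than on the raw one.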
Related papers
- Numerical analysis of physics-informed neural networks and related
models in physics-informed machine learning
Physics-informed neural networks (PINNs) have been very popular in recent years as algorithms for the numerical simulation of both forward and inverse problems for partial differential equations.
We provide a unified framework in which analysis of the various components of the error incurred by PINNs in approximating PDEs can be effectively carried out.
arXiv Detail & Related papers (2024-01-30T10:43:27Z)
- Correcting model misspecification in physics-informed neural networks (PINNs)
We present a general approach to correct the misspecified physical models in PINNs for discovering governing equations.
We employ other deep neural networks (DNNs) to model the discrepancy between the imperfect models and the observational data.
We envision that the proposed approach will extend the applications of PINNs for discovering governing equations in problems where the physico-chemical or biological processes are not well understood.
arXiv Detail & Related papers (2023-10-16T19:25:52Z)
- Neural tangent kernel analysis of PINN for advection-diffusion equation
Physics-informed neural networks (PINNs) numerically approximate the solution of a partial differential equation (PDE).
PINNs are known to struggle even in simple cases where the closed-form analytical solution is available.
This work focuses on a systematic analysis of PINNs for the linear advection-diffusion equation (LAD) using the Neural Tangent Kernel (NTK) theory.
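As a rough illustration of what an NTK analysis measures, the sketch below computes the empirical NTK of a tiny one-hidden-layer network f(x) = w2 · tanh(w1·x + b1) on a handful of collocation points, with parameter Jacobians estimated by central finite differences. The architecture, sizes, and points are all illustrative assumptions, not the setup of the cited paper.

```python
import numpy as np

hidden = 16
rng = np.random.default_rng(0)
theta = rng.standard_normal(3 * hidden) / np.sqrt(hidden)

def f(theta, x):
    w1 = theta[:hidden]              # input weights (scalar input x)
    b1 = theta[hidden:2 * hidden]    # hidden biases
    w2 = theta[2 * hidden:]          # output weights
    return w2 @ np.tanh(w1 * x + b1)

xs = np.linspace(-1.0, 1.0, 10)      # collocation points

# Jacobian J[i, j] = df(x_i)/dtheta_j via central differences.
eps = 1e-5
J = np.zeros((len(xs), theta.size))
for j in range(theta.size):
    e = np.zeros_like(theta)
    e[j] = eps
    J[:, j] = [(f(theta + e, x) - f(theta - e, x)) / (2 * eps) for x in xs]

# Empirical NTK: K[i, j] = <df(x_i)/dtheta, df(x_j)/dtheta>.
K = J @ J.T
eigs = np.linalg.eigvalsh(K)
print("smallest/largest NTK eigenvalues:", eigs[0], eigs[-1])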
arXiv Detail & Related papers (2022-11-21T18:35:14Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Physical Activation Functions (PAFs): An Approach for More Efficient Induction of Physics into Physics-Informed Neural Networks (PINNs)
Physical Activation Functions (PAFs) help to generate Physics-Informed Neural Networks (PINNs) with lower complexity and greater validity over longer prediction ranges.
PAFs can be inspired by any mathematical formula related to the phenomenon under investigation, such as the initial or boundary conditions of the PDE system.
arXiv Detail & Related papers (2022-05-29T11:26:46Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Robust Learning of Physics Informed Neural Networks
Physics-informed Neural Networks (PINNs) have been shown to be effective in solving partial differential equations.
This paper shows that a PINN can be sensitive to errors in the training data and can overfit by dynamically propagating these errors over the solution domain of the PDE.
arXiv Detail & Related papers (2021-10-26T00:10:57Z)
- Characterizing possible failure modes in physics-informed neural networks
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these failure modes are not due to a lack of expressivity in the NN architecture, but to the PINN setup, which makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Conditional physics informed neural networks
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.