Investigating and Mitigating Failure Modes in Physics-informed Neural
Networks (PINNs)
- URL: http://arxiv.org/abs/2209.09988v3
- Date: Mon, 8 May 2023 00:04:06 GMT
- Title: Investigating and Mitigating Failure Modes in Physics-informed Neural
Networks (PINNs)
- Authors: Shamsulhaq Basir
- Abstract summary: This paper explores the difficulties in solving partial differential equations (PDEs) using physics-informed neural networks (PINNs).
PINNs use physics as a regularization term in the objective function. However, this approach requires manual hyperparameter tuning, making it impractical in the absence of validation data or prior knowledge of the solution.
Our findings demonstrate that high-order PDEs contaminate backpropagated gradients and hinder convergence.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper explores the difficulties in solving partial differential
equations (PDEs) using physics-informed neural networks (PINNs). PINNs use
physics as a regularization term in the objective function. However, a drawback
of this approach is the requirement for manual hyperparameter tuning, making it
impractical in the absence of validation data or prior knowledge of the
solution. Our investigations of the loss landscapes and backpropagated
gradients in the presence of physics reveal that existing methods produce
non-convex loss landscapes that are hard to navigate. Our findings demonstrate
that high-order PDEs contaminate backpropagated gradients and hinder
convergence. To address these challenges, we introduce a novel method that
bypasses the calculation of high-order derivative operators and mitigates the
contamination of backpropagated gradients. Consequently, we reduce the
dimension of the search space and make learning PDEs with non-smooth solutions
feasible. Our method also provides a mechanism to focus on complex regions of
the domain. In addition, we present a dual unconstrained formulation based on
the Lagrange multiplier method to enforce equality constraints on the model's
prediction, with adaptive and independent learning rates inspired by adaptive
subgradient methods. We apply our approach to solve various linear and
non-linear PDEs.
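A rough PyTorch sketch contrasting the two formulations in the abstract: the standard soft-penalty PINN objective whose weights must be hand-tuned, and a dual unconstrained formulation in which each equality constraint carries its own Lagrange multiplier, updated with an Adagrad-style adaptive step. The 1D Poisson setup, network size, and update rule are illustrative assumptions, not the authors' implementation.

```python
import torch

def pde_residual(model, x):
    """Residual of u''(x) = f(x) for a toy 1D Poisson problem."""
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return d2u - torch.sin(x)                       # example source term f = sin(x)

model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
x_int = torch.linspace(0.0, 1.0, 64).unsqueeze(1)   # interior collocation points
x_bc = torch.tensor([[0.0], [1.0]])                 # boundary points
u_bc = torch.zeros(2, 1)                            # Dirichlet data u = 0

# (a) Soft-penalty objective: lam_pde and lam_bc must be tuned by hand.
lam_pde, lam_bc = 1.0, 10.0
loss_soft = (lam_pde * pde_residual(model, x_int).pow(2).mean()
             + lam_bc * (model(x_bc) - u_bc).pow(2).mean())

# (b) Dual formulation: one multiplier per boundary constraint, ascended
#     with an adaptive, independent (Adagrad-style) rate.
mu = torch.zeros(2, 1)                              # Lagrange multipliers
g2 = torch.zeros(2, 1)                              # accumulated squared violations
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1000):
    opt.zero_grad()
    c = model(x_bc) - u_bc                          # equality-constraint violations
    loss = (pde_residual(model, x_int).pow(2).mean()
            + (mu * c).sum() + c.pow(2).sum())      # augmented Lagrangian
    loss.backward()
    opt.step()
    with torch.no_grad():                           # dual ascent on the multipliers
        c = model(x_bc) - u_bc
        g2 += c.pow(2)
        mu += c / (g2.sqrt() + 1e-8)                # adaptive per-constraint step
```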
Related papers
- Beyond Derivative Pathology of PINNs: Variable Splitting Strategy with Convergence Analysis [6.468495781611434]
Physics-informed neural networks (PINNs) have emerged as effective methods for solving partial differential equations (PDEs) in various problems.
In this study, we prove that PINNs encounter a fundamental issue: their underlying premise is invalid.
We propose a variable splitting strategy that addresses this issue by parameterizing the gradient of the solution as an auxiliary variable.
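A minimal sketch of the variable-splitting idea on a 1D Poisson problem: the second-order PDE u''(x) = f(x) becomes the first-order system u' = p, p' = f, with the gradient p carried by its own network so that only first-order autograd calls are needed. Network sizes, source term, and equal loss weighting are assumptions for illustration.

```python
import torch

u_net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
p_net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

def split_loss(x):
    x = x.requires_grad_(True)
    u, p = u_net(x), p_net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]  # first-order only
    dp = torch.autograd.grad(p.sum(), x, create_graph=True)[0]
    f = torch.sin(x)                    # example source term
    return ((du - p).pow(2).mean()      # coupling: the auxiliary p must equal u'
            + (dp - f).pow(2).mean())   # PDE residual written in terms of p

x = torch.rand(128, 1)
loss = split_loss(x)  # add boundary terms and optimize u_net, p_net jointly
```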
arXiv Detail & Related papers (2024-09-30T15:20:10Z) - Constrained or Unconstrained? Neural-Network-Based Equation Discovery from Data [0.0]
We represent the PDE as a neural network and use an intermediate state representation similar to a Physics-Informed Neural Network (PINN).
We present a penalty method and a widely used trust-region barrier method to solve this constrained optimization problem.
Our results on the Burgers' and the Korteweg-de Vries equations demonstrate that the constrained trust-region barrier method outperforms the penalty method.
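A toy-scale SciPy sketch of the two solvers compared here: a quadratic penalty method with an increasing penalty weight, versus handing the equality constraint to the trust-region barrier solver ("trust-constr"). The objective and constraint are generic stand-ins for the PDE-constrained discovery problem.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def data_misfit(w):      # stand-in for the data-fit term
    return np.sum((w - np.array([2.0, 1.0])) ** 2)

def residual(w):         # stand-in for the PDE-residual constraint r(w) = 0
    return w[0] ** 2 + w[1] ** 2 - 1.0

# (a) Penalty method: fold the constraint into the objective, grow rho.
w = np.zeros(2)
for rho in [1.0, 10.0, 100.0, 1000.0]:
    w = minimize(lambda v: data_misfit(v) + rho * residual(v) ** 2, w).x

# (b) Constrained method: give the constraint to the barrier solver directly.
con = NonlinearConstraint(residual, 0.0, 0.0)
w_con = minimize(data_misfit, np.zeros(2), method="trust-constr", constraints=[con]).x
```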
arXiv Detail & Related papers (2024-05-30T01:55:44Z) - Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
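A minimal stand-in for the deep-equilibrium construction: one weight-tied block f(z, x) is iterated to a fixed point z* = f(z*, x), which plays the role of the steady-state solution. A real FNO-DEQ would use a Fourier layer for f, an Anderson/Broyden root-finder, and implicit differentiation; the names and shapes below are hypothetical.

```python
import torch

class WeightTiedBlock(torch.nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.lin_z = torch.nn.Linear(dim, dim)
        self.lin_x = torch.nn.Linear(dim, dim)

    def forward(self, z, x):
        return torch.tanh(self.lin_z(z) + self.lin_x(x))

def solve_equilibrium(f, x, iters=50):
    z = torch.zeros_like(x)
    for _ in range(iters):      # naive fixed-point iteration; DEQs typically
        z = f(z, x)             # use Anderson acceleration or Broyden's method
    return z

f = WeightTiedBlock(dim=16)
x = torch.randn(8, 16)            # encoded PDE inputs (forcing, coefficients, ...)
z_star = solve_equilibrium(f, x)  # the equilibrium is the predicted steady state
```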
arXiv Detail & Related papers (2023-11-30T22:34:57Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been shown to be effective in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ an implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
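A hedged sketch of the implicit (proximal-point) step behind ISGD: each outer update approximately solves theta_{k+1} = argmin_theta L(theta) + ||theta - theta_k||^2 / (2*eta) with a few inner iterations, which damps the unstable updates plain SGD can take on stiff, multi-scale PINN losses. The toy regression loss and iteration counts are assumptions, not the paper's exact scheme.

```python
import torch

model = torch.nn.Linear(1, 1)
x, y = torch.randn(32, 1), torch.randn(32, 1)
loss_fn = lambda: (model(x) - y).pow(2).mean()  # stand-in for a PINN loss

eta = 0.1
for outer in range(100):
    anchor = [p.detach().clone() for p in model.parameters()]   # theta_k
    inner_opt = torch.optim.SGD(model.parameters(), lr=0.01)
    for inner in range(5):      # approximately solve the proximal subproblem
        inner_opt.zero_grad()
        prox = sum((p - a).pow(2).sum() for p, a in zip(model.parameters(), anchor))
        (loss_fn() + prox / (2 * eta)).backward()
        inner_opt.step()
```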
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
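For intuition, the simplest instance of a constraint baked into the architecture is the classic distance-function ansatz u(x) = g(x) + x(1 - x)N(x), which satisfies Dirichlet data at x = 0 and x = 1 exactly for every network N, so no boundary penalty is needed. The paper's differentiable PDE-constrained layer is more general; this sketch only illustrates the principle.

```python
import torch

net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
g = lambda x: torch.zeros_like(x)         # example boundary data: u = 0 at x = 0, 1

def u(x):
    return g(x) + x * (1.0 - x) * net(x)  # the factor x(1-x) zeroes N at the boundary

x = torch.tensor([[0.0], [0.5], [1.0]])
print(u(x))  # first and last entries are exactly 0, by construction
```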
arXiv Detail & Related papers (2022-07-18T15:11:43Z) - Mitigating Learning Complexity in Physics and Equality Constrained
Artificial Neural Networks [0.9137554315375919]
Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs).
In PINNs, the residual form of the PDE of interest and its boundary conditions are lumped into a composite objective function as soft penalties.
Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach when applied to different kinds of PDEs.
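A small diagnostic sketch of this pathology on an assumed 1D Poisson setup: comparing the backpropagated gradient norms of the residual term and the boundary term shows how one soft penalty can silently dominate the composite objective.

```python
import torch

model = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

def grad_norm(loss):
    grads = torch.autograd.grad(loss, model.parameters(), retain_graph=True)
    return torch.cat([g.flatten() for g in grads]).norm()

x = torch.rand(64, 1, requires_grad=True)
u = model(x)
du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
loss_pde = (d2u - torch.sin(x)).pow(2).mean()                 # residual penalty
loss_bc = model(torch.tensor([[0.0], [1.0]])).pow(2).mean()   # boundary penalty

print(grad_norm(loss_pde), grad_norm(loss_bc))  # often orders of magnitude apart
```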
arXiv Detail & Related papers (2022-06-19T04:12:01Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
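A 1D toy of the Stein's-identity estimator: for the Gaussian-smoothed model g(x) = E[f(x + sigma*eps)], integration by parts gives g''(x) = E[f(x + sigma*eps) * (eps^2 - 1)] / sigma^2, so second derivatives come from forward evaluations alone. Subtracting f(x) is a variance-reducing control variate (it leaves the expectation unchanged because E[eps^2 - 1] = 0); the constants are illustrative.

```python
import torch

f = lambda x: torch.sin(x)        # stand-in for the network; exact f''(x) = -sin(x)
x, sigma, n = torch.tensor(1.0), 0.2, 500_000

eps = torch.randn(n)
est = ((f(x + sigma * eps) - f(x)) * (eps ** 2 - 1.0)).mean() / sigma ** 2
print(est, -torch.sin(x))  # Monte Carlo estimate of g''; g'' -> f'' as sigma -> 0
```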
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
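A quick check of the containment claim: on a 1D chain graph, aggregating the hand-designed message (u_j - u_i)/h^2 from a node's two neighbors reproduces the classical 3-point finite-difference Laplacian; a neural solver replaces that fixed message with a learned MLP over (u_i, u_j, x_i - x_j).

```python
import torch

n, h = 8, 0.1
u = torch.randn(n)
src = torch.tensor([i for i in range(n - 1)] + [i + 1 for i in range(n - 1)])
dst = torch.tensor([i + 1 for i in range(n - 1)] + [i for i in range(n - 1)])

msg = (u[src] - u[dst]) / h**2                 # message sent along each edge
lap = torch.zeros(n).index_add_(0, dst, msg)   # aggregate incoming messages

i = torch.arange(1, n - 1)                     # interior nodes
print(torch.allclose(lap[i], (u[i - 1] - 2 * u[i] + u[i + 1]) / h**2))  # True
```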
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Physics-Informed Neural Operator for Learning Partial Differential
Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
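A schematic of the PINO-style hybrid objective with placeholder shapes and a dummy residual: an operator network G maps input functions to solutions, and training combines a data term on available input/solution pairs with a physics term imposed on extra inputs, possibly at a different resolution. Nothing here is the actual PINO architecture.

```python
import torch

G = torch.nn.Sequential(torch.nn.Linear(64, 128), torch.nn.GELU(), torch.nn.Linear(128, 64))

a_data = torch.randn(16, 64)   # input functions with paired solution data
u_data = torch.randn(16, 64)   # corresponding solutions on the same grid
a_phys = torch.randn(16, 64)   # extra inputs where only the PDE is imposed

def pde_residual(u, a):        # placeholder; a real PINO evaluates the PDE
    return u - a               # operator via spectral or autograd derivatives

loss = ((G(a_data) - u_data).pow(2).mean()                # data term
        + pde_residual(G(a_phys), a_phys).pow(2).mean())  # physics term
```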
arXiv Detail & Related papers (2021-11-06T03:41:34Z) - Physics and Equality Constrained Artificial Neural Networks: Application
to Partial Differential Equations [1.370633147306388]
Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs).
Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach.
We propose a versatile framework that can tackle both inverse and forward problems.
arXiv Detail & Related papers (2021-09-30T05:55:35Z) - Solving PDEs on Unknown Manifolds with Machine Learning [8.220217498103315]
This paper presents a mesh-free computational framework and machine learning theory for solving elliptic PDEs on unknown manifolds.
We show that the proposed NN solver can robustly generalize the PDE solution to new data points, with generalization errors nearly identical to those on the training data.
arXiv Detail & Related papers (2021-06-12T03:55:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.