Understanding and Mitigating Extrapolation Failures in Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2306.09478v2
- Date: Sun, 26 Nov 2023 20:01:28 GMT
- Title: Understanding and Mitigating Extrapolation Failures in Physics-Informed Neural Networks
- Authors: Lukas Fesser, Luca D'Amico-Wong, Richard Qiu
- Abstract summary: We study the extrapolation behavior of PINNs on a representative set of PDEs of different types.
We find that failure to extrapolate is not caused by high frequencies in the solution function, but rather by shifts in the support of the Fourier spectrum over time.
- Score: 1.1510009152620668
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed Neural Networks (PINNs) have recently gained popularity due
to their effective approximation of partial differential equations (PDEs) using
deep neural networks (DNNs). However, their out-of-domain behavior is not well
understood, with previous work speculating that the presence of high frequency
components in the solution function might be to blame for poor extrapolation
performance. In this paper, we study the extrapolation behavior of PINNs on a
representative set of PDEs of different types, including high-dimensional PDEs.
We find that failure to extrapolate is not caused by high frequencies in the
solution function, but rather by shifts in the support of the Fourier spectrum
over time. We term these shifts spectral shifts and quantify them by introducing a
Weighted Wasserstein-Fourier distance (WWF). We show that the WWF can be used
to predict PINN extrapolation performance, and that in the absence of
significant spectral shifts, PINN predictions stay close to the true solution
even in extrapolation. Finally, we propose a transfer learning-based strategy
to mitigate the effects of larger spectral shifts, which decreases
extrapolation errors by up to 82%.
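The WWF itself is defined in the paper; as a rough illustration of the underlying idea, the sketch below measures the spectral shift between two time slices as a 1-Wasserstein distance between their normalized Fourier power spectra (the paper's frequency weighting is omitted, an assumption):

```python
import numpy as np
from scipy.stats import wasserstein_distance

def spectrum(u):
    """Normalized Fourier power spectrum of a 1D solution slice u(., t)."""
    p = np.abs(np.fft.rfft(u)) ** 2
    return p / p.sum()

def spectral_shift(u_t0, u_t1):
    """1-Wasserstein distance between the spectra of two time slices.

    A stand-in for the paper's Weighted Wasserstein-Fourier distance (WWF);
    the paper's frequency weighting is not reproduced here (an assumption).
    """
    p, q = spectrum(u_t0), spectrum(u_t1)
    freqs = np.arange(len(p))
    return wasserstein_distance(freqs, freqs, u_weights=p, v_weights=q)

# A solution whose spectral support is static shows no shift; one whose
# support migrates to higher frequencies over time shows a large shift.
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
print(spectral_shift(np.sin(4 * x), np.sin(4 * x)))   # ~0.0: no shift
print(spectral_shift(np.sin(4 * x), np.sin(24 * x)))  # large: support moved
```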
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252] (2024-10-08)
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve performance gains of up to 48% on PDE datasets.
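The summary above does not specify ProdLayer's exact form. One plausible sketch, assuming the layer augments a linear map with an element-wise product of learned feature projections (products being the natural operation under dimensional analysis); names and sizes are illustrative:

```python
import torch
import torch.nn as nn

class ProdLayer(nn.Module):
    """Hypothetical sketch of a product layer in the spirit of DimOL:
    alongside a standard linear map, add an element-wise product of two
    learned feature projections, since physical quantities often combine
    multiplicatively. The exact DimOL formulation may differ.
    """
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.proj_a = nn.Linear(in_dim, out_dim)
        self.proj_b = nn.Linear(in_dim, out_dim)

    def forward(self, x):
        # Linear term plus a product term of two projections of x.
        return self.linear(x) + self.proj_a(x) * self.proj_b(x)

# Drop-in usage inside an FNO- or Transformer-style block:
y = ProdLayer(64, 64)(torch.randn(8, 64))
```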
- Physics-embedded Fourier Neural Network for Partial Differential Equations [35.41134465442465] (2024-07-15)
We introduce Physics-embedded Fourier Neural Networks (PeFNN) with flexible and explainable error.
PeFNN is designed to enforce momentum conservation and yields interpretable nonlinear expressions.
We demonstrate its outstanding performance for challenging real-world applications such as large-scale flood simulations.
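PeFNN's conservation mechanism is architectural and not detailed in the summary above. As a generic illustration of what enforcing momentum conservation can mean in code, the sketch below computes a finite-difference residual of a Burgers-type momentum equation that a learned solver could be penalized on (grid spacings, viscosity, and the soft-penalty framing are assumptions, not PeFNN's mechanism):

```python
import torch

def momentum_residual_1d(u, dx, dt, nu=1e-3):
    """Finite-difference residual of u_t + u*u_x - nu*u_xx on a predicted
    space-time field u of shape (nt, nx). Penalizing residual.pow(2).mean()
    is one generic way to encourage momentum conservation.
    """
    u_t = (u[1:, 1:-1] - u[:-1, 1:-1]) / dt
    u_x = (u[:-1, 2:] - u[:-1, :-2]) / (2 * dx)
    u_xx = (u[:-1, 2:] - 2 * u[:-1, 1:-1] + u[:-1, :-2]) / dx ** 2
    return u_t + u[:-1, 1:-1] * u_x - nu * u_xx
```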
- RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039] (2024-05-23)
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs).
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
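A minimal sketch of the region-optimization idea, assuming a hypothetical `pde_residual` callable and illustrative radius and sample counts (not necessarily RoPINN's exact algorithm):

```python
import torch

def region_pde_loss(model, pde_residual, points, radius=0.01, n_samples=4):
    """Monte Carlo estimate of the PDE residual averaged over a small
    region around each collocation point, instead of at the point alone.
    `pde_residual(model, x)` is a hypothetical callable returning the
    pointwise residual; radius and n_samples are illustrative.
    """
    loss = 0.0
    for _ in range(n_samples):
        offset = radius * (2 * torch.rand_like(points) - 1)
        loss = loss + pde_residual(model, points + offset).pow(2).mean()
    return loss / n_samples
```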
- Toward a Better Understanding of Fourier Neural Operators from a Spectral Perspective [4.315136713224842] (2024-04-10)
This paper offers empirical insights into FNO's difficulty with large kernels through spectral analysis.
SpecB-FNO achieves better prediction accuracy on diverse PDE applications, with an average improvement of 50%.
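The kind of spectral analysis referred to above can be approximated with a simple diagnostic: split the prediction error's energy by frequency band and see where it concentrates. The band count below is an arbitrary assumption:

```python
import numpy as np

def error_energy_by_band(pred, true, n_bands=4):
    """Fraction of squared-error energy in each frequency band (1D)."""
    e = np.abs(np.fft.rfft(pred - true)) ** 2
    bands = np.array_split(e, n_bands)
    return [float(b.sum() / e.sum()) for b in bands]
```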
- Parametric Encoding with Attention and Convolution Mitigate Spectral Bias of Neural Partial Differential Equation Solvers [0.0] (2024-03-22)
Parametric Grid Convolutional Attention Networks (PGCANs) are used to solve partial differential equations (PDEs).
PGCANs parameterize the input space with a grid-based encoder whose parameters are connected to the output via a DNN decoder.
Our encoder provides localized learning ability and uses convolution layers to avoid overfitting and to improve the propagation of information from the boundaries to the interior of the domain.
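A minimal sketch of this encoder-decoder structure, with the attention component omitted and all sizes assumed:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GridEncoderDecoder(nn.Module):
    """Sketch of a parametric-grid encoder with a convolution layer and an
    MLP decoder, in the spirit of PGCAN (attention omitted for brevity).
    """
    def __init__(self, grid_size=32, channels=16, hidden=64):
        super().__init__()
        # Learnable feature grid over the (normalized) 2D domain.
        self.grid = nn.Parameter(torch.randn(1, channels, grid_size, grid_size))
        # Convolution helps propagate boundary information inward.
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.decoder = nn.Sequential(
            nn.Linear(channels, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, xy):  # xy in [-1, 1]^2, shape (N, 2)
        feat_grid = self.conv(self.grid)
        # Bilinearly interpolate grid features at the query coordinates.
        samples = F.grid_sample(
            feat_grid, xy.view(1, -1, 1, 2), align_corners=True
        )  # (1, C, N, 1)
        feats = samples.squeeze(0).squeeze(-1).t()  # (N, C)
        return self.decoder(feats)

model = GridEncoderDecoder()
u = model(torch.rand(100, 2) * 2 - 1)  # predicted solution at 100 points
```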
- Deep Neural Networks Tend To Extrapolate Predictably [51.303814412294514] (2023-10-02)
Conventional wisdom suggests that neural network predictions tend to be unpredictable and overconfident when faced with out-of-distribution (OOD) inputs.
We observe that neural network predictions often tend towards a constant value as input data becomes increasingly OOD.
We show how one can leverage our insights in practice to enable risk-sensitive decision-making in the presence of OOD inputs.
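This observation can be probed with a few lines: evaluate a network on inputs drawn increasingly far from the training range and track the spread of its predictions. The architecture and scales below are assumptions; the paper's experiments are far more thorough:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))

with torch.no_grad():
    for scale in (1, 10, 100, 1000):
        x = scale * torch.randn(4096, 2)  # increasingly OOD inputs
        y = net(x)
        # The paper reports predictions tending toward a constant value
        # OOD; the spread of y is one simple quantity to watch for that.
        print(f"scale={scale:5d}  pred std={y.std().item():.4f}")
```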
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946] (2023-03-03)
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions exhibit high-frequency or multi-scale features.
This paper proposes employing the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
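Implicit SGD evaluates the gradient at the new iterate, theta+ = theta - lr * grad L(theta+), which is equivalent to a proximal step. The sketch below approximates that proximal problem with a few inner gradient steps; the inner loop and all hyperparameters are assumptions, not the paper's algorithm:

```python
import torch

def implicit_sgd_step(params, loss_fn, lr=1e-2, inner_steps=5, inner_lr=1e-3):
    """Approximate the proximal (implicit) update
        theta+ = argmin_theta  L(theta) + ||theta - theta_k||^2 / (2 * lr)
    with a few explicit inner gradient steps (an illustrative assumption).
    `params` is a list of leaf tensors with requires_grad=True, e.g.
    list(model.parameters()); `loss_fn` re-evaluates the training loss.
    """
    anchor = [p.detach().clone() for p in params]
    for _ in range(inner_steps):
        prox = sum(((p - a) ** 2).sum() for p, a in zip(params, anchor))
        loss = loss_fn() + prox / (2 * lr)
        grads = torch.autograd.grad(loss, params)
        with torch.no_grad():
            for p, g in zip(params, grads):
                p -= inner_lr * g
```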
- Incremental Spatial and Spectral Learning of Neural Operators for Solving Large-Scale PDEs [86.35471039808023] (2022-11-28)
We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Compared to the existing Fourier Neural Operator, iFNO achieves a 10% lower testing error using 20% fewer frequency modes, while also training 30% faster.
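A sketch of the incremental-modes idea: a 1D Fourier layer whose set of active frequency modes starts small and grows on a schedule (the schedule and sizes are assumptions):

```python
import torch
import torch.nn as nn

class IncrementalSpectralConv1d(nn.Module):
    """Fourier layer that starts with a few active frequency modes and
    unlocks more as training progresses (an illustrative schedule; requires
    max_modes <= n // 2 + 1 for inputs of length n).
    """
    def __init__(self, channels, max_modes):
        super().__init__()
        self.max_modes = max_modes
        self.active_modes = 4  # start small
        scale = 1 / channels
        self.weight = nn.Parameter(
            scale * torch.randn(channels, channels, max_modes, dtype=torch.cfloat)
        )

    def grow(self, step=4):
        # Usage: call on a schedule, e.g. every few epochs.
        self.active_modes = min(self.max_modes, self.active_modes + step)

    def forward(self, x):  # x: (batch, channels, n)
        x_ft = torch.fft.rfft(x)
        out_ft = torch.zeros_like(x_ft)
        m = self.active_modes
        # Mix channels only on the currently active low-frequency modes.
        out_ft[:, :, :m] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :m], self.weight[:, :, :m]
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))
```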
- Momentum Diminishes the Effect of Spectral Bias in Physics-Informed Neural Networks [72.09574528342732] (2022-06-29)
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
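The empirical NTK underlying this kind of analysis is K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>; a naive sketch for a small scalar-output network follows (illustration only; it ignores PINN-specific residual terms and scales poorly):

```python
import torch
import torch.nn as nn

def empirical_ntk(model, xs):
    """K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)> for scalar outputs."""
    def grad_vec(x):
        out = model(x.unsqueeze(0)).squeeze()
        grads = torch.autograd.grad(out, list(model.parameters()))
        return torch.cat([g.reshape(-1) for g in grads])

    g = torch.stack([grad_vec(x) for x in xs])
    return g @ g.t()

model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
K = empirical_ntk(model, torch.linspace(-1, 1, 8).unsqueeze(-1))
```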
- Physical Activation Functions (PAFs): An Approach for More Efficient Induction of Physics into Physics-Informed Neural Networks (PINNs) [0.0] (2022-05-29)
Physical Activation Functions (PAFs) help to generate Physics-Informed Neural Networks (PINNs) with lower complexity and greater validity over longer prediction ranges.
PAFs can be inspired by any mathematical formula related to the phenomenon under investigation, such as the initial or boundary conditions of the PDE system.
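For example, a PAF for a problem with wave-like solutions or periodic boundary conditions might be a sinusoid with a learnable frequency; the choice of sin and the learnable omega below are illustrative assumptions, not the paper's prescription:

```python
import torch
import torch.nn as nn

class PhysicalActivation(nn.Module):
    """Sketch of a physical activation function (PAF): an activation
    inspired by a formula from the problem, here sin with a learnable
    frequency (an illustrative choice).
    """
    def __init__(self):
        super().__init__()
        self.omega = nn.Parameter(torch.tensor(1.0))  # learnable frequency

    def forward(self, x):
        return torch.sin(self.omega * x)

# Usage: swap tanh for the PAF in a PINN's hidden layers.
pinn = nn.Sequential(
    nn.Linear(2, 64), PhysicalActivation(),
    nn.Linear(64, 64), PhysicalActivation(),
    nn.Linear(64, 1),
)
```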
This list is automatically generated from the titles and abstracts of the papers on this site.