Failure-informed adaptive sampling for PINNs
- URL: http://arxiv.org/abs/2210.00279v1
- Date: Sat, 1 Oct 2022 13:34:41 GMT
- Title: Failure-informed adaptive sampling for PINNs
- Authors: Zhiwei Gao, Liang Yan, Tao Zhou
- Abstract summary: Physics-informed neural networks (PINNs) have emerged as an effective technique for solving PDEs in a wide range of domains.
Recent research has demonstrated, however, that the performance of PINNs can vary dramatically with different sampling procedures.
We present an adaptive approach termed failure-informed PINNs, which is inspired by the viewpoint of reliability analysis.
- Score: 5.723850818203907
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Physics-informed neural networks (PINNs) have emerged as an effective
technique for solving PDEs in a wide range of domains. Recent research has
demonstrated, however, that the performance of PINNs can vary dramatically with
different sampling procedures, and that using a fixed set of training points
can be detrimental to the convergence of PINNs to the correct solution. In this
paper, we present an adaptive approach termed failure-informed PINNs (FI-PINNs),
which is inspired by the viewpoint of reliability analysis. The basic idea is
to define a failure probability by using the residual, which represents the
reliability of the PINNs. With the aim of placing more samples in the failure
region and fewer samples in the safe region, FI-PINNs employs a
failure-informed enrichment technique to incrementally add new collocation
points to the training set adaptively. Using the new collocation points, the
accuracy of the PINNs model is then improved. The failure probability, similar
to classical adaptive finite element methods, acts as an error indicator that
guides the refinement of the training set. When compared to the conventional
PINNs method and the residual-based adaptive refinement method, the developed
algorithm can significantly improve accuracy, especially for low regularity and
high-dimensional problems. We prove rigorous bounds on the error incurred by
the proposed FI-PINNs and illustrate its performance through several problems.
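The enrichment loop described in the abstract can be sketched as follows. This is a minimal illustration under assumed details, not the authors' implementation: the 1D domain, the toy residual function, and the threshold are hypothetical stand-ins, and a plain Monte Carlo estimate stands in for the paper's failure-probability estimator.

```python
import numpy as np

# Toy stand-in for a PINN's PDE residual: pretend the network under-fits
# near x = 0.5, so the residual magnitude is large there.
def residual(x):
    return np.exp(-200.0 * (x - 0.5) ** 2)

def failure_probability(residual_fn, threshold, n_mc=10_000, rng=None):
    """Monte Carlo estimate of P(|r(x)| > threshold) over the domain [0, 1]."""
    rng = np.random.default_rng(rng)
    x = rng.uniform(0.0, 1.0, size=n_mc)
    return np.mean(np.abs(residual_fn(x)) > threshold)

def failure_informed_enrichment(residual_fn, train_x, threshold,
                                n_candidates=5_000, n_new=50, rng=None):
    """Incrementally add collocation points drawn from the failure region,
    i.e. where the residual exceeds the threshold."""
    rng = np.random.default_rng(rng)
    cand = rng.uniform(0.0, 1.0, size=n_candidates)
    fail = cand[np.abs(residual_fn(cand)) > threshold]
    if fail.size == 0:   # failure probability below tolerance: stop refining
        return train_x
    new = rng.choice(fail, size=min(n_new, fail.size), replace=False)
    return np.concatenate([train_x, new])

rng = np.random.default_rng(0)
train_x = rng.uniform(0.0, 1.0, size=100)
p_fail = failure_probability(residual, threshold=0.5, rng=1)
enriched = failure_informed_enrichment(residual, train_x, threshold=0.5, rng=2)
```

In the full method, the failure probability acts as the stopping and refinement indicator: while it stays above a tolerance, the network is retrained on the enriched set and the loop repeats.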
Related papers
- Physics-Informed Neural Networks with Trust-Region Sequential Quadratic Programming [4.557963624437784]
Recent research has noted that Physics-Informed Neural Networks (PINNs) may fail to learn relatively complex Partial Differential Equations (PDEs)
This paper addresses the failure modes of PINNs by introducing a novel, hard-constrained deep learning method -- trust-region Sequential Quadratic Programming (trSQP-PINN)
In contrast to directly training the penalized soft-constrained loss as in PINNs, our method performs a linear-quadratic approximation of the hard-constrained loss, while leveraging the soft-constrained loss to adaptively adjust the trust-region radius.
arXiv Detail & Related papers (2024-09-16T23:22:12Z)
- RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs)
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z)
- PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks [22.39904196850583]
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs)
We introduce a novel Transformer-based framework, termed PINNsFormer, designed to address this limitation.
PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-21T18:06:27Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
However, PINNs become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- A Novel Adaptive Causal Sampling Method for Physics-Informed Neural Networks [35.25394937917774]
Physics-Informed Neural Networks (PINNs) have become an attractive machine learning method for obtaining solutions of partial differential equations (PDEs)
We introduce temporal causality into adaptive sampling and propose a novel adaptive causal sampling method to improve the performance and efficiency of PINNs.
We demonstrate that by utilizing such a relatively simple sampling method, prediction performance can be improved up to two orders of magnitude compared with state-of-the-art results.
arXiv Detail & Related papers (2022-10-24T01:51:08Z)
- Adaptive Self-supervision Algorithms for Physics-informed Neural Networks [59.822151945132525]
Physics-informed neural networks (PINNs) incorporate physical knowledge from the problem domain as a soft constraint on the loss function.
We study the impact of the location of the collocation points on the trainability of these models.
We propose a novel adaptive collocation scheme which progressively allocates more collocation points to areas where the model is making higher errors.
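A progressive error-driven allocation of this kind can be sketched as below. The point-wise error function and the domain are hypothetical stand-ins for illustration; the idea is simply to score a candidate pool and keep the highest-error points each round.

```python
import numpy as np

# Hypothetical point-wise error of a trained model on a 1D domain:
# here the error is largest near the right boundary.
def pointwise_error(x):
    return x ** 4

def allocate_collocation(error_fn, current_pts, n_add=20,
                         n_candidates=1_000, rng=None):
    """One round of a greedy adaptive scheme: score a random candidate pool
    and append the n_add candidates with the highest error."""
    rng = np.random.default_rng(rng)
    cand = rng.uniform(0.0, 1.0, size=n_candidates)
    worst = cand[np.argsort(error_fn(cand))[-n_add:]]
    return np.concatenate([current_pts, worst])

pts = np.linspace(0.0, 1.0, 50)
pts = allocate_collocation(pointwise_error, pts, n_add=20, rng=0)
```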
arXiv Detail & Related papers (2022-07-08T18:17:06Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs)
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired from the weighting strategy of the Adaboost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- CAN-PINN: A Fast Physics-Informed Neural Network Based on Coupled-Automatic-Numerical Differentiation Method [17.04611875126544]
Novel physics-informed neural network (PINN) methods for coupling neighboring support points and automatic differentiation (AD) through Taylor series expansion are proposed.
The proposed coupled-automatic-numerical differentiation framework, labeled as can-PINN, unifies the advantages of AD and ND, providing more robust and efficient training than AD-based PINNs.
arXiv Detail & Related papers (2021-10-29T14:52:46Z)
- Efficient training of physics-informed neural networks via importance sampling [2.9005223064604078]
Physics-Informed Neural Networks (PINNs) are a class of deep neural networks that are trained to solve systems governed by partial differential equations (PDEs)
We show that an importance sampling approach will improve the convergence behavior of PINNs training.
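The core idea, residual-proportional sampling of collocation points, can be sketched as follows. The residual function and domain grid are hypothetical stand-ins; the sketch only illustrates drawing points with probability proportional to the residual so that high-residual regions are visited more often per training step.

```python
import numpy as np

# Hypothetical residual magnitude on a 1D domain: 10x larger on (0.8, 1].
def residual_magnitude(x):
    return 1.0 + 9.0 * (x > 0.8)

def importance_sample(candidates, residual_fn, n_draw, rng=None):
    """Draw a collocation batch with probability proportional to the
    residual magnitude at each candidate point."""
    rng = np.random.default_rng(rng)
    w = residual_fn(candidates)
    p = w / w.sum()
    return rng.choice(candidates, size=n_draw, p=p, replace=True)

grid = np.linspace(0.0, 1.0, 1001)
batch = importance_sample(grid, residual_magnitude, n_draw=2_000, rng=0)
frac_high = np.mean(batch > 0.8)  # most of the batch lands in the high-residual region
```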
arXiv Detail & Related papers (2021-04-26T02:45:10Z)
- A Biased Graph Neural Network Sampler with Near-Optimal Regret [57.70126763759996]
Graph neural networks (GNN) have emerged as a vehicle for applying deep network architectures to graph and relational data.
In this paper, we build upon existing work and treat GNN neighbor sampling as a multi-armed bandit problem.
We introduce a newly-designed reward function that introduces some degree of bias designed to reduce variance and avoid unstable, possibly-unbounded payouts.
arXiv Detail & Related papers (2021-03-01T15:55:58Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences arising from its use.