ST-PINN: A Self-Training Physics-Informed Neural Network for Partial
Differential Equations
- URL: http://arxiv.org/abs/2306.09389v1
- Date: Thu, 15 Jun 2023 15:49:13 GMT
- Title: ST-PINN: A Self-Training Physics-Informed Neural Network for Partial
Differential Equations
- Authors: Junjun Yan, Xinhai Chen, Zhichao Wang, Enqiang Zhou and Jie Liu
- Abstract summary: Partial differential equations (PDEs) are an essential computational kernel in physics and engineering.
With the advance of deep learning, physics-informed neural networks (PINNs) have shown great potential for fast PDE solving in various applications.
We propose a self-training physics-informed neural network, ST-PINN, to address the low accuracy and convergence problems of existing PINNs.
- Score: 13.196871939441273
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Partial differential equations (PDEs) are an essential computational kernel
in physics and engineering. With the advance of deep learning, physics-informed
neural networks (PINNs), as a mesh-free method, have shown great potential for
fast PDE solving in various applications. To address the low accuracy and
convergence problems of existing PINNs, we propose a self-training
physics-informed neural network, ST-PINN. Specifically, ST-PINN introduces a
pseudo-label-based self-learning algorithm during training. It employs the
governing-equation residual as the pseudo-label evaluation index and selects the
highest-confidence examples from the sample points to attach pseudo labels.
To the best of our knowledge, we are the first to incorporate a self-training
mechanism into physics-informed learning. We conduct experiments on five PDE
problems in different fields and scenarios. The results demonstrate that the
proposed method allows the network to learn more physical information and
benefit convergence. The ST-PINN outperforms existing physics-informed neural
network methods and improves the accuracy by a factor of 1.33x-2.54x. The code
of ST-PINN is available at GitHub: https://github.com/junjun-yan/ST-PINN.
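The selection mechanism described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the toy 1-D Laplace equation, the finite-difference residual, and the 20% selection fraction are all assumptions made for the example.

```python
import numpy as np

def pde_residual(u, dx):
    """Residual of the toy PDE u_xx = 0 via central differences (interior points)."""
    return (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2

def select_pseudo_labels(u_pred, dx, top_frac=0.2):
    """Use the governing-equation residual as the confidence score and
    attach the current predictions at the lowest-residual points as pseudo labels."""
    r = np.abs(pde_residual(u_pred, dx))
    interior = np.arange(1, len(u_pred) - 1)   # residual is defined on interior points
    k = max(1, int(top_frac * len(interior)))
    best = interior[np.argsort(r)[:k]]         # highest-confidence sample points
    return best, u_pred[best]                  # indices and their pseudo labels

# Usage: a mostly-linear prediction (near-exact for u_xx = 0) plus a small wiggle.
x = np.linspace(0.0, 1.0, 101)
u_pred = x + 0.01 * np.sin(20.0 * np.pi * x)
idx, labels = select_pseudo_labels(u_pred, x[1] - x[0])
```

In a full training loop the selected (point, label) pairs would be added to the supervised part of the PINN loss, so the labeled set grows as the network's residuals shrink.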
Related papers
- Improving PINNs By Algebraic Inclusion of Boundary and Initial Conditions [0.1874930567916036]
"AI for Science" aims to solve fundamental scientific problems using AI techniques.
In this work, we explore changing the model being trained from a plain neural network to a non-linear transformation of one.
This reduces the number of terms in the loss function compared to the standard PINN losses.
arXiv Detail & Related papers (2024-07-30T11:19:48Z)
- Improved physics-informed neural network in mitigating gradient related failures [11.356695216531328]
Physics-informed neural networks (PINNs) integrate fundamental physical principles with advanced data-driven techniques.
PINNs face persistent challenges with stiffness in gradient flow, which limits their predictive capabilities.
This paper presents an improved PINN to mitigate gradient-related failures.
arXiv Detail & Related papers (2024-07-28T07:58:10Z)
- iPINNs: Incremental learning for Physics-informed neural networks [66.4795381419701]
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs).
We propose incremental PINNs that can learn multiple tasks sequentially without additional parameters for new tasks and improve performance for every equation in the sequence.
Our approach learns multiple PDEs starting from the simplest one, creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks.
arXiv Detail & Related papers (2023-04-10T20:19:20Z)
- Ensemble learning for Physics Informed Neural Networks: a Gradient Boosting approach [10.250994619846416]
We present a new training paradigm referred to as "gradient boosting" (GB).
Instead of learning the solution of a given PDE using a single neural network directly, our algorithm employs a sequence of neural networks to achieve a superior outcome.
This work also unlocks the door to employing ensemble learning techniques in PINNs.
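The stage-wise idea in this entry can be illustrated with simple weak learners standing in for the sequence of neural networks. This is a hedged sketch: the polynomial least-squares fits of increasing degree and the sine target are assumptions for the example, not the paper's actual networks or PDEs.

```python
import numpy as np

def boost(x, target, degrees=(1, 3, 5, 7)):
    """Stage-wise additive fitting: each learner fits the residual left by the
    sum of the previous ones, mirroring the gradient-boosting idea above.
    Polynomial least-squares fits stand in for the sequence of networks."""
    stages, pred = [], np.zeros_like(target)
    for d in degrees:
        stage = np.poly1d(np.polyfit(x, target - pred, d))  # fit current residual
        stages.append(stage)
        pred = pred + stage(x)
    return stages, pred

x = np.linspace(0.0, 1.0, 200)
target = np.sin(2.0 * np.pi * x)
stages, pred = boost(x, target)
err = np.max(np.abs(target - pred))   # far smaller than any single early stage
```

The design point is that later stages only need to model what earlier stages missed, which is the same reason a sequence of networks can outperform one network trained directly.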
arXiv Detail & Related papers (2023-02-25T19:11:44Z)
- SVD-PINNs: Transfer Learning of Physics-Informed Neural Networks via Singular Value Decomposition [24.422082821785487]
One neural network corresponds to one partial differential equation.
In practice, we usually need to solve a class of PDEs, not just one.
We propose a transfer learning method of PINNs via keeping singular vectors and optimizing singular values.
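The transfer scheme in this entry (frozen singular vectors, trainable singular values) can be sketched in a few lines of NumPy. The layer size is hypothetical, and this is only an illustration of the parameter-count reduction, not the paper's training code.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))   # hypothetical pretrained layer weight

# Factorize once; U and Vt stay frozen when transferring to new PDEs.
U, s, Vt = np.linalg.svd(W, full_matrices=False)

def rebuild(singular_values):
    """Layer weight for a new PDE: same singular vectors, new singular values."""
    return (U * singular_values) @ Vt

# Only len(s) = 32 values are optimized per layer instead of 64 * 32 = 2048 weights.
W_new = rebuild(1.1 * s)            # e.g. rescaled singular values for a new task
```

Keeping the singular vectors pins down the subspaces the layer operates in, while the singular values act as a small set of per-direction gains that can be retuned cheaply per PDE.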
arXiv Detail & Related papers (2022-11-16T08:46:10Z)
- FO-PINNs: A First-Order formulation for Physics Informed Neural Networks [1.8874301050354767]
Physics-Informed Neural Networks (PINNs) are a class of deep learning neural networks that learn the response of a physical system without any simulation data.
PINNs are successfully used for solving forward and inverse problems, but their accuracy decreases significantly for parameterized systems.
We present first-order physics-informed neural networks (FO-PINNs) that are trained using a first-order formulation of the PDE loss function.
arXiv Detail & Related papers (2022-10-25T20:25:33Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the Adaboost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
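The derivative identity this entry relies on can be checked numerically on a scalar toy function. The sketch below uses a quadratic in place of a neural network, and the smoothing width and sample count are assumptions; it only demonstrates the mathematical identity, not the paper's training scheme.

```python
import numpy as np

def smoothed_second_derivative(f, x, sigma=0.5, n_samples=400_000, seed=0):
    """Monte-Carlo estimate of d2/dx2 of the Gaussian-smoothed function
    f_sigma(x) = E[f(x + sigma*eps)], eps ~ N(0, 1), via Stein's identity:
        f_sigma''(x) = E[(eps**2 - 1) / sigma**2 * f(x + sigma*eps)].
    Only forward evaluations of f are needed -- no back-propagation."""
    eps = np.random.default_rng(seed).standard_normal(n_samples)
    return np.mean((eps**2 - 1.0) / sigma**2 * f(x + sigma * eps))

# Toy check: for f(t) = t**2, the smoothed function is x**2 + sigma**2,
# so its second derivative is exactly 2 everywhere.
est = smoothed_second_derivative(lambda t: t**2, x=1.0)
```

Trading exact second derivatives for a sampled estimator is what removes the nested ("stacked") back-propagation passes that make standard PINN training slow.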
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.