Physics-Informed Neural Networks (PINNs) for Parameterized PDEs: A
Metalearning Approach
- URL: http://arxiv.org/abs/2110.13361v1
- Date: Tue, 26 Oct 2021 02:29:10 GMT
- Title: Physics-Informed Neural Networks (PINNs) for Parameterized PDEs: A
Metalearning Approach
- Authors: Michael Penwarden, Shandian Zhe, Akil Narayan, Robert M. Kirby
- Abstract summary: Physics-informed neural networks (PINNs) are a means of discretizing partial differential equations (PDEs).
We present a survey of model-agnostic metalearning, and then discuss our model-aware metalearning applied to PINNs.
- Score: 13.590496719224987
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) as a means of discretizing partial
differential equations (PDEs) are garnering much attention in the Computational
Science and Engineering (CS&E) world. At least two challenges exist for PINNs
at present: an understanding of accuracy and convergence characteristics with
respect to tunable parameters and identification of optimization strategies
that make PINNs as efficient as other computational science tools. The cost of
PINNs training remains a major challenge of Physics-informed Machine Learning
(PiML) -- and, in fact, machine learning (ML) in general. This paper is meant
to move towards addressing the latter through the study of PINNs for
parameterized PDEs. Following the ML world, we introduce metalearning of PINNs
for parameterized PDEs. By introducing metalearning and transfer learning
concepts, we can greatly accelerate the PINNs optimization process. We present
a survey of model-agnostic metalearning, and then discuss our model-aware
metalearning applied to PINNs. We provide theoretically motivated and
empirically backed assumptions that make our metalearning approach possible. We
then test our approach on various canonical forward parameterized PDEs that
have been presented in the emerging PINNs literature.
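As a rough illustration of the transfer-learning flavor of the approach described in the abstract, the sketch below trains a small PINN at one value of a PDE parameter and then warm-starts training at a nearby value from the converged weights. The Burgers-type residual, the network size, and the training schedule are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class PINN(nn.Module):
    """Small fully connected network u(x, t)."""
    def __init__(self, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1))

    def forward(self, xt):
        return self.net(xt)

def residual_loss(model, xt, nu):
    """Residual of u_t + u*u_x - nu*u_xx = 0 at collocation points xt = (x, t)."""
    xt = xt.clone().requires_grad_(True)
    u = model(xt)
    grads = torch.autograd.grad(u.sum(), xt, create_graph=True)[0]
    u_x, u_t = grads[:, :1], grads[:, 1:]
    u_xx = torch.autograd.grad(u_x.sum(), xt, create_graph=True)[0][:, :1]
    return ((u_t + u * u_x - nu * u_xx) ** 2).mean()

def train(model, nu, steps=2000):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    xt = torch.rand(1024, 2) * 2 - 1              # collocation points in [-1, 1]^2
    for _ in range(steps):
        opt.zero_grad()
        loss = residual_loss(model, xt, nu)       # boundary/initial-condition terms omitted
        loss.backward()
        opt.step()
    return model

base = train(PINN(), nu=0.01)                      # solve one instance of the parameterized PDE
warm = PINN()
warm.load_state_dict(base.state_dict())            # transfer the learned weights
warm = train(warm, nu=0.02, steps=500)             # a nearby parameter often needs far fewer steps
```

The paper's model-aware metalearning is more elaborate than plain warm-starting, but the sketch shows why a good initialization across PDE parameters can reduce the optimization cost that the abstract highlights.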
Related papers
- Hypernetwork-based Meta-Learning for Low-Rank Physics-Informed Neural Networks [24.14254861023394]
PINNs have pioneered a proper integration of deep learning and scientific computing, but they require repetitive, time-consuming training of neural networks.
We propose lightweight low-rank PINNs containing only hundreds of model parameters and an associated hypernetwork-based meta-learning algorithm.
This suggests a path that potentially opens up the possibility for PINNs to serve as a lightweight solver.
arXiv Detail & Related papers (2023-10-14T08:13:43Z)
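A minimal sketch of the hypernetwork-plus-low-rank idea from the entry above, assuming a scalar PDE parameter mu, a single hidden layer, and rank-4 weight factors; these sizes and the conditioning scheme are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class HyperLowRankPINN(nn.Module):
    """A hypernetwork maps the PDE parameter mu to rank-r factors U, V of a
    hidden-layer weight, so each task uses only a few effective parameters."""
    def __init__(self, d_in=2, width=32, rank=4):
        super().__init__()
        self.d_in, self.width, self.rank = d_in, width, rank
        n_out = rank * (d_in + width) + width + width + 1   # U, V, hidden bias, head weight, head bias
        self.hyper = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, n_out))

    def forward(self, xt, mu):
        p = self.hyper(mu.view(1, 1)).squeeze(0)            # all task-specific parameters at once
        i = 0
        U = p[i:i + self.rank * self.d_in].view(self.rank, self.d_in); i += self.rank * self.d_in
        V = p[i:i + self.width * self.rank].view(self.width, self.rank); i += self.width * self.rank
        b = p[i:i + self.width]; i += self.width
        w = p[i:i + self.width]; i += self.width
        c = p[i]
        h = torch.tanh(xt @ (V @ U).t() + b)                # hidden layer with low-rank weight V @ U
        return h @ w.unsqueeze(1) + c                       # scalar solution u(x, t; mu)

model = HyperLowRankPINN()
u = model(torch.rand(8, 2), mu=torch.tensor(0.01))          # 8 collocation points, one PDE parameter
```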
- PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks [22.39904196850583]
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs).
We introduce a novel Transformer-based framework, termed PINNsFormer, designed to address limitations of standard PINNs.
PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-21T18:06:27Z)
- iPINNs: Incremental learning for Physics-informed neural networks [66.4795381419701]
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs).
We propose incremental PINNs that can learn multiple tasks sequentially, without additional parameters for new tasks, and improve performance for every equation in the sequence.
Our approach learns multiple PDEs starting from the simplest one by creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks.
arXiv Detail & Related papers (2023-04-10T20:19:20Z)
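A toy sketch of the subnetwork-per-PDE idea in the iPINNs entry above: each task updates only the weights selected by its own binary mask, and masks may overlap so later tasks reuse parameters learned on earlier, simpler PDEs. The random masks and the plain regression loss (standing in for a PINN residual loss) are illustrative assumptions, not the paper's selection rule.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
layer = nn.Linear(64, 64, bias=False)
# One (randomly chosen, possibly overlapping) binary mask per task/PDE.
masks = {f"task{k}": (torch.rand_like(layer.weight) < 0.5) for k in range(3)}

def masked_step(task, x, target, lr=1e-3):
    """One gradient step that touches only the weights in this task's subnetwork."""
    m = masks[task].float()
    out = x @ (layer.weight * m).t()                 # forward pass through the task's subnetwork
    loss = ((out - target) ** 2).mean()              # stand-in for a PDE residual loss
    layer.weight.grad = None
    loss.backward()
    with torch.no_grad():
        layer.weight -= lr * layer.weight.grad * m   # weights outside the mask stay frozen
    return loss.item()

x, y = torch.randn(16, 64), torch.randn(16, 64)
for task in masks:                                   # learn tasks sequentially, reusing overlaps
    for _ in range(100):
        masked_step(task, x, y)
```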
- AutoPINN: When AutoML Meets Physics-Informed Neural Networks [30.798918516407376]
PINNs enable the estimation of critical parameters, which are unobservable via physical tools, through observable variables.
Existing PINNs are often manually designed, which is time-consuming and may lead to suboptimal performance.
We propose a framework that enables the automated design of PINNs by combining AutoML and PINNs.
arXiv Detail & Related papers (2022-12-08T03:44:08Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
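A hedged sketch of what searching over PINN architectures can look like: sample width/depth/activation choices and keep the configuration with the best score. The search space, the random-search strategy, and the placeholder train_and_score routine are assumptions for illustration, not the Auto-PINN procedure itself.

```python
import random
import torch
import torch.nn as nn

SPACE = {"width": [16, 32, 64, 128], "depth": [2, 4, 6, 8], "act": [nn.Tanh, nn.SiLU]}

def build(width, depth, act):
    layers, d_in = [], 2
    for _ in range(depth):
        layers += [nn.Linear(d_in, width), act()]
        d_in = width
    return nn.Sequential(*layers, nn.Linear(d_in, 1))

def train_and_score(model):
    # Placeholder: a real run would train briefly on the PDE residual and
    # score the trained model on held-out collocation points.
    xt = torch.rand(256, 2)
    return model(xt).pow(2).mean().item()

best = None
for _ in range(20):                                   # simple random search over the space
    cfg = {k: random.choice(v) for k, v in SPACE.items()}
    score = train_and_score(build(**cfg))
    if best is None or score < best[0]:
        best = (score, cfg)
print(best)
```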
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
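A small sketch of residual-based point weighting in the AdaBoost-inspired spirit of the entry above: collocation points with larger PDE residuals receive larger weights at the next step. The exponential update, its temperature, and the random placeholder residuals are illustrative assumptions, not the paper's exact scheme.

```python
import torch

def reweight(weights, residuals, temperature=1.0):
    """Increase the weight of points the current PINN fits poorly."""
    w = weights * torch.exp(temperature * residuals.abs())
    return w / w.sum()                                # renormalize to a distribution

n = 512
weights = torch.full((n,), 1.0 / n)                   # start uniform over collocation points
for _ in range(100):
    residuals = torch.randn(n)                        # placeholder for the PDE residual r(x_i)
    loss = (weights * residuals.pow(2)).sum()         # weighted loss a real PINN would backpropagate
    weights = reweight(weights, residuals.detach())
```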
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
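The derivative trick in the entry above can be shown directly: for the Gaussian-smoothed function F(x) = E_v[f(x + v)] with v ~ N(0, sigma^2 I), Stein's identity yields Monte Carlo estimators of the gradient and Hessian that need only evaluations of f, with no nested automatic differentiation. The NumPy sketch below is a generic implementation of these estimators, not the paper's training code.

```python
import numpy as np

def stein_grad_and_hessian(f, x, sigma=0.1, n_samples=200_000, seed=None):
    """Estimate grad F(x) = E[f(x+v) v] / sigma^2 and
    Hess F(x) = E[f(x+v) (v v^T - sigma^2 I)] / sigma^4 by Monte Carlo."""
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    v = rng.normal(scale=sigma, size=(n_samples, d))            # Gaussian perturbations
    fv = f(x + v)                                               # f at perturbed points, shape (n_samples,)
    grad = (fv[:, None] * v).mean(axis=0) / sigma**2
    outer = v[:, :, None] * v[:, None, :]                       # v v^T for each sample
    hess = (fv[:, None, None] * (outer - sigma**2 * np.eye(d))).mean(axis=0) / sigma**4
    return grad, hess

# Check on f(x) = ||x||^2: the smoothed gradient is 2x and the Hessian is 2I.
x0 = np.array([0.5, -0.3])
g, H = stein_grad_and_hessian(lambda z: (z**2).sum(axis=-1), x0)
```

The estimates carry Monte Carlo noise, which is the price paid for avoiding stacked back-propagation.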
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
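A rough sketch of mixing a data loss with a PDE-residual loss when learning a solution operator, as described in the entry above. The toy fully connected operator, the 1D Poisson equation u_xx = f, and the finite-difference residual are illustrative assumptions; the sketch also does not show the different-resolution aspect mentioned in the entry.

```python
import torch
import torch.nn as nn

n = 64
op = nn.Sequential(nn.Linear(n, 256), nn.GELU(), nn.Linear(256, n))  # maps f samples -> u samples

def pde_residual(u, f, h):
    """Finite-difference residual of u_xx = f on a uniform interior grid."""
    u_xx = (u[:, 2:] - 2 * u[:, 1:-1] + u[:, :-2]) / h**2
    return ((u_xx - f[:, 1:-1]) ** 2).mean()

f = torch.randn(8, n)                      # batch of forcing functions sampled on the grid
u_data = torch.randn(8, n)                 # placeholder ground-truth solutions
h = 1.0 / (n - 1)

u_pred = op(f)
loss = ((u_pred - u_data) ** 2).mean() + pde_residual(u_pred, f, h)   # data + physics constraints
loss.backward()
```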
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.