Numerical analysis of physics-informed neural networks and related
models in physics-informed machine learning
- URL: http://arxiv.org/abs/2402.10926v1
- Date: Tue, 30 Jan 2024 10:43:27 GMT
- Title: Numerical analysis of physics-informed neural networks and related
models in physics-informed machine learning
- Authors: Tim De Ryck and Siddhartha Mishra
- Abstract summary: Physics-informed neural networks (PINNs) have been very popular in recent years as algorithms for the numerical simulation of both forward and inverse problems for partial differential equations.
We provide a unified framework in which analysis of the various components of the error incurred by PINNs in approximating PDEs can be effectively carried out.
- Score: 18.1180892910779
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) and their variants have been very
popular in recent years as algorithms for the numerical simulation of both
forward and inverse problems for partial differential equations. This article
aims to provide a comprehensive review of currently available results on the
numerical analysis of PINNs and related models that constitute the backbone of
physics-informed machine learning. We provide a unified framework in which
analysis of the various components of the error incurred by PINNs in
approximating PDEs can be effectively carried out. A detailed review of
available results on approximation, generalization and training errors and
their behavior with respect to the type of the PDE and the dimension of the
underlying domain is presented. In particular, the role of the regularity of
the solutions and their stability to perturbations in the error analysis is
elucidated. Numerical results are also presented to illustrate the theory. We
identify training errors as a key bottleneck which can adversely affect the
overall performance of various models in physics-informed machine learning.
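To fix ideas, the error analysis reviewed in the paper can be summarized schematically as follows (the notation is generic and not the paper's precise statement). If the PDE is stable in the sense that the error of any candidate solution is controlled by its residual, the total error of the trained network splits into a generalization gap (continuous residual versus its estimate on finitely many collocation points) plus the training loss actually reached by the optimizer, while the approximation error of the network class governs how small that training loss could be in principle:

```latex
% Schematic PINN error decomposition (generic notation, not the paper's precise theorem).
% R(v)   : continuous PDE residual of a candidate solution v
% R_N(v) : its quadrature / Monte Carlo estimate on N collocation points
\[
  \|u - u_{\theta^\ast}\| \;\le\; C_{\mathrm{stab}}\, \mathcal{R}(u_{\theta^\ast}),
  \qquad
  \mathcal{R}(u_{\theta^\ast})
  \;\le\;
  \underbrace{\bigl|\mathcal{R}(u_{\theta^\ast}) - \mathcal{R}_N(u_{\theta^\ast})\bigr|}_{\text{generalization gap}}
  \;+\;
  \underbrace{\mathcal{R}_N(u_{\theta^\ast})}_{\text{training error}}
\]
```

Here the stability constant C_stab reflects the regularity and perturbation-stability properties of the PDE highlighted in the abstract, and the training error is the term identified above as the key bottleneck.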
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve performance gains of up to 48% on the PDE datasets considered.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Adapting Physics-Informed Neural Networks for Bifurcation Detection in Ecological Migration Models [0.16442870218029523]
In this study, we explore the application of Physics-Informed Neural Networks (PINNs) to the analysis of bifurcation phenomena in ecological migration models.
By integrating the fundamental principles of diffusion-advection-reaction equations with deep learning techniques, we address the complexities of species migration dynamics.
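As a purely illustrative sketch of what integrating a diffusion-advection-reaction equation with a neural network means in practice (the coefficients, network and logistic reaction term below are made-up examples, not the paper's model), a PINN penalizes the PDE residual computed by automatic differentiation:

```python
import torch

# Illustrative 1D diffusion-advection-reaction residual for a PINN.
# Made-up example equation:  u_t + v * u_x - D * u_xx - r * u * (1 - u) = 0
D, v, r = 0.1, 0.5, 1.0

net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)

def grad(y, z):
    """First derivative of y with respect to z via autograd."""
    return torch.autograd.grad(y, z, torch.ones_like(y), create_graph=True)[0]

def pde_residual(x, t):
    """PDE residual evaluated at collocation points (x, t)."""
    x, t = x.requires_grad_(True), t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_t, u_x = grad(u, t), grad(u, x)
    u_xx = grad(u_x, x)
    return u_t + v * u_x - D * u_xx - r * u * (1.0 - u)

# The physics-informed loss is the mean squared residual over random collocation points.
x_c, t_c = torch.rand(256, 1), torch.rand(256, 1)
loss = pde_residual(x_c, t_c).pow(2).mean()
loss.backward()
```

A bifurcation study would then, for instance, sweep a control parameter such as r and examine how the learned solutions change.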
arXiv Detail & Related papers (2024-09-01T08:00:31Z)
- Solving Differential Equations using Physics-Informed Deep Equilibrium Models [4.237218036051422]
This paper introduces Physics-Informed Deep Equilibrium Models (PIDEQs) for solving initial value problems (IVPs) of ordinary differential equations (ODEs).
By bridging deep learning and physics-based modeling, this work advances computational techniques for solving IVPs, with implications for scientific computing and engineering applications.
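For readers unfamiliar with deep equilibrium models, the core idea is that a layer's output is defined implicitly as a fixed point of a learned map. The sketch below shows a generic DEQ layer with naive fixed-point iteration; names and sizes are assumptions, and the PIDEQ architecture, solver and gradient treatment in the paper may differ:

```python
import torch

class DEQLayer(torch.nn.Module):
    """Generic deep-equilibrium layer: the output is a fixed point z* = f(z*, x).

    Illustrative sketch only, not the paper's PIDEQ implementation.
    """
    def __init__(self, dim, hidden=64, iters=50, tol=1e-4):
        super().__init__()
        self.f = torch.nn.Sequential(
            torch.nn.Linear(2 * dim, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, dim),
        )
        self.iters, self.tol = iters, tol

    def forward(self, x):
        z = torch.zeros_like(x)
        for _ in range(self.iters):           # naive fixed-point iteration
            z_new = self.f(torch.cat([z, x], dim=-1))
            if (z_new - z).norm() < self.tol:
                return z_new
            z = z_new
        return z
```

A production DEQ would typically solve for the fixed point with an accelerated root-finder and back-propagate through it via the implicit function theorem rather than by unrolling the loop.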
arXiv Detail & Related papers (2024-06-05T17:25:29Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be computed efficiently without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
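The mechanism can be sketched generically. For the Gaussian-smoothed surrogate u_sigma(x) = E[u(x + sigma*delta)] with delta ~ N(0, I), Stein's identity expresses derivatives of u_sigma through forward evaluations of u only, e.g. the Hessian as E[u(x + sigma*delta)(delta delta^T - I)] / sigma^2, so second-order PDE terms need no nested back-propagation. The snippet below is an illustrative Monte Carlo estimator of the smoothed Laplacian, not the authors' implementation:

```python
import torch

def smoothed_laplacian(u, x, sigma=0.1, n_samples=1024):
    """Monte Carlo estimate of the Laplacian of the Gaussian-smoothed network
    u_sigma(x) = E[u(x + sigma * delta)], delta ~ N(0, I), via Stein's identity:

        Hess u_sigma(x) = E[ u(x + sigma*delta) * (delta delta^T - I) ] / sigma^2

    Only forward evaluations of u are needed; no nested back-propagation.
    Illustrative sketch: names, sampling scheme and variance reduction are
    assumptions, not the paper's implementation.
    """
    d = x.shape[-1]
    delta = torch.randn(n_samples, d)                      # delta ~ N(0, I)
    vals = u(x + sigma * delta)                            # shape (n_samples, 1)
    weight = delta.pow(2).sum(dim=1, keepdim=True) - d     # trace(delta delta^T - I)
    return (vals * weight).mean() / sigma**2               # scalar Laplacian estimate
```

In practice, antithetic or control-variate samples are typically used to reduce the variance of such estimators.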
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Composing Partial Differential Equations with Physics-Aware Neural Networks [0.831246680772592]
We introduce a physics-aware neural network (FINN) for learning advection-diffusion processes.
With, on average, only one tenth the number of parameters, FINN outperforms both pure machine-learning baselines and other state-of-the-art physics-aware models.
arXiv Detail & Related papers (2021-11-23T11:27:13Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions, as well as state-of-the-art numerical solvers such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Optimization with learning-informed differential equation constraints and its applications [0.0]
Inspired by applications in optimal control of semilinear elliptic partial differential equations and physics-integrated imaging, differential equation constrained optimization problems are studied.
A particular focus is on the analysis and on numerical methods for problems with machine-learned components.
arXiv Detail & Related papers (2020-08-25T09:05:55Z)
- Learning to Simulate Complex Physics with Graph Networks [68.43901833812448]
We present a machine learning framework and model implementation that can learn to simulate a wide variety of challenging physical domains.
Our framework, which we term "Graph Network-based Simulators" (GNS), represents the state of a physical system with particles, expressed as nodes in a graph, and computes dynamics via learned message-passing.
Our results show that our model can generalize from single-timestep predictions with thousands of particles during training, to different initial conditions, thousands of timesteps, and at least an order of magnitude more particles at test time.
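To make the architecture concrete, the sketch below implements one generic learned message-passing step over a particle graph; layer sizes, the residual update and all names are illustrative assumptions, not the exact GNS design:

```python
import torch

class InteractionStep(torch.nn.Module):
    """One learned message-passing step over a particle graph.

    Minimal sketch of the idea described above (particles as nodes, learned
    messages along edges); not the GNS paper's exact architecture.
    """
    def __init__(self, node_dim, edge_dim, hidden=128):
        super().__init__()
        self.edge_mlp = torch.nn.Sequential(
            torch.nn.Linear(2 * node_dim + edge_dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, edge_dim),
        )
        self.node_mlp = torch.nn.Sequential(
            torch.nn.Linear(node_dim + edge_dim, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, node_dim),
        )

    def forward(self, nodes, edges, senders, receivers):
        # Compute a message on every edge from its endpoint states and edge state.
        msg = self.edge_mlp(torch.cat([nodes[senders], nodes[receivers], edges], dim=-1))
        # Aggregate incoming messages per receiving node, then update node states.
        agg = torch.zeros(nodes.shape[0], msg.shape[-1], device=nodes.device)
        agg.index_add_(0, receivers, msg)
        return nodes + self.node_mlp(torch.cat([nodes, agg], dim=-1)), msg
```

The full simulator stacks several such steps and decodes per-particle dynamics that are then integrated forward in time.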
arXiv Detail & Related papers (2020-02-21T16:44:28Z)
- A deep learning framework for solution and discovery in solid mechanics [1.4699455652461721]
We present the application of a class of deep learning methods, known as Physics Informed Neural Networks (PINN), to learning and discovery in solid mechanics.
We explain how to incorporate the momentum balance and elasticity relations into PINN, and explore in detail the application to linear elasticity.
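For reference, the relations being built into the network in the linear-elastic case are the standard static momentum balance, Hooke's law and the strain-displacement relation; the paper's exact formulation and loss weighting may differ:

```latex
% Static linear elasticity: momentum balance, constitutive law, kinematics.
\[
  \nabla \cdot \sigma + f = 0, \qquad
  \sigma = \lambda \,\mathrm{tr}(\varepsilon)\, I + 2\mu\, \varepsilon, \qquad
  \varepsilon = \tfrac{1}{2}\bigl(\nabla u + \nabla u^{\mathsf{T}}\bigr)
\]
% with displacement u, body force f, and Lame parameters lambda and mu.
```

A PINN for this setting penalizes the residuals of these relations, together with boundary and measurement data, at collocation points; in the discovery setting material parameters such as lambda and mu can themselves be treated as trainable.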
arXiv Detail & Related papers (2020-02-14T08:24:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.