Correcting model misspecification in physics-informed neural networks
(PINNs)
- URL: http://arxiv.org/abs/2310.10776v1
- Date: Mon, 16 Oct 2023 19:25:52 GMT
- Title: Correcting model misspecification in physics-informed neural networks
(PINNs)
- Authors: Zongren Zou, Xuhui Meng, George Em Karniadakis
- Abstract summary: We present a general approach to correct the misspecified physical models in PINNs for discovering governing equations.
We employ other deep neural networks (DNNs) to model the discrepancy between the imperfect models and the observational data.
We envision that the proposed approach will extend the applications of PINNs for discovering governing equations in problems where the physico-chemical or biological processes are not well understood.
- Score: 2.07180164747172
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Data-driven discovery of governing equations in computational science has
emerged as a new paradigm for obtaining accurate physical models and as a
possible alternative to theoretical derivations. The recently developed
physics-informed neural networks (PINNs) have also been employed to learn
governing equations given data across diverse scientific disciplines. Despite
the effectiveness of PINNs for discovering governing equations, the physical
models encoded in PINNs may be misspecified in complex systems as some of the
physical processes may not be fully understood, leading to poor accuracy of
PINN predictions. In this work, we present a general approach to correct the
misspecified physical models in PINNs for discovering governing equations,
given some sparse and/or noisy data. Specifically, we first encode the assumed
physical models, which may be misspecified, then employ other deep neural
networks (DNNs) to model the discrepancy between the imperfect models and the
observational data. Due to the expressivity of DNNs, the proposed method is
capable of reducing the computational errors caused by the model
misspecification and thus enables the application of PINNs to complex systems
where the physical processes are not exactly known. Furthermore, we utilize the
Bayesian PINNs (B-PINNs) and/or ensemble PINNs to quantify uncertainties
arising from noisy and/or gappy data in the discovered governing equations. A
series of numerical examples including non-Newtonian channel and cavity flows
demonstrate that the added DNNs are capable of correcting the model
misspecification in PINNs, thus reducing the discrepancy between the physical
models and the observational data. We envision that the proposed approach will
extend the applications of PINNs for discovering governing equations in
problems where the physico-chemical or biological processes are not well
understood.
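To make the central idea concrete, the following is a minimal sketch (not the authors' code) of a PINN whose encoded, possibly misspecified physics residual is corrected by a second "discrepancy" network trained jointly with it. The toy assumed model nu*u''(x) + f = 0, the network sizes, and the placeholder data are illustrative assumptions; boundary-condition terms and the B-PINN/ensemble uncertainty quantification are omitted for brevity.

```python
# Sketch: PINN with an added discrepancy DNN correcting a misspecified model.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, in_dim=1, out_dim=1, width=32, depth=3):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers += [nn.Linear(d, width), nn.Tanh()]
            d = width
        layers.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

u_net = MLP()      # surrogate for the state u(x)
delta_net = MLP()  # discrepancy network that absorbs the missing physics

def residual(x, nu=0.1, f=1.0):
    # Assumed (possibly misspecified) model: nu * u''(x) + f = 0.
    # The correction delta_net(x) is added to the encoded physics.
    x = x.requires_grad_(True)
    u = u_net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return nu * d2u + f + delta_net(x)

x_f = torch.rand(128, 1)           # collocation points for the physics loss
x_d = torch.rand(16, 1)            # sparse observation locations
u_d = torch.sin(torch.pi * x_d)    # placeholder (noisy) observations

params = list(u_net.parameters()) + list(delta_net.parameters())
opt = torch.optim.Adam(params, lr=1e-3)
for step in range(5000):
    opt.zero_grad()
    loss = residual(x_f).pow(2).mean() + (u_net(x_d) - u_d).pow(2).mean()
    loss.backward()
    opt.step()
```

In the paper, this construction is further combined with Bayesian PINNs or ensemble PINNs to quantify uncertainty; the simplest analogue for the sketch above would be training several independently initialized copies and reporting the spread of their predictions.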
Related papers
- Learning solutions of parametric Navier-Stokes with physics-informed
neural networks [0.3989223013441816]
We leverage physics-informed neural networks (PINNs) to learn solution functions of the parametric Navier-Stokes equations (NSE).
We consider the parameter(s) of interest as additional inputs to the PINNs alongside the coordinates, and train the PINNs on numerical solutions of the parametric PDEs for instances of the parameters.
We show that the proposed approach yields PINN models that learn the solution functions while ensuring that flow predictions are consistent with the conservation laws of mass and momentum (a minimal sketch of this parameter-as-input idea is given after the related-papers list).
arXiv Detail & Related papers (2024-02-05T16:19:53Z) - Splitting physics-informed neural networks for inferring the dynamics of
integer- and fractional-order neuron models [0.0]
We introduce a new approach for solving forward systems of differential equations using a combination of splitting methods and physics-informed neural networks (PINNs).
The proposed method, splitting PINN, effectively addresses the challenge of applying PINNs to forward dynamical systems.
arXiv Detail & Related papers (2023-04-26T00:11:00Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural
Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - Enforcing Continuous Physical Symmetries in Deep Learning Network for
Solving Partial Differential Equations [3.6317085868198467]
We introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries of PDEs are embedded into the loss function of the PINN.
We show that SPINN performs better than PINN with fewer training points and a simpler neural network architecture.
arXiv Detail & Related papers (2022-06-19T00:44:22Z) - Auto-PINN: Understanding and Optimizing Physics-Informed Neural
Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z) - Revisiting PINNs: Generative Adversarial Physics-informed Neural
Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z) - Respecting causality is all you need for training physics-informed
neural networks [2.1485350418225244]
PINNs have not been successful in simulating dynamical systems whose solution exhibits multi-scale, chaotic or turbulent behavior.
We propose a simple re-formulation of PINNs loss functions that can explicitly account for physical causality during model training.
This is the first time that PINNs have been successful in simulating such systems, opening new opportunities for applying them to problems of industrial complexity.
arXiv Detail & Related papers (2022-03-14T18:08:18Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Characterizing possible failure modes in physics-informed neural
networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)