Bayesian Physics-Informed Neural Networks for real-world nonlinear
dynamical systems
- URL: http://arxiv.org/abs/2205.08304v1
- Date: Thu, 12 May 2022 19:04:31 GMT
- Title: Bayesian Physics-Informed Neural Networks for real-world nonlinear
dynamical systems
- Authors: Kevin Linka, Amelie Schafer, Xuhui Meng, Zongren Zou, George Em
Karniadakis, Ellen Kuhl
- Abstract summary: We integrate data, physics, and uncertainties by combining neural networks, physics-informed modeling, and Bayesian inference.
Our study reveals the inherent advantages and disadvantages of Neural Networks, Bayesian Inference, and a combination of both.
We anticipate that the underlying concepts and trends generalize to more complex disease conditions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding real-world dynamical phenomena remains a challenging task.
Across various scientific disciplines, machine learning has advanced as the
go-to technology to analyze nonlinear dynamical systems, identify patterns in
big data, and make decisions around them. Neural networks are now consistently
used as universal function approximators for data with underlying mechanisms
that are incompletely understood or exceedingly complex. However, neural
networks alone ignore the fundamental laws of physics and often fail to make
plausible predictions. Here we integrate data, physics, and uncertainties by
combining neural networks, physics-informed modeling, and Bayesian inference to
improve the predictive potential of traditional neural network models. We embed
the physical model of a damped harmonic oscillator into a fully-connected
feed-forward neural network to explore a simple and illustrative model system,
the outbreak dynamics of COVID-19. Our Physics-Informed Neural Networks can
seamlessly integrate data and physics, robustly solve forward and inverse
problems, and perform well for both interpolation and extrapolation, even for a
small amount of noisy and incomplete data. At only minor additional cost, they
can self-adaptively learn the weighting between data and physics. Combined with
Bayesian Neural Networks, they can serve as priors in a Bayesian Inference, and
provide credible intervals for uncertainty quantification. Our study reveals
the inherent advantages and disadvantages of Neural Networks, Bayesian
Inference, and a combination of both and provides valuable guidelines for model
selection. While we have only demonstrated these approaches for the simple
model problem of a seasonal endemic infectious disease, we anticipate that the
underlying concepts and trends generalize to more complex disease conditions
and, more broadly, to a wide variety of nonlinear dynamical systems.
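The core idea of the abstract, combining a data-misfit loss with a physics-residual loss for a damped harmonic oscillator, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the network size, the finite-difference residual (the paper's approach would use automatic differentiation), the oscillator parameters, and the fixed weighting `lam` (which the paper learns self-adaptively) are all assumptions made for illustration.

```python
# Sketch of a physics-informed loss for the damped harmonic oscillator
#   u'' + 2*zeta*omega*u' + omega**2*u = 0,
# combining a data term on sparse observations with a physics-residual term
# on collocation points. All choices below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
omega, zeta = 2.0, 0.1  # assumed oscillator parameters

def u_net(t, params):
    """One-hidden-layer network u_theta(t)."""
    W1, b1, W2, b2 = params
    h = np.tanh(np.outer(t, W1) + b1)
    return h @ W2 + b2

def loss(params, t_data, u_data, t_col, lam):
    # data misfit on (possibly sparse, noisy) observations
    data = np.mean((u_net(t_data, params).ravel() - u_data) ** 2)
    # physics residual via central finite differences at collocation points
    eps = 1e-3
    u = u_net(t_col, params).ravel()
    u_plus = u_net(t_col + eps, params).ravel()
    u_minus = u_net(t_col - eps, params).ravel()
    up = (u_plus - u_minus) / (2 * eps)
    upp = (u_plus - 2 * u + u_minus) / eps ** 2
    phys = np.mean((upp + 2 * zeta * omega * up + omega ** 2 * u) ** 2)
    return data + lam * phys  # lam weights physics against data

# synthetic, analytically known damped-oscillator data
t_data = np.linspace(0, 4, 10)
u_data = np.exp(-zeta * omega * t_data) * np.cos(omega * t_data)
t_col = np.linspace(0, 4, 50)  # collocation grid for the physics term

params = [rng.normal(0, 0.5, 16), rng.normal(0, 0.5, 16),
          rng.normal(0, 0.5, (16, 1)), np.zeros(1)]
print(loss(params, t_data, u_data, t_col, lam=1.0) > 0.0)  # -> True
```

Minimizing this combined loss over `params` (by any gradient-based optimizer) yields a network that both fits the data and approximately satisfies the oscillator equation; making `lam` a trainable parameter is one way to read the self-adaptive weighting mentioned above.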
Related papers
- Can physical information aid the generalization ability of Neural
Networks for hydraulic modeling? [0.0]
Application of Neural Networks to river hydraulics is still fledgling, and the field suffers from data scarcity.
We propose to mitigate this problem by introducing physical information into the training phase.
We show that incorporating such soft physical information can improve predictive capabilities.
arXiv Detail & Related papers (2024-03-13T14:51:16Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
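A hard linear equality constraint of the kind this summary describes can be enforced in closed form by projecting the raw network output onto the constraint set. The sketch below is an assumed illustration of that general idea, not the paper's KKT-hPINN code; the constraint matrix and vectors are hypothetical.

```python
# Illustrative sketch (assumed, not the paper's code): enforcing a hard
# linear equality constraint A y = b on a network output y by orthogonal
# projection, y* = y - A^T (A A^T)^{-1} (A y - b), the closed-form KKT
# correction for this equality-constrained least-distance problem.
import numpy as np

A = np.array([[1.0, 1.0, 1.0]])  # hypothetical constraint: outputs sum to b
b = np.array([1.0])

def project(y):
    """Project y onto the affine set {y : A y = b}."""
    correction = A.T @ np.linalg.solve(A @ A.T, A @ y - b)
    return y - correction

y_raw = np.array([0.2, 0.5, 0.6])  # unconstrained network output (sums to 1.3)
y_hard = project(y_raw)
print(np.allclose(A @ y_hard, b))  # constraint holds exactly -> True
```

Because the projection is affine and differentiable, it can sit as a final layer during training, so the constraint holds exactly at every forward pass rather than only approximately via a penalty term.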
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
- Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of dynamical systems.
A recently employed machine learning tool for studying dynamics is neural networks, which can be used for data-driven solution finding or discovery of differential equations.
We show that extending the model's generalizability beyond traditional statistical learning theory limits is feasible.
arXiv Detail & Related papers (2023-01-12T09:44:59Z)
- A new family of Constitutive Artificial Neural Networks towards automated model discovery [0.0]
Neural Networks are powerful approximators that can learn function relations from large data without any knowledge of the underlying physics.
We show that Constitutive Artificial Neural Networks have the potential to shift the paradigm from user-defined model selection to automated model discovery.
arXiv Detail & Related papers (2022-09-15T18:33:37Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressiveness afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.