Physics-Incorporated Convolutional Recurrent Neural Networks for Source
Identification and Forecasting of Dynamical Systems
- URL: http://arxiv.org/abs/2004.06243v3
- Date: Tue, 31 Aug 2021 03:03:46 GMT
- Title: Physics-Incorporated Convolutional Recurrent Neural Networks for Source
Identification and Forecasting of Dynamical Systems
- Authors: Priyabrata Saha, Saurabh Dash, Saibal Mukhopadhyay
- Abstract summary: In this paper, we present a hybrid framework combining numerical physics-based models with deep learning for source identification.
We formulate our model PhICNet as a convolutional recurrent neural network (RNN) that is end-to-end trainable for predicting spatio-temporal evolution.
Experimental results show that the proposed model can forecast the dynamics for a relatively long time and identify the sources as well.
- Score: 10.689157154434499
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spatio-temporal dynamics of physical processes are generally modeled using
partial differential equations (PDEs). Though the core dynamics follows some
principles of physics, real-world physical processes are often driven by
unknown external sources. In such cases, developing a purely analytical model
becomes very difficult and data-driven modeling can be of assistance. In this
paper, we present a hybrid framework combining physics-based numerical models
with deep learning for source identification and forecasting of spatio-temporal
dynamical systems with unobservable time-varying external sources. We formulate
our model PhICNet as a convolutional recurrent neural network (RNN) which is
end-to-end trainable for spatio-temporal evolution prediction of dynamical
systems and learns the source behavior as an internal state of the RNN.
Experimental results show that the proposed model can forecast the dynamics for
a relatively long time and identify the sources as well.
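The abstract specifies the architecture only at a high level. Below is a minimal sketch, assuming a 2D heat-equation physics core, of how a physics-incorporated convolutional RNN cell could combine a fixed numerical PDE update with a learned source term carried as part of the hidden state; the class name, layer shapes, stencil, and solver choice are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a physics-incorporated ConvRNN cell.
# The physics part is an assumed 2D heat equation advanced by forward Euler with
# a fixed 5-point Laplacian stencil; the learned part is a convolutional RNN that
# keeps the unobserved source field as part of its internal state.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PhysicsConvRNNCell(nn.Module):  # hypothetical name
    def __init__(self, hidden_channels=16, dt=0.1, alpha=0.2):
        super().__init__()
        self.dt, self.alpha = dt, alpha
        lap = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]])
        self.register_buffer("laplacian", lap.view(1, 1, 3, 3))  # fixed physics stencil
        self.rnn_conv = nn.Conv2d(1 + hidden_channels, hidden_channels, 3, padding=1)
        self.source_head = nn.Conv2d(hidden_channels, 1, 3, padding=1)

    def forward(self, u, h):
        # u: (B, 1, H, W) observed field; h: (B, C, H, W) hidden state of the RNN.
        h = torch.tanh(self.rnn_conv(torch.cat([u, h], dim=1)))
        source = self.source_head(h)                      # estimated external source
        lap_u = F.conv2d(u, self.laplacian, padding=1)    # physics term: alpha * Laplace(u)
        u_next = u + self.dt * (self.alpha * lap_u + source)
        return u_next, h, source

# Rollout: feed predictions back in to forecast and read off the identified source.
cell = PhysicsConvRNNCell()
u, h = torch.zeros(1, 1, 32, 32), torch.zeros(1, 16, 32, 32)
for _ in range(50):
    u, h, source = cell(u, h)
```

The design choice mirrored here is the one stated in the abstract: the unknown source is not an extra input but an internal state that the recurrent cell estimates and feeds into the physics update at every step.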
Related papers
- Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - MINN: Learning the dynamics of differential-algebraic equations and
application to battery modeling [3.900623554490941]
We propose a novel architecture for generating model-integrated neural networks (MINN).
MINN allows integration at the level of learning the physics-based dynamics of the system.
We apply the proposed neural network architecture to model the electrochemical dynamics of lithium-ion batteries.
arXiv Detail & Related papers (2023-04-27T09:11:40Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Stretched and measured neural predictions of complex network dynamics [2.1024950052120417]
Data-driven approximations of differential equations present a promising alternative to traditional methods for uncovering a model of dynamical systems.
Neural networks are a recently employed machine learning tool for studying dynamics; they can be used for data-driven solution finding or for discovering differential equations.
We show that it is feasible to extend the model's generalizability beyond the limits of traditional statistical learning theory.
arXiv Detail & Related papers (2023-01-12T09:44:59Z) - Thermodynamically Consistent Machine-Learned Internal State Variable
Approach for Data-Driven Modeling of Path-Dependent Materials [0.76146285961466]
Data-driven machine learning models, such as deep neural networks and recurrent neural networks (RNNs), have become viable alternatives.
This study proposes a machine-learned, data-driven modeling approach for path-dependent materials based on measurable material states.
arXiv Detail & Related papers (2022-05-01T23:25:08Z) - Multi-Objective Physics-Guided Recurrent Neural Networks for Identifying
Non-Autonomous Dynamical Systems [0.0]
We propose a physics-guided hybrid approach for modeling non-autonomous systems under control, in which the physics-based model is extended by a recurrent neural network and trained using a multi-objective strategy.
Experiments conducted on real data reveal substantial accuracy improvements by our approach compared to a physics-based model.
arXiv Detail & Related papers (2022-04-27T14:33:02Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z) - Constructing Neural Network-Based Models for Simulating Dynamical
Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - Physics-guided Deep Markov Models for Learning Nonlinear Dynamical
Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework, termed the Physics-guided Deep Markov Model (PgDMM).
The proposed framework takes advantage of the expressive power of deep learning, while retaining the driving physics of the dynamical system.
arXiv Detail & Related papers (2021-10-16T16:35:12Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations (a minimal illustrative sketch follows this list).
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
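For the Liquid Time-constant Networks entry above, the sketch below illustrates, under assumptions, one way a time-continuous recurrent cell built from linear first-order dynamics with an input-modulated time constant can look; the gate form, the Euler unfolding, and the name `TimeContinuousCell` are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: a recurrent cell whose hidden state follows a linear
# first-order ODE, dh/dt = f(x, h) * (a(x, h) - h), where the learned gate f
# modulates the effective time constant. Integrated here with a few Euler steps.
import torch
import torch.nn as nn

class TimeContinuousCell(nn.Module):  # hypothetical name
    def __init__(self, input_size, hidden_size, unfold_steps=4, dt=0.25):
        super().__init__()
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)
        self.drive = nn.Linear(input_size + hidden_size, hidden_size)
        self.unfold_steps, self.dt = unfold_steps, dt

    def forward(self, x, h):
        # x: (B, input_size) input at the current step; h: (B, hidden_size) state.
        for _ in range(self.unfold_steps):
            z = torch.cat([x, h], dim=-1)
            f = torch.sigmoid(self.gate(z))   # input-dependent decay / time constant
            a = torch.tanh(self.drive(z))     # bounded target the state relaxes toward
            h = h + self.dt * f * (a - h)     # explicit Euler step of the linear ODE
        return h

cell = TimeContinuousCell(input_size=8, hidden_size=32)
h = torch.zeros(4, 32)
h = cell(torch.randn(4, 8), h)
```

Because the gate output lies in (0, 1) and the drive is bounded by tanh, the state update relaxes toward a bounded target, which is consistent with the stable and bounded behavior claimed in the summary.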
This list is automatically generated from the titles and abstracts of the papers on this site.