Neural Modal ODEs: Integrating Physics-based Modeling with Neural ODEs
for Modeling High Dimensional Monitored Structures
- URL: http://arxiv.org/abs/2207.07883v1
- Date: Sat, 16 Jul 2022 09:30:20 GMT
- Title: Neural Modal ODEs: Integrating Physics-based Modeling with Neural ODEs
for Modeling High Dimensional Monitored Structures
- Authors: Zhilu Lai, Wei Liu, Xudong Jian, Kiran Bacsa, Limin Sun, Eleni Chatzi
- Abstract summary: This paper proposes a framework - termed Neural Modal ODEs - to integrate physics-based modeling with deep learning.
An autoencoder learns the abstract mappings from the first few items of observational data to the initial values of latent variables.
The decoder of the proposed model adopts the eigenmodes derived from an eigen-analysis applied to the linearized portion of a physics-based model.
- Score: 9.065343126886093
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The order/dimension of models derived on the basis of data is commonly
restricted by the number of observations, or in the context of monitored
systems, sensing nodes. This is particularly true for structural systems (e.g.
civil or mechanical structures), which are typically high-dimensional in
nature. In the scope of physics-informed machine learning, this paper proposes
a framework - termed Neural Modal ODEs - to integrate physics-based modeling
with deep learning (particularly, Neural Ordinary Differential Equations --
Neural ODEs) for modeling the dynamics of monitored and high-dimensional
engineered systems. In this initiating exploration, we restrict ourselves to
linear or mildly nonlinear systems. We propose an architecture that couples a
dynamic version of variational autoencoders with physics-informed Neural ODEs
(Pi-Neural ODEs). An encoder, as a part of the autoencoder, learns the abstract
mappings from the first few items of observational data to the initial values
of the latent variables, which drive the learning of embedded dynamics via
physics-informed Neural ODEs, imposing a modal model structure on that
latent space. The decoder of the proposed model adopts the eigenmodes derived
from an eigen-analysis applied to the linearized portion of a physics-based
model: a process implicitly carrying the spatial relationship between
degrees-of-freedom (DOFs). The framework is validated on a numerical example,
and an experimental dataset of a scaled cable-stayed bridge, where the learned
hybrid model is shown to outperform a purely physics-based approach to
modeling. We further show the functionality of the proposed scheme within the
context of virtual sensing, i.e., the recovery of generalized response
quantities in unmeasured DOFs from spatially sparse data.
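The architecture described in the abstract can be illustrated with a minimal numerical sketch. All names, shapes, and the 3-DOF mass-spring system below are hypothetical illustrations, and the learned components (the VAE encoder and the Pi-Neural ODE residual) are replaced by simple linear placeholders; only the overall flow (encoder → modal latent dynamics → eigenmode decoder) follows the paper's description.

```python
# Hypothetical sketch of the Neural Modal ODE pipeline; learned parts are
# stand-ins, not the paper's actual trained networks.
import numpy as np
from scipy.linalg import eigh
from scipy.integrate import solve_ivp

# Physics-based model: a 3-DOF mass-spring chain (M, K assumed for illustration)
n = 3
M = np.eye(n)
K = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 1.]])

# Eigen-analysis of the linearized model: mode shapes Phi and frequencies omega
w2, Phi = eigh(K, M)              # generalized eigenproblem K v = w^2 M v
omega = np.sqrt(w2)

# "Encoder": map the first observed snapshot to initial modal coordinates.
# Here a least-squares projection stands in for the learned VAE encoder.
x0 = np.array([0.1, 0.0, -0.05])  # first observed displacement snapshot
q0 = np.linalg.lstsq(Phi, x0, rcond=None)[0]
z0 = np.concatenate([q0, np.zeros(n)])   # latent state: [q, q_dot]

# Latent dynamics with a modal-model structure: decoupled oscillators
# q_ddot_i = -omega_i^2 q_i. The Pi-Neural ODE would add a learned residual
# term here to capture mild nonlinearity.
def modal_ode(t, z):
    q, qd = z[:n], z[n:]
    return np.concatenate([qd, -w2 * q])

sol = solve_ivp(modal_ode, (0.0, 5.0), z0, t_eval=np.linspace(0, 5, 100))

# "Decoder": modal superposition x(t) = Phi q(t) recovers responses at all
# DOFs, including unmeasured ones (virtual sensing).
x_t = Phi @ sol.y[:n]
print(x_t.shape)   # (3, 100): full-field response from a sparse initialization
```

Because the decoder is fixed to the physics-derived mode shapes, the spatial relationship between DOFs is carried implicitly, which is what enables recovery at unmeasured DOFs in the virtual-sensing setting.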
Related papers
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- KAN-ODEs: Kolmogorov-Arnold Network Ordinary Differential Equations for Learning Dynamical Systems and Hidden Physics [0.0]
Kolmogorov-Arnold networks (KANs) are an alternative to multi-layer perceptrons (MLPs).
This work applies KANs as the backbone of a neural ordinary differential equation (ODE) framework.
arXiv Detail & Related papers (2024-07-05T00:38:49Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics [97.38308257547186]
Many NN approaches learn an end-to-end model that implicitly models both the governing PDE and material models.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw) which utilizes a network architecture that strictly guarantees standard priors.
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Physics-guided Deep Markov Models for Learning Nonlinear Dynamical Systems with Uncertainty [6.151348127802708]
We propose a physics-guided framework, termed the Physics-guided Deep Markov Model (PgDMM).
The proposed framework takes advantage of the expressive power of deep learning, while retaining the driving physics of the dynamical system.
arXiv Detail & Related papers (2021-10-16T16:35:12Z)
- Neural Networks with Physics-Informed Architectures and Constraints for Dynamical Systems Modeling [19.399031618628864]
We develop a framework to learn dynamics models from trajectory data.
We place constraints on the values of the outputs and the internal states of the model.
We experimentally demonstrate the benefits of the proposed approach on a variety of dynamical systems.
arXiv Detail & Related papers (2021-09-14T02:47:51Z)
- Physics-Integrated Variational Autoencoders for Robust and Interpretable Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z)
- Modeling System Dynamics with Physics-Informed Neural Networks Based on Lagrangian Mechanics [3.214927790437842]
Two main modeling approaches often fail to meet requirements: first-principles methods suffer from high bias, whereas data-driven modeling tends to have high variance.
We present physics-informed neural ordinary differential equations (PINODE), a hybrid model that combines the two modeling techniques to overcome the aforementioned problems.
Our findings are of interest for model-based control and system identification of mechanical systems.
arXiv Detail & Related papers (2020-05-29T15:10:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.