Leveraging the structure of dynamical systems for data-driven modeling
- URL: http://arxiv.org/abs/2112.08458v1
- Date: Wed, 15 Dec 2021 20:09:20 GMT
- Title: Leveraging the structure of dynamical systems for data-driven modeling
- Authors: Alessandro Bucci, Onofrio Semeraro, Alexandre Allauzen, Sergio
Chibbaro and Lionel Mathelin
- Abstract summary: We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
- Score: 111.45324708884813
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The reliable prediction of the temporal behavior of complex systems is
required in numerous scientific fields. This strong interest is however
hindered by modeling issues: often, the governing equations describing the
physics of the system under consideration are not accessible or, when known,
their solution might require a computational time incompatible with the
prediction time constraints.
Nowadays, approximating the complex system at hand in a generic functional
format and informing it ex nihilo from available observations has become
common practice, as illustrated by the enormous amount of scientific work that
has appeared in recent years. Numerous successful examples based on deep neural
networks are already available, although the generalizability of the models and
their margins of guarantee are often overlooked. Here, we consider Long
Short-Term Memory neural networks and thoroughly investigate the impact of the
training set and its structure on the quality of the long-term prediction.
Leveraging ergodic theory, we analyze the amount of data sufficient to a priori
guarantee a faithful model of the physical system.
We show how an informed design of the training set, based on invariants of
the system and the structure of the underlying attractor, significantly
improves the resulting models, opening up avenues for research within the
context of active learning. Further, we illustrate the non-trivial effects of
memory initialization when relying on memory-capable models. Our findings
provide evidence-based good practice on the amount and the choice of data
required for effective data-driven modeling of any complex dynamical system.
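As a concrete illustration of the setting studied here, the following is a minimal sketch (an assumption-laden example, not the authors' code): a small PyTorch LSTM trained on one-step-ahead prediction of the Lorenz-63 system, with training windows drawn from the attractor and the LSTM memory warmed up on a short segment before a long autoregressive rollout. The system, library, and hyperparameters are illustrative choices.

```python
# Minimal sketch, not the authors' implementation: LSTM forecasting of a
# chaotic system, with attractor-based training data and memory warm-up.
import numpy as np
import torch
import torch.nn as nn


def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate Lorenz-63 with forward Euler and discard a transient so that
    the retained samples lie (approximately) on the attractor."""
    x = np.array([1.0, 1.0, 1.0])
    states = []
    for i in range(n_steps + 1000):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        if i >= 1000:  # transient removed: keep on-attractor states only
            states.append(x.copy())
    return np.array(states, dtype=np.float32)


class LSTMForecaster(nn.Module):
    def __init__(self, dim=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, dim)

    def forward(self, x, state=None):
        out, state = self.lstm(x, state)
        return self.head(out), state


data = torch.from_numpy(lorenz_trajectory(20000))
model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One-step-ahead training on windows sampled from the attractor; how such
# windows cover the attractor is the training-set design question raised above.
seq_len, batch = 50, 32
for step in range(200):
    idx = np.random.randint(0, len(data) - seq_len - 1, size=batch)
    inp = torch.stack([data[i:i + seq_len] for i in idx])
    tgt = torch.stack([data[i + 1:i + seq_len + 1] for i in idx])
    pred, _ = model(inp)
    loss = ((pred - tgt) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Memory initialization: feed a short warm-up segment to set the hidden and
# cell states (h, c) before rolling the model out autoregressively.
with torch.no_grad():
    _, state = model(data[:100].unsqueeze(0))
    x = data[99].reshape(1, 1, 3)
    rollout = []
    for _ in range(500):
        x, state = model(x, state)
        rollout.append(x.squeeze().numpy())
```

In this framing, the questions posed in the abstract become: how many training windows are needed, how they should be distributed over the attractor (e.g., consistently with its invariants), and how long the warm-up segment must be for the memory states to carry meaningful information.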
Related papers
- eXponential FAmily Dynamical Systems (XFADS): Large-scale nonlinear Gaussian state-space modeling [9.52474299688276]
We introduce a low-rank structured variational autoencoder framework for nonlinear state-space graphical models.
We show that our approach consistently learns a more predictive generative model.
arXiv Detail & Related papers (2024-03-03T02:19:49Z) - Neural Koopman prior for data assimilation [7.875955593012905]
We use a neural network architecture to embed dynamical systems in latent spaces.
We introduce methods that enable training such a model for long-term continuous reconstruction.
The potential for self-supervised learning is also demonstrated, as we show the promising use of trained dynamical models as priors for variational data assimilation techniques (a generic sketch of the latent-linear embedding idea is given after this list).
arXiv Detail & Related papers (2023-09-11T09:04:36Z) - Robustness and Generalization Performance of Deep Learning Models on
Cyber-Physical Systems: A Comparative Study [71.84852429039881]
The investigation focuses on the models' ability to handle a range of perturbations, such as sensor faults and noise.
We test the generalization and transfer learning capabilities of these models by exposing them to out-of-distribution (OOD) samples.
arXiv Detail & Related papers (2023-06-13T12:43:59Z) - Deep networks for system identification: a Survey [56.34005280792013]
System identification learns mathematical descriptions of dynamic systems from input-output data.
The main aim of the identified model is to predict new data from previous observations.
We discuss architectures commonly adopted in the literature, like feedforward, convolutional, and recurrent networks.
arXiv Detail & Related papers (2023-01-30T12:38:31Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - Learning dynamics from partial observations with structured neural ODEs [5.757156314867639]
We propose a flexible framework to incorporate a broad spectrum of physical insight into neural ODE-based system identification.
We demonstrate the performance of the proposed approach on numerical simulations and on an experimental dataset from a robotic exoskeleton.
arXiv Detail & Related papers (2022-05-25T07:54:10Z) - Bi-fidelity Modeling of Uncertain and Partially Unknown Systems using
DeepONets [0.0]
We propose a bi-fidelity modeling approach for complex physical systems.
We model the discrepancy between the true system's response and the low-fidelity response in the presence of a small training dataset.
We apply the approach to model systems that have parametric uncertainty and are partially unknown.
arXiv Detail & Related papers (2022-04-03T05:30:57Z) - Symplectic Momentum Neural Networks -- Using Discrete Variational
Mechanics as a prior in Deep Learning [7.090165638014331]
This paper introduces Symplectic Momentum Neural Networks (SyMo) as models derived from a discrete formulation of mechanics for non-separable mechanical systems.
We show that such a combination not only allows these models to learn from limited data but also provides them with the capability of preserving the symplectic form and exhibiting better long-term behaviour.
arXiv Detail & Related papers (2022-01-20T16:33:19Z) - Constructing Neural Network-Based Models for Simulating Dynamical
Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - Model-Based Deep Learning [155.063817656602]
Signal processing, communications, and control have traditionally relied on classical statistical modeling techniques.
Deep neural networks (DNNs) use generic architectures which learn to operate from data, and demonstrate excellent performance.
We are interested in hybrid techniques that combine principled mathematical models with data-driven systems to benefit from the advantages of both approaches.
arXiv Detail & Related papers (2020-12-15T16:29:49Z)
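As referenced in the Neural Koopman prior entry above, the following is a minimal, hedged sketch of the latent-linear embedding idea: an encoder maps states to a latent space, a single linear operator advances the latent state, and a decoder maps back to the original coordinates. The module names, layer sizes, and training notes are illustrative assumptions, not that paper's architecture.

```python
# Minimal sketch of a Koopman-style latent-linear model (illustrative only).
import torch
import torch.nn as nn


class KoopmanAutoencoder(nn.Module):
    def __init__(self, state_dim=3, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(state_dim, 64), nn.Tanh(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                     nn.Linear(64, state_dim))
        # Linear latent dynamics: z_{t+1} = K z_t
        self.K = nn.Linear(latent_dim, latent_dim, bias=False)

    def rollout(self, x0, n_steps):
        """Predict n_steps future states by iterating the linear operator
        in latent space and decoding each latent state."""
        z = self.encoder(x0)
        preds = []
        for _ in range(n_steps):
            z = self.K(z)
            preds.append(self.decoder(z))
        return torch.stack(preds, dim=1)


model = KoopmanAutoencoder()
x0 = torch.randn(8, 3)               # batch of initial states
trajectory = model.rollout(x0, 20)   # shape (8, 20, 3)
# Training would typically combine a reconstruction loss (decode(encode(x)) ~ x)
# with a prediction loss on rollouts; a model of this kind can then serve as a
# prior for variational data assimilation, as described in the entry above.
```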