Identifying nonlinear dynamical systems from multi-modal time series data
- URL: http://arxiv.org/abs/2111.02922v1
- Date: Thu, 4 Nov 2021 14:59:28 GMT
- Title: Identifying nonlinear dynamical systems from multi-modal time series data
- Authors: Philine Lou Bommer, Daniel Kramer, Carlo Tombolini, Georgia Koppe and Daniel Durstewitz
- Abstract summary: Empirically observed time series in physics, biology, or medicine are commonly generated by some underlying dynamical system (DS).
There is increasing interest in harnessing machine learning methods to reconstruct this latent DS in a completely data-driven, unsupervised way.
Here we propose a general framework for multi-modal data integration for the purpose of nonlinear DS identification and cross-modal prediction.
- Score: 3.721528851694675
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Empirically observed time series in physics, biology, or medicine are commonly generated by some underlying dynamical system (DS) which is the target of scientific interest. There is increasing interest in harnessing machine learning methods to reconstruct this latent DS in a completely data-driven, unsupervised way. In many areas of science it is common to sample time series observations from several data modalities simultaneously, e.g. electrophysiological and behavioral time series in a typical neuroscience experiment. However, current machine learning tools for reconstructing DSs usually focus on just one data modality. Here we propose a general framework for multi-modal data integration for the purpose of nonlinear DS identification and cross-modal prediction. This framework is based on dynamically interpretable recurrent neural networks as general approximators of nonlinear DSs, coupled to sets of modality-specific decoder models from the class of generalized linear models. Both an expectation-maximization and a variational inference algorithm for model training are developed and compared. We show on nonlinear DS benchmarks that our algorithms can efficiently compensate for excessively noisy or missing information in one data channel by exploiting other channels, and demonstrate on experimental neuroscience data how the algorithm learns to link different data domains to the underlying dynamics.
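A minimal sketch of the kind of architecture the abstract describes, assuming a piecewise-linear RNN as the latent DS model and two GLM decoders (Gaussian with identity link for continuous channels, Poisson with log link for count channels); all names and dimensions are illustrative, not the authors' code:

```python
# Illustrative sketch (not the authors' code): a shared latent RNN drives
# modality-specific generalized-linear-model decoders.
import torch
import torch.nn as nn

class MultiModalLatentDS(nn.Module):
    def __init__(self, dim_z, dim_gauss, dim_count):
        super().__init__()
        self.A = nn.Parameter(torch.eye(dim_z) * 0.9)            # linear latent dynamics
        self.W = nn.Parameter(torch.randn(dim_z, dim_z) * 0.1)   # nonlinear (ReLU) coupling
        self.h = nn.Parameter(torch.zeros(dim_z))
        self.dec_gauss = nn.Linear(dim_z, dim_gauss)  # identity link: Gaussian channel
        self.dec_count = nn.Linear(dim_z, dim_count)  # log link: Poisson channel

    def step(self, z):
        # Piecewise-linear RNN update in the spirit of PLRNN-style models
        return z @ self.A.T + torch.relu(z) @ self.W.T + self.h

    def forward(self, z0, T):
        z, zs = z0, []
        for _ in range(T):
            z = self.step(z)
            zs.append(z)
        Z = torch.stack(zs)                   # (T, dim_z) latent trajectory
        mu = self.dec_gauss(Z)                # mean of Gaussian observations
        rate = torch.exp(self.dec_count(Z))   # Poisson rates for count data
        return Z, mu, rate
```

Training would then maximize the joint log-likelihood summed over all decoders, via the EM or variational inference schemes the abstract mentions; a noisy or missing channel simply contributes less to that sum while the shared latent state remains constrained by the other channels.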
Related papers
- Foundational Inference Models for Dynamical Systems [5.549794481031468]
We offer a fresh perspective on the classical problem of imputing missing time series data, whose underlying dynamics are assumed to be determined by ODEs.
We propose a novel supervised learning framework for zero-shot time series imputation based on parametric functions satisfying some (hidden) ODEs.
We empirically demonstrate that one and the same (pretrained) recognition model can perform zero-shot imputation across 63 distinct time series with missing values.
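A toy illustration of the setup (not the paper's model): the learned recognition network is replaced here by a plain least-squares fit of a fixed parametric family, just to make "impute by evaluating a parametric function at the missing times" concrete:

```python
# Toy stand-in for a learned recognition model: fit a small parametric
# family x(t) = c0 + c1*t + c2*sin(w*t) at observed points, then evaluate
# it at the missing time points.
import numpy as np

def impute(t_obs, x_obs, t_missing):
    w = 2 * np.pi / (t_obs.max() - t_obs.min() + 1e-8)  # crude frequency guess
    basis = lambda t: np.stack([np.ones_like(t), t, np.sin(w * t)], axis=1)
    coef, *_ = np.linalg.lstsq(basis(t_obs), x_obs, rcond=None)
    return basis(t_missing) @ coef
```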
arXiv Detail & Related papers (2024-02-12T11:48:54Z) - Neural Koopman prior for data assimilation [7.875955593012905]
We use a neural network architecture to embed dynamical systems in latent spaces.
We introduce methods that make it possible to train such a model for long-term continuous reconstruction.
The potential for self-supervised learning is also demonstrated, as we show the promising use of trained dynamical models as priors for variational data assimilation techniques.
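A minimal sketch of a Koopman-style embedding, assuming an autoencoder whose latent dynamics are a single linear map; the architecture and sizes are illustrative assumptions:

```python
# Illustrative Koopman-style autoencoder: the encoder embeds the state in a
# latent space where one step of the dynamics is a fixed linear map K.
import torch
import torch.nn as nn

class KoopmanAE(nn.Module):
    def __init__(self, dim_x, dim_z):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim_x, 64), nn.Tanh(), nn.Linear(64, dim_z))
        self.dec = nn.Sequential(nn.Linear(dim_z, 64), nn.Tanh(), nn.Linear(64, dim_x))
        self.K = nn.Linear(dim_z, dim_z, bias=False)  # linear latent dynamics

    def rollout(self, x0, T):
        z = self.enc(x0)
        preds = []
        for _ in range(T):
            z = self.K(z)             # advance linearly in latent space
            preds.append(self.dec(z))
        return torch.stack(preds)     # long-term continuous reconstruction
```

Because latent steps are linear, long rollouts stay cheap, which is what makes such a trained model attractive as a prior for variational data assimilation.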
arXiv Detail & Related papers (2023-09-11T09:04:36Z) - Learning Latent Dynamics via Invariant Decomposition and
(Spatio-)Temporal Transformers [0.6767885381740952]
We propose a method for learning dynamical systems from high-dimensional empirical data.
We focus on the setting in which data are available from multiple different instances of a system.
We study behaviour through simple theoretical analyses and extensive experiments on synthetic and real-world datasets.
arXiv Detail & Related papers (2023-06-21T07:52:07Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of large-kernel convolutional neural network (LKCNN) models.
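A sketch of a large-kernel 1D CNN classifier of the LKCNN type; the hyperparameters here are assumptions, not taken from the paper:

```python
# Illustrative large-kernel 1D CNN for regular-vs-chaotic classification.
import torch
import torch.nn as nn

class LKCNN(nn.Module):
    def __init__(self, kernel_size=100):
        super().__init__()
        self.conv = nn.Conv1d(1, 8, kernel_size)  # large kernel spans many periods
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(8, 2))

    def forward(self, x):                          # x: (batch, 1, T), T >= kernel_size
        return self.head(torch.relu(self.conv(x)))  # logits: regular vs chaotic
```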
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Continuous time recurrent neural networks: overview and application to
forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
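A minimal sketch of one common continuous-time mechanism (illustrative, not necessarily the CTRNN variant used in the paper): the hidden state decays over the inter-observation gap dt before each update, so irregular sampling times enter the model explicitly:

```python
# Illustrative continuous-time recurrent update for irregular observations.
import torch
import torch.nn as nn

class CTGRUCell(nn.Module):
    def __init__(self, dim_x, dim_h):
        super().__init__()
        self.cell = nn.GRUCell(dim_x, dim_h)
        self.log_tau = nn.Parameter(torch.zeros(dim_h))  # learnable time constants

    def forward(self, x, h, dt):
        # dt: time since last observation (scalar or (batch, 1) tensor)
        h = h * torch.exp(-dt / torch.exp(self.log_tau))  # decay over the gap
        return self.cell(x, h)
```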
arXiv Detail & Related papers (2023-04-14T09:39:06Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
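A minimal sketch of the recover-by-autodiff step, assuming a pretrained differentiable surrogate network; the function and variable names are illustrative:

```python
# Illustrative parameter recovery: a frozen surrogate mimics the simulator,
# and unknown parameters are fit to measured data by gradient descent
# through the surrogate via automatic differentiation.
import torch

def recover(surrogate, data, theta0, steps=500, lr=1e-2):
    theta = theta0.clone().requires_grad_(True)
    opt = torch.optim.Adam([theta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.mean((surrogate(theta) - data) ** 2)  # fit surrogate output
        loss.backward()                                    # autodiff through the surrogate
        opt.step()
    return theta.detach()
```

Since the surrogate is trained only once, each new experimental dataset only costs this cheap optimization loop, which is what enables real-time use.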
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Online Evolutionary Neural Architecture Search for Multivariate
Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
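A toy neuro-evolution loop to illustrate the general idea (not ONE-NAS itself): candidate RNN configurations are scored on recent data, the better half survives, and mutated copies fill the next generation as the stream advances:

```python
# Toy evolutionary loop; assumes a population of at least two candidates,
# a fitness function (lower = better), and a mutation operator.
import random

def evolve(population, fitness, mutate, generations=10):
    for _ in range(generations):
        scored = sorted(population, key=fitness)         # lower error = better
        parents = scored[: len(scored) // 2]             # keep the better half
        children = [mutate(random.choice(parents)) for _ in parents]
        population = parents + children
    return min(population, key=fitness)
```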
arXiv Detail & Related papers (2023-02-20T22:25:47Z) - Integrating Multimodal Data for Joint Generative Modeling of Complex Dynamics [6.848555909346641]
We provide an efficient framework to combine various sources of information for optimal reconstruction.
Our framework is fully generative, producing, after training, trajectories with the same geometrical and temporal structure as those of the ground truth system.
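As an illustration of the generative use, and reusing the hypothetical MultiModalLatentDS sketch given under the main abstract above, trajectories can be produced by rolling the latent model forward freely and decoding, with no observed data fed in:

```python
# Illustrative generation: free-running latent rollout, then decoding.
import torch

@torch.no_grad()
def generate(model, z0, T):
    Z, mu, rate = model(z0, T)        # MultiModalLatentDS sketch from above
    return mu, torch.poisson(rate)    # continuous and count trajectories
```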
arXiv Detail & Related papers (2022-12-15T15:21:28Z) - Consistency of mechanistic causal discovery in continuous-time using
Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
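A minimal sketch of the penalization idea, assuming one network per state variable and a group-lasso penalty on first-layer input columns (a common formulation; the details here are assumptions, not the paper's exact construction):

```python
# Illustrative penalized neural-ODE vector field: a group-lasso penalty on
# input columns shrinks away variables that are not causal parents.
import torch
import torch.nn as nn

class PenalizedODEFunc(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.nets = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 32), nn.Tanh(), nn.Linear(32, 1))
            for _ in range(dim)
        )

    def forward(self, x):                           # dx/dt = f(x)
        return torch.cat([net(x) for net in self.nets], dim=-1)

    def group_lasso(self):
        # L2 norm of each input column; a zero column means no causal edge
        return sum(net[0].weight.norm(dim=0).sum() for net in self.nets)
```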
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep
Representation Learning from Sporadic Temporal Data [1.8352113484137622]
In this paper, a novel deep learning-based model is developed for modeling multiple temporal features in sporadic data.
The proposed model, called CARRNN, uses a generalized discrete-time autoregressive model that is trainable end-to-end using neural networks modulated by time lags.
It is applied to multivariate time-series regression tasks using data provided for Alzheimer's disease progression modeling and intensive care unit (ICU) mortality rate prediction.
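A minimal sketch of a lag-modulated autoregressive step in the spirit of this summary (illustrative; not the CARRNN equations themselves):

```python
# Illustrative lag-modulated autoregression: the AR term is scaled by the
# time gap dt between sporadic observations.
import torch
import torch.nn as nn

class CARCell(nn.Module):
    def __init__(self, dim_x, dim_h):
        super().__init__()
        self.rnn = nn.RNNCell(dim_x, dim_h)
        self.ar = nn.Linear(dim_x, dim_x)            # autoregressive map
        self.out = nn.Linear(dim_h, dim_x)

    def forward(self, x_prev, h, dt):
        h = self.rnn(x_prev, h)
        x_next = self.out(h) + dt * self.ar(x_prev)  # lag-modulated AR term
        return x_next, h
```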
arXiv Detail & Related papers (2021-04-08T12:43:44Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
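A minimal sketch of a liquid time-constant update, following the published form dx/dt = -[1/tau + f(x, I)] x + f(x, I) A with a simple Euler step; the sizes and the choice of f are illustrative assumptions:

```python
# Illustrative liquid time-constant cell: a linear first-order ODE whose
# effective time constant is modulated by a learned nonlinearity f.
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    def __init__(self, dim_in, dim_h):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim_in + dim_h, dim_h), nn.Sigmoid())
        self.tau = nn.Parameter(torch.ones(dim_h))   # base time constants
        self.A = nn.Parameter(torch.zeros(dim_h))    # equilibrium term

    def forward(self, x, h, dt=0.1):
        g = self.f(torch.cat([x, h], dim=-1))
        dh = -(1.0 / self.tau + g) * h + g * self.A  # bounded, stable dynamics
        return h + dt * dh                           # Euler integration step
```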
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.