Modeling Irregular Time Series with Continuous Recurrent Units
- URL: http://arxiv.org/abs/2111.11344v1
- Date: Mon, 22 Nov 2021 16:49:15 GMT
- Title: Modeling Irregular Time Series with Continuous Recurrent Units
- Authors: Mona Schirmer, Mazin Eltayeb, Stefan Lessmann, Maja Rudolph
- Abstract summary: We propose continuous recurrent units (CRUs) to handle irregular time intervals between observations.
We show that CRU can better interpolate irregular time series than neural ordinary differential equation (neural ODE)-based models.
We also show that our model can infer dynamics from images and that the Kalman gain efficiently singles out candidates for valuable state updates from noisy observations.
- Score: 3.7335080869292483
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recurrent neural networks (RNNs) like long short-term memory networks (LSTMs)
and gated recurrent units (GRUs) are a popular choice for modeling sequential
data. Their gating mechanism permits weighting previous history encoded in a
hidden state with new information from incoming observations. In many
applications, such as medical records, observation times are irregular and
carry important information. However, LSTMs and GRUs assume constant time
intervals between observations. To address this challenge, we propose
continuous recurrent units (CRUs), a neural architecture that can naturally
handle irregular time intervals between observations. The gating mechanism of
the CRU employs the continuous formulation of a Kalman filter and alternates
between (1) continuous latent state propagation according to a linear
stochastic differential equation (SDE) and (2) latent state updates whenever a
new observation comes in. In an empirical study, we show that the CRU can
better interpolate irregular time series than neural ordinary differential
equation (neural ODE)-based models. We also show that our model can infer
dynamics from images and that the Kalman gain efficiently singles out
candidates for valuable state updates from noisy observations.
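To make the alternation concrete, below is a minimal sketch of the continuous-discrete Kalman filter that the CRU's gating mechanism builds on: between observations, the latent mean and covariance are propagated in closed form under a linear time-invariant SDE, and at each irregularly timed observation the Kalman gain weighs the propagated state against the new measurement. The matrices A, Q, H, R and the toy observation sequence are illustrative assumptions rather than values from the paper, and the sketch omits the CRU's neural encoder and decoder.

```python
import numpy as np
from scipy.linalg import expm

def predict(m, P, A, Q, dt):
    """Propagate mean m and covariance P over a gap of length dt under the
    linear SDE dz = A z dt + dW with diffusion covariance Q (Van Loan's method)."""
    n = A.shape[0]
    M = np.block([[-A, Q], [np.zeros((n, n)), A.T]]) * dt
    E = expm(M)
    Phi = E[n:, n:].T            # transition matrix e^{A dt}
    Qd = Phi @ E[:n, n:]         # discretized process-noise covariance
    return Phi @ m, Phi @ P @ Phi.T + Qd

def update(m, P, y, H, R):
    """Discrete Kalman update; the gain K acts as the 'gate' weighting the
    propagated history against the incoming observation y."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    m = m + K @ (y - H @ m)
    P = (np.eye(len(m)) - K @ H) @ P
    return m, P

# Toy latent oscillator observed at irregular times (illustrative values only).
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
Q = 0.01 * np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
m, P, t_prev = np.zeros(2), np.eye(2), 0.0
for t, y in [(0.3, [0.1]), (1.1, [0.8]), (1.25, [0.7])]:
    m, P = predict(m, P, A, Q, t - t_prev)   # (1) continuous propagation
    m, P = update(m, P, np.array(y), H, R)   # (2) update at the observation
    t_prev = t
```

Here the gain K plays the role of an RNN gate: when the measurement noise R is large relative to the propagated uncertainty P, K shrinks and the update leaves the latent state nearly unchanged, which is how noisy observations are prevented from overwriting valuable state.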
Related papers
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Neural Continuous-Discrete State Space Models for Irregularly-Sampled Time Series [18.885471782270375]
NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables.
We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference.
Empirical results on multiple benchmark datasets show improved imputation and forecasting performance of NCDSSM over existing models.
arXiv Detail & Related papers (2023-01-26T18:45:04Z)
- Continuous Depth Recurrent Neural Differential Equations [0.0]
We propose continuous depth recurrent neural differential equations (CDR-NDE) to generalize RNN models.
CDR-NDE considers two separate differential equations, one over the temporal dimension and one over the depth dimension, and models the evolution in both directions.
We also propose the CDR-NDE-heat model based on partial differential equations which treats the computation of hidden states as solving a heat equation over time.
arXiv Detail & Related papers (2022-12-28T06:34:32Z)
- Gated Recurrent Neural Networks with Weighted Time-Delay Feedback [59.125047512495456]
We introduce a novel gated recurrent unit (GRU) with a weighted time-delay feedback mechanism.
We show that $\tau$-GRU can converge faster and generalize better than state-of-the-art recurrent units and gated recurrent architectures.
arXiv Detail & Related papers (2022-12-01T02:26:34Z)
- Autoregressive GNN-ODE GRU Model for Network Dynamics [7.272158647379444]
We propose an Autoregressive GNN-ODE GRU Model (AGOG) to learn and capture the continuous network dynamics.
Our model can capture the continuous dynamic process of complex systems accurately and make predictions of node states with minimal error.
arXiv Detail & Related papers (2022-11-19T05:43:10Z)
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
- CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep Representation Learning from Sporadic Temporal Data [1.8352113484137622]
In this paper, a novel deep learning-based model is developed for modeling multiple temporal features in sporadic data.
The proposed model, called CARRNN, uses a generalized discrete-time autoregressive model that is trainable end-to-end using neural networks modulated by time lags.
It is applied to multivariate time-series regression tasks using data provided for Alzheimer's disease progression modeling and intensive care unit (ICU) mortality rate prediction.
arXiv Detail & Related papers (2021-04-08T12:43:44Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Crop Classification under Varying Cloud Cover with Neural Ordinary Differential Equations [23.93148719731374]
State-of-the-art methods for crop classification rely on techniques that implicitly assume regular temporal spacing between observations.
We propose to use neural ordinary differential equations (NODEs) in combination with recurrent neural networks (RNNs) to classify crop types in irregularly spaced image sequences.
arXiv Detail & Related papers (2020-12-04T11:56:50Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems (a sketch of the assumed dynamics follows this list).
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
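As referenced in the last entry, a sketch of the assumed liquid time-constant dynamics, reconstructed from the cited paper's published formulation rather than from this summary (here \tau is a base time constant, A a learned bias vector, and f a learned nonlinearity with parameters \theta acting on the state x(t) and input I(t)):

```latex
\frac{d\mathbf{x}(t)}{dt}
  = -\left[\frac{1}{\tau} + f\bigl(\mathbf{x}(t), \mathbf{I}(t), t, \theta\bigr)\right]\mathbf{x}(t)
  + f\bigl(\mathbf{x}(t), \mathbf{I}(t), t, \theta\bigr)\, A
```

Under this formulation the effective time constant \tau_{sys} = \tau / (1 + \tau f(\cdot)) varies with the state and input, hence "liquid".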
This list is automatically generated from the titles and abstracts of the papers on this site.