Continuous time recurrent neural networks: overview and application to
forecasting blood glucose in the intensive care unit
- URL: http://arxiv.org/abs/2304.07025v1
- Date: Fri, 14 Apr 2023 09:39:06 GMT
- Authors: Oisin Fitzgerald, Oscar Perez-Concha, Blanca Gallego-Luxan, Alejandro
Metke-Jimenez, Lachlan Rudd, Louisa Jorm
- Abstract summary: Continuous time autoregressive recurrent neural networks (CTRNNs) are a class of deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Irregularly measured time series are common in many of the applied settings
in which time series modelling is a key statistical tool, including medicine.
This poses challenges for model choice, often necessitating imputation or
similar strategies. Continuous time autoregressive recurrent neural networks
(CTRNNs) are a class of deep learning models that account for irregular
observations by incorporating continuous evolution of the hidden states between
observations. This is achieved using a neural ordinary differential equation
(ODE) or neural flow layer. In this manuscript, we give an overview of these
models, including the varying architectures that have been proposed to account
for issues such as ongoing medical interventions. Further, we demonstrate the
application of these models to probabilistic forecasting of blood glucose in a
critical care setting using electronic medical record and simulated data. The
experiments confirm that addition of a neural ODE or neural flow layer
generally improves the performance of autoregressive recurrent neural networks
in the irregular measurement setting. However, several CTRNN architectures are
outperformed by an autoregressive gradient boosted tree model (Catboost), with
only a long short-term memory (LSTM) and neural ODE based architecture
(ODE-LSTM) achieving comparable performance on probabilistic forecasting
metrics such as the continuous ranked probability score (ODE-LSTM:
0.118$\pm$0.001; Catboost: 0.118$\pm$0.001), ignorance score (0.152$\pm$0.008;
0.149$\pm$0.002) and interval score (175$\pm$1; 176$\pm$1).
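To make the core CTRNN idea concrete, here is a minimal illustrative sketch (not the authors' implementation) of an autoregressive RNN whose hidden state evolves continuously between irregular observation times. The dynamics dh/dt = tanh(W_h h) - h, the Euler integrator, the random weights, and the toy series are all assumptions for illustration; a real model would learn these parameters and typically use a neural ODE solver.

```python
import numpy as np

rng = np.random.default_rng(0)
H, D = 8, 1  # hidden size, observation size

# Randomly initialised weights stand in for trained parameters.
W_h = rng.normal(0, 0.3, (H, H))   # hidden dynamics
W_x = rng.normal(0, 0.3, (H, D))   # observation update
w_out = rng.normal(0, 0.3, H)      # readout

def ode_step(h, dt, n_steps=10):
    """Evolve the hidden state continuously between observations by
    Euler-integrating the (assumed) ODE dh/dt = tanh(W_h h) - h."""
    step = dt / n_steps
    for _ in range(n_steps):
        h = h + step * (np.tanh(W_h @ h) - h)
    return h

def forecast(times, values):
    """Autoregressive pass over an irregularly sampled series; the hidden
    state decays continuously over each gap, then is updated discretely
    at each observation, as in an ODE-RNN."""
    h = np.zeros(H)
    t_prev, preds = times[0], []
    for t, x in zip(times, values):
        h = ode_step(h, t - t_prev)              # continuous evolution
        h = np.tanh(h + W_x @ np.atleast_1d(x))  # discrete RNN update
        preds.append(w_out @ h)
        t_prev = t
    return preds

times = np.array([0.0, 0.4, 1.7, 1.9, 3.2])  # irregular measurement times
values = np.sin(times)
print(forecast(times, values))
```

The point of the sketch is the two-phase update: the gap length (t - t_prev) enters the hidden-state evolution directly, so no imputation onto a regular grid is needed.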
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Neural Differential Recurrent Neural Network with Adaptive Time Steps [11.999568208578799]
We propose an RNN-based model, called RNN-ODE-Adap, that uses a neural ODE to represent the time development of the hidden states.
We adaptively select time steps based on the steepness of changes of the data over time so as to train the model more efficiently for the "spike-like" time series.
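The steepness-based step selection described above can be sketched as follows; the function name, the threshold, and the subdivision rule are my own illustrative choices, not the paper's algorithm.

```python
import numpy as np

def adaptive_steps(t, y, threshold=0.5):
    """Place extra evaluation points where the change |dy| between
    consecutive samples is large, so steep "spike-like" regions get
    finer time resolution (illustrative sketch only)."""
    out = [t[0]]
    for i in range(1, len(t)):
        # Subdivide each interval proportionally to the change in y.
        n_sub = max(1, int(np.ceil(abs(y[i] - y[i - 1]) / threshold)))
        out.extend(np.linspace(t[i - 1], t[i], n_sub + 1)[1:])
    return np.array(out)

t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.0, 0.1, 2.1, 2.2])  # a spike between t=1 and t=2
print(adaptive_steps(t, y))
```

Flat intervals keep a single step, while the steep interval is split into several, concentrating solver work where the dynamics change fastest.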
arXiv Detail & Related papers (2023-06-02T16:46:47Z)
- Brain-Inspired Spiking Neural Network for Online Unsupervised Time Series Prediction [13.521272923545409]
We present a novel Continuous Learning-based Unsupervised Recurrent Spiking Neural Network Model (CLURSNN).
CLURSNN makes online predictions by reconstructing the underlying dynamical system using Random Delay Embedding.
We show that the proposed online time series prediction methodology outperforms state-of-the-art DNN models when predicting an evolving Lorenz63 dynamical system.
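For context, delay embedding reconstructs a dynamical system's state space from a scalar series. Below is a plain fixed-delay (Takens-style) embedding as a sketch; the paper uses a random-delay variant, which this does not reproduce.

```python
import numpy as np

def delay_embed(x, dim=3, tau=2):
    """Embed a scalar series into dim-dimensional delay vectors
    [x[i], x[i + tau], ..., x[i + (dim - 1) * tau]] (Takens-style;
    a plain fixed-delay version, not the paper's random variant)."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

x = np.arange(10.0)
emb = delay_embed(x, dim=3, tau=2)
print(emb.shape)  # one reconstructed state vector per row
```

Each row of the result is one point in the reconstructed state space, which is what a predictor then operates on.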
arXiv Detail & Related papers (2023-04-10T16:18:37Z)
- A predictive physics-aware hybrid reduced order model for reacting flows [65.73506571113623]
A new hybrid predictive Reduced Order Model (ROM) is proposed to solve reacting flow problems.
The number of degrees of freedom is reduced from thousands of temporal points to a few POD modes with their corresponding temporal coefficients.
Two different deep learning architectures have been tested to predict the temporal coefficients.
arXiv Detail & Related papers (2023-01-24T08:39:20Z)
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z)
- Stacked Residuals of Dynamic Layers for Time Series Anomaly Detection [0.0]
We present an end-to-end differentiable neural network architecture to perform anomaly detection in multivariate time series.
The architecture is a cascade of dynamical systems designed to separate linearly predictable components of the signal.
The anomaly detector exploits the temporal structure of the prediction residuals to detect both isolated point anomalies and set-point changes.
arXiv Detail & Related papers (2022-02-25T01:50:22Z)
- ANNETTE: Accurate Neural Network Execution Time Estimation with Stacked Models [56.21470608621633]
We propose a time estimation framework to decouple the architectural search from the target hardware.
The proposed methodology extracts a set of models from micro-kernel and multi-layer benchmarks and generates a stacked model for mapping and network execution time estimation.
We compare estimation accuracy and fidelity of the generated mixed models, statistical models with the roofline model, and a refined roofline model for evaluation.
arXiv Detail & Related papers (2021-05-07T11:39:05Z)
- CARRNN: A Continuous Autoregressive Recurrent Neural Network for Deep Representation Learning from Sporadic Temporal Data [1.8352113484137622]
In this paper, a novel deep learning-based model is developed for modeling multiple temporal features in sporadic data.
The proposed model, called CARRNN, uses a generalized discrete-time autoregressive model that is trainable end-to-end using neural networks modulated by time lags.
It is applied to multivariate time-series regression tasks using data provided for Alzheimer's disease progression modeling and intensive care unit (ICU) mortality rate prediction.
arXiv Detail & Related papers (2021-04-08T12:43:44Z)
- Dynamic Time Warping as a New Evaluation for Dst Forecast with Machine Learning [0.0]
We train a neural network to make a forecast of the disturbance storm time index at origin time $t$ with a forecasting horizon of 1 up to 6 hours.
Inspection of the model's results with the correlation coefficient and RMSE indicated a performance comparable to the latest publications.
A new method is proposed to measure whether two time series are shifted in time with respect to each other.
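The shift-tolerant comparison behind this idea is the classic dynamic-programming form of dynamic time warping; the sketch below is the textbook algorithm with absolute-difference cost, not the paper's refined evaluation method.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW: cost of the cheapest monotone
    alignment between two series, which tolerates time shifts that a
    pointwise metric like RMSE penalises heavily."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

a = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
b = np.array([0.0, 0.0, 1.0, 2.0, 1.0])  # same shape, shifted one step
print(dtw_distance(a, b))
```

Here the two series differ only by a one-step shift, so the DTW cost stays small even though their pointwise differences are large, which is exactly the property that makes it useful for diagnosing lagged forecasts.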
arXiv Detail & Related papers (2020-06-08T15:14:13Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior, and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.