Simple Recurrent Neural Networks is all we need for clinical events
predictions using EHR data
- URL: http://arxiv.org/abs/2110.00998v1
- Date: Sun, 3 Oct 2021 13:07:23 GMT
- Title: Simple Recurrent Neural Networks is all we need for clinical events
predictions using EHR data
- Authors: Laila Rasmy, Jie Zhu, Zhiheng Li, Xin Hao, Hong Thoai Tran, Yujia
Zhou, Firat Tiryaki, Yang Xiang, Hua Xu, Degui Zhi
- Abstract summary: Recurrent neural networks (RNNs) are a common architecture for EHR-based clinical events predictive models.
We used two prediction tasks: the risk for developing heart failure and the risk of early readmission for inpatient hospitalization.
We found that simple gated RNN models, including GRUs and LSTMs, often offer competitive results when properly tuned with Bayesian Optimization.
- Score: 22.81278657120305
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Recently, there has been great interest in investigating the application of deep
learning models for the prediction of clinical events using electronic health
records (EHR) data. In EHR data, a patient's history is often represented as a
sequence of visits, and each visit contains multiple events. As a result, deep
learning models developed for sequence modeling, like recurrent neural networks
(RNNs), are a common architecture for EHR-based clinical events predictive models.
While a large variety of RNN models were proposed in the literature, it is
unclear if complex architecture innovations will offer superior predictive
performance. In order to move this field forward, a rigorous evaluation of
various methods is needed. In this study, we conducted a thorough benchmark of
RNN architectures in modeling EHR data. We used two prediction tasks: the risk
for developing heart failure and the risk of early readmission for inpatient
hospitalization. We found that simple gated RNN models, including GRUs and
LSTMs, often offer competitive results when properly tuned with Bayesian
Optimization, which is in line with similar findings in the natural language
processing (NLP) domain. For reproducibility, our codebase is shared at
https://github.com/ZhiGroup/pytorch_ehr.
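The benchmark treats a patient's history as a sequence of visits, each a bag of clinical codes, fed to a gated RNN that outputs an event risk. As a rough, hedged illustration of that data flow (a minimal NumPy sketch, not the authors' pytorch_ehr implementation; all dimensions, weight initializations, and the multi-hot visit encoding are assumptions for illustration), the following runs a bare GRU cell over a toy visit sequence:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell (biases omitted for brevity)."""
    def __init__(self, input_dim, hidden_dim):
        shape = (hidden_dim, input_dim + hidden_dim)
        self.Wz = rng.normal(0.0, 0.1, shape)  # update-gate weights
        self.Wr = rng.normal(0.0, 0.1, shape)  # reset-gate weights
        self.Wh = rng.normal(0.0, 0.1, shape)  # candidate-state weights

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                               # update gate
        r = sigmoid(self.Wr @ xh)                               # reset gate
        h_cand = np.tanh(self.Wh @ np.concatenate([x, r * h]))  # candidate state
        return (1.0 - z) * h + z * h_cand

def encode_visit(codes, vocab_size):
    """Multi-hot vector over the clinical codes recorded in one visit."""
    x = np.zeros(vocab_size)
    x[codes] = 1.0
    return x

def predict_risk(visits, cell, w_out, vocab_size):
    """Run the GRU over the visit sequence; squash the final state to a risk."""
    h = np.zeros(w_out.shape[0])
    for codes in visits:
        h = cell.step(encode_visit(codes, vocab_size), h)
    return float(sigmoid(w_out @ h))

VOCAB, HIDDEN = 50, 8
cell = GRUCell(VOCAB, HIDDEN)
w_out = rng.normal(0.0, 0.1, HIDDEN)
patient = [[3, 17], [3, 24, 41], [8]]  # three visits, each a list of code indices
risk = predict_risk(patient, cell, w_out, VOCAB)
```

With trained weights (and hyperparameters tuned, e.g., by Bayesian Optimization as in the paper), `risk` would estimate the probability of the target event such as heart failure onset or early readmission; here the untrained output merely illustrates how visit sequences flow through a gated RNN.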
Related papers
- Synthesizing Multimodal Electronic Health Records via Predictive Diffusion Models [69.06149482021071]
We propose a novel EHR data generation model called EHRPD.
It is a diffusion-based model designed to predict the next visit based on the current one while also incorporating time interval estimation.
We conduct experiments on two public datasets and evaluate EHRPD from fidelity, privacy, and utility perspectives.
arXiv Detail & Related papers (2024-06-20T02:20:23Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- Learning to Adapt Clinical Sequences with Residual Mixture of Experts [12.881413375147996]
We propose a Mixture-of-Experts (MoE) architecture to represent complex dynamics of all patients.
The architecture consists of multiple (expert) RNN models covering patient sub-populations and refining the predictions of the base model.
We show 4.1% gain on AUPRC statistics compared to a single RNN prediction.
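The residual refinement described above can be sketched in a few lines (an illustrative NumPy toy, not the paper's exact design; the gating form, dimensions, and names are assumptions): a softmax gate over the patient representation weights per-expert corrections that are added to the base model's logit.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def moe_refine(h, base_logit, expert_w, gate_w):
    """Add gate-weighted expert corrections to the base model's logit."""
    gates = softmax(gate_w @ h)   # one mixing weight per expert
    corrections = expert_w @ h    # each expert's residual correction
    return base_logit + float(gates @ corrections)

HIDDEN, N_EXPERTS = 8, 3
h = rng.normal(size=HIDDEN)                    # patient state from the base RNN
gate_w = rng.normal(size=(N_EXPERTS, HIDDEN))
expert_w = rng.normal(size=(N_EXPERTS, HIDDEN))
refined = moe_refine(h, 0.2, expert_w, gate_w)
```

If every expert's correction is zero, the refined score reduces to the base model's prediction, which is the sense in which the mixture is "residual".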
arXiv Detail & Related papers (2022-04-06T09:23:12Z)
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages can encourage learning representations that can be adapted to new unseen topologies or prediction of new circuit level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
- ONE-NAS: An Online NeuroEvolution based Neural Architecture Search for Time Series Forecasting [3.3758186776249928]
This work presents the Online NeuroEvolution based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is the first neural architecture search algorithm capable of automatically designing and training new recurrent neural networks (RNNs) in an online setting.
It is shown to outperform traditional statistical time series forecasting methods, including naive, moving average, and exponential smoothing.
arXiv Detail & Related papers (2022-02-27T22:58:32Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Multimodal Learning for Cardiovascular Risk Prediction using EHR Data [0.9805331696863404]
We propose a recurrent neural network model for cardiovascular risk prediction that integrates both medical texts and structured clinical information.
The BiLSTM model concatenates word embeddings with classical clinical predictors before applying them to a final fully connected neural network.
The model is evaluated on a dataset of real-world patients with manifest vascular disease or at high risk for cardiovascular disease.
arXiv Detail & Related papers (2020-08-27T08:09:02Z)
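The multimodal entry above fuses a text encoding with structured clinical predictors before a final dense layer. A minimal sketch of that late-fusion step (illustrative only: the mean-pooled text encoder stands in for the paper's BiLSTM, and all names, dimensions, and feature choices are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fuse_and_score(token_embs, structured, w):
    """Mean-pool the note's token embeddings, concatenate with structured
    clinical predictors, and score the fused vector with a dense layer."""
    fused = np.concatenate([token_embs.mean(axis=0), structured])
    return float(sigmoid(w @ fused))

EMB, N_STRUCT = 16, 4
token_embs = rng.normal(size=(20, EMB))         # 20 token embeddings from a note
structured = np.array([63.0, 1.0, 140.0, 0.0])  # e.g. age, sex, systolic BP, smoker
w = rng.normal(0.0, 0.05, EMB + N_STRUCT)
risk = fuse_and_score(token_embs, structured, w)
```

The design point is simply that both modalities end up in one fixed-length vector, so a single classifier head can weigh textual and structured evidence jointly.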
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.