Concept-based model explanations for Electronic Health Records
- URL: http://arxiv.org/abs/2012.02308v2
- Date: Mon, 8 Mar 2021 10:30:50 GMT
- Title: Concept-based model explanations for Electronic Health Records
- Authors: Diana Mincu, Eric Loreaux, Shaobo Hou, Sebastien Baur, Ivan Protsyuk,
Martin G Seneviratne, Anne Mottram, Nenad Tomasev, Alan Karthikesalingam and
Jessica Schrouff
- Abstract summary: Testing with Concept Activation Vectors (TCAV) has recently been introduced as a way of providing human-understandable explanations.
We propose an extension of the method to time series data to enable an application of TCAV to sequential predictions in the EHR.
- Score: 1.6837409766909865
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recurrent Neural Networks (RNNs) are often used for sequential modeling of
adverse outcomes in electronic health records (EHRs) due to their ability to
encode past clinical states. These deep, recurrent architectures have displayed
increased performance compared to other modeling approaches in a number of
tasks, fueling the interest in deploying deep models in clinical settings. One
of the key elements in ensuring safe model deployment and building user trust
is model explainability. Testing with Concept Activation Vectors (TCAV) has
recently been introduced as a way of providing human-understandable
explanations by comparing high-level concepts to the network's gradients. While
the technique has shown promising results in real-world imaging applications,
it has not been applied to structured temporal inputs. To enable an application
of TCAV to sequential predictions in the EHR, we propose an extension of the
method to time series data. We evaluate the proposed approach on an open EHR
benchmark from the intensive care unit, as well as synthetic data where we are
able to better isolate individual effects.
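As a rough illustration of the mechanism the abstract describes (a sketch, not the authors' implementation): TCAV learns a concept activation vector (CAV) in a layer's activation space and scores how often the class gradient points along that direction; for sequential inputs, the directional derivative can be computed at each time step and aggregated. The mean-difference CAV and the flat per-time-step aggregation below are simplifying assumptions — the original method trains a linear classifier to obtain the CAV.

```python
import numpy as np

def compute_cav(concept_acts, random_acts):
    """Concept activation vector from bottleneck activations.

    Simplification: TCAV fits a linear classifier separating concept
    from random examples; the normalized mean-difference direction used
    here is a lightweight stand-in for illustration only.
    """
    cav = concept_acts.mean(axis=0) - random_acts.mean(axis=0)
    return cav / np.linalg.norm(cav)

def tcav_score_time_series(gradients, cav):
    """Fraction of (example, time step) pairs whose directional
    derivative along the CAV is positive.

    gradients: shape (n_examples, n_timesteps, n_units) -- the gradient
    of the prediction w.r.t. the bottleneck activations at every time
    step of the sequence, which is where the temporal extension differs
    from image TCAV (one gradient per example).
    """
    sensitivities = gradients.reshape(-1, gradients.shape[-1]) @ cav
    return float((sensitivities > 0).mean())

# Hypothetical usage: a concept cluster along the first activation unit.
cav = compute_cav(np.array([[1.0, 0.0], [0.9, 0.1]]),
                  np.array([[0.0, 0.0], [0.1, 0.1]]))
score = tcav_score_time_series(np.ones((3, 4, 2)), cav)  # all steps aligned -> 1.0
```

A score near 1 would mean the concept direction consistently increases the predicted risk across time steps; near 0, that it consistently decreases it.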
Related papers
- Synthesizing Multimodal Electronic Health Records via Predictive Diffusion Models [69.06149482021071]
We propose a novel EHR data generation model called EHRPD.
It is a diffusion-based model designed to predict the next visit based on the current one while also incorporating time interval estimation.
We conduct experiments on two public datasets and evaluate EHRPD from fidelity, privacy, and utility perspectives.
arXiv Detail & Related papers (2024-06-20T02:20:23Z)
- Rethinking Model Prototyping through the MedMNIST+ Dataset Collection [0.11999555634662634]
This work presents a benchmark for the MedMNIST+ database to diversify the evaluation landscape.
We conduct a thorough analysis of common convolutional neural networks (CNNs) and Transformer-based architectures, for medical image classification.
Our findings suggest that computationally efficient training schemes and modern foundation models hold promise in bridging the gap between expensive end-to-end training and more resource-refined approaches.
arXiv Detail & Related papers (2024-04-24T10:19:25Z)
- Recent Advances in Predictive Modeling with Electronic Health Records [71.19967863320647]
Utilizing EHR data for predictive modeling presents several challenges due to its unique characteristics.
Deep learning has demonstrated its superiority in various applications, including healthcare.
arXiv Detail & Related papers (2024-02-02T00:31:01Z)
- Physics Inspired Hybrid Attention for SAR Target Recognition [61.01086031364307]
We propose a physics inspired hybrid attention (PIHA) mechanism and the once-for-all (OFA) evaluation protocol to address the issues.
PIHA leverages the high-level semantics of physical information to activate and guide the feature group aware of local semantics of target.
Our method outperforms other state-of-the-art approaches in 12 test scenarios with the same ASC parameters.
arXiv Detail & Related papers (2023-09-27T14:39:41Z)
- Robustness and Generalization Performance of Deep Learning Models on Cyber-Physical Systems: A Comparative Study [71.84852429039881]
The investigation focuses on the models' ability to handle a range of perturbations, such as sensor faults and noise.
We test the generalization and transfer learning capabilities of these models by exposing them to out-of-distribution (OOD) samples.
arXiv Detail & Related papers (2023-06-13T12:43:59Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- On the Importance of Clinical Notes in Multi-modal Learning for EHR Data [0.0]
Previous research has shown that jointly using clinical notes with electronic health record data improved predictive performance for patient monitoring.
We first confirm that performance significantly improves over state-of-the-art EHR data models when combining EHR data and clinical notes.
We then provide an analysis showing that the improvements arise almost exclusively from a subset of notes containing broader context on the patient's state, rather than from clinician notes.
arXiv Detail & Related papers (2022-12-06T15:18:57Z)
- Simple Recurrent Neural Networks is all we need for clinical events predictions using EHR data [22.81278657120305]
Recurrent neural networks (RNNs) are a common architecture for EHR-based clinical event prediction models.
We used two prediction tasks: the risk for developing heart failure and the risk of early readmission for inpatient hospitalization.
We found that simple gated RNN models, including GRUs and LSTMs, often offer competitive results when properly tuned with Bayesian Optimization.
arXiv Detail & Related papers (2021-10-03T13:07:23Z)
- Unifying Heterogeneous Electronic Health Records Systems via Text-Based Code Embedding [7.3394352452936085]
We introduce DescEmb, a code-agnostic description-based representation learning framework for predictive modeling on EHR.
We tested our model's capacity on various experiments including prediction tasks, transfer learning and pooled learning.
arXiv Detail & Related papers (2021-08-08T12:47:42Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
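For context on the CfC entry above, a minimal sketch of a single closed-form continuous-depth state update, under the assumption that the state follows the gated blend reported in the paper: a sigmoidal time gate, with decay rate given by a learned head f, interpolates between two learned branches g and h. Here f, g, and h are plain callables standing in for small neural networks, and the cell is illustrative only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cfc_cell(x, t, f, g, h):
    """Closed-form continuous-depth (CfC) state at elapsed time t.

    A time-dependent sigmoidal gate blends branch g (dominant at small t)
    with branch h (dominant at large t); no ODE solver is required,
    which is where the reported speedup comes from.
    """
    gate = sigmoid(-f(x) * t)
    return gate * g(x) + (1.0 - gate) * h(x)

# Hypothetical stand-ins for the learned networks:
f = lambda v: np.ones_like(v)   # positive decay rate
g = lambda v: np.zeros_like(v)  # short-horizon branch
h = lambda v: np.ones_like(v)   # long-horizon branch
state = cfc_cell(np.zeros(3), 50.0, f, g, h)  # gate ~ 0, so state ~ h(x)
```

Because the solution is evaluated in closed form at any t, irregular inter-observation gaps (as in EHR time series) can be handled by passing the actual elapsed time directly.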
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.