Interpreting Uncertainty in Model Predictions For COVID-19 Diagnosis
- URL: http://arxiv.org/abs/2010.13271v1
- Date: Mon, 26 Oct 2020 01:27:29 GMT
- Title: Interpreting Uncertainty in Model Predictions For COVID-19 Diagnosis
- Authors: Gayathiri Murugamoorthy and Naimul Khan
- Abstract summary: COVID-19 has brought in the need to use assistive tools for faster diagnosis in addition to typical lab swab testing.
Traditional convolutional networks produce point-estimate predictions and do not capture uncertainty.
We develop a visualization framework for interpreting uncertainty and its components, with the uncertainty in predictions computed by a Bayesian Convolutional Neural Network.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: COVID-19, due to its accelerated spread, has created the need for
assistive tools for faster diagnosis in addition to typical lab swab testing.
Chest X-Rays for COVID cases tend to show changes in the lungs such as ground
glass opacities and peripheral consolidations which can be detected by deep
neural networks. However, traditional convolutional networks produce point
estimates as predictions and do not capture uncertainty, which makes them less
reliable for clinical adoption. Several works have predicted COVID-positive
cases from chest X-rays, but little has been explored on quantifying the
uncertainty of these predictions, interpreting that uncertainty, and
decomposing it into model and data uncertainty. To address these needs, we
develop a visualization framework for interpreting uncertainty and its
components, with the uncertainty in predictions computed by a Bayesian
Convolutional Neural Network. This framework aims to understand the
contribution of individual features in the chest X-ray images to predictive
uncertainty. Providing this as an assistive tool can help the radiologist
understand why the model came up with a prediction and whether the regions of
interest captured by the model for the specific prediction are of significance
in diagnosis. We demonstrate the usefulness of the tool in chest X-ray
interpretation through several test cases from a benchmark dataset.
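The decomposition into model and data uncertainty mentioned above can be illustrated with Monte Carlo dropout, one standard way to approximate a Bayesian CNN. The sketch below is a minimal, hedged illustration (not necessarily the paper's exact method): it assumes the softmax outputs of T stochastic forward passes are available, and splits total predictive entropy into an aleatoric (data) term and an epistemic (model) term via the usual mutual-information decomposition.

```python
import numpy as np

def entropy(p, axis=-1, eps=1e-12):
    """Shannon entropy (in nats) of a probability vector."""
    return -np.sum(p * np.log(p + eps), axis=axis)

def decompose_uncertainty(mc_probs):
    """Split predictive uncertainty into aleatoric (data) and epistemic
    (model) components from T Monte Carlo dropout passes.

    mc_probs: array of shape (T, n_classes) -- softmax outputs of T
    stochastic forward passes for a single chest X-ray.
    """
    p_mean = mc_probs.mean(axis=0)            # averaged predictive distribution
    total = entropy(p_mean)                   # total predictive entropy
    aleatoric = entropy(mc_probs).mean()      # expected per-pass entropy
    epistemic = total - aleatoric             # mutual information (model part)
    return total, aleatoric, epistemic

# Example: 4 hypothetical MC passes over {COVID, non-COVID} that disagree,
# so the epistemic (model) component should be clearly positive.
mc = np.array([[0.90, 0.10],
               [0.20, 0.80],
               [0.85, 0.15],
               [0.30, 0.70]])
total, alea, epis = decompose_uncertainty(mc)
```

In practice the T passes come from running the network with dropout kept active at inference time; high epistemic uncertainty flags inputs the model has effectively not learned, while high aleatoric uncertainty flags intrinsically ambiguous images.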
Related papers
- Trust-informed Decision-Making Through An Uncertainty-Aware Stacked Neural Networks Framework: Case Study in COVID-19 Classification [10.265080819932614]
This study presents an uncertainty-aware stacked neural network model for the reliable classification of COVID-19 from radiological images.
The model addresses the critical gap in uncertainty-aware modeling by focusing on accurately identifying confidently correct predictions.
The architecture integrates uncertainty quantification methods, including Monte Carlo dropout and ensemble techniques, to enhance predictive reliability.
arXiv Detail & Related papers (2024-09-19T04:20:12Z)
- SepsisLab: Early Sepsis Prediction with Uncertainty Quantification and Active Sensing [67.8991481023825]
Sepsis is the leading cause of in-hospital mortality in the USA.
Existing predictive models are usually trained on high-quality data with little missing information.
For the potential high-risk patients with low confidence due to limited observations, we propose a robust active sensing algorithm.
arXiv Detail & Related papers (2024-07-24T04:47:36Z)
- Tractable Function-Space Variational Inference in Bayesian Neural Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Improving Trustworthiness of AI Disease Severity Rating in Medical Imaging with Ordinal Conformal Prediction Sets [0.7734726150561088]
A lack of statistically rigorous uncertainty quantification is a significant factor undermining trust in AI results.
Recent developments in distribution-free uncertainty quantification present practical solutions for these issues.
We demonstrate a technique for forming ordinal prediction sets that are guaranteed to contain the correct stenosis severity.
arXiv Detail & Related papers (2022-07-05T18:01:20Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Explaining COVID-19 and Thoracic Pathology Model Predictions by Identifying Informative Input Features [47.45835732009979]
Neural networks have demonstrated remarkable performance in classification and regression tasks on chest X-rays.
Feature attribution methods identify the importance of input features for the output prediction.
We evaluate our methods using both human-centric (ground-truth-based) interpretability metrics, and human-independent feature importance metrics on NIH Chest X-ray8 and BrixIA datasets.
arXiv Detail & Related papers (2021-04-01T11:42:39Z)
- Deep Co-Attention Network for Multi-View Subspace Learning [73.3450258002607]
We propose a deep co-attention network for multi-view subspace learning.
It aims to extract both the common information and the complementary information in an adversarial setting.
In particular, it uses a novel cross reconstruction loss and leverages the label information to guide the construction of the latent representation.
arXiv Detail & Related papers (2021-02-15T18:46:44Z)
- Objective Evaluation of Deep Uncertainty Predictions for COVID-19 Detection [15.036447340859546]
Deep neural networks (DNNs) have been widely applied for detecting COVID-19 in medical images.
Here we apply and evaluate three uncertainty quantification techniques for COVID-19 detection using chest X-Ray (CXR) images.
arXiv Detail & Related papers (2020-12-22T05:43:42Z)
- An Uncertainty-aware Transfer Learning-based Framework for Covid-19 Diagnosis [10.832659320593347]
This paper proposes a deep uncertainty-aware transfer learning framework for COVID-19 detection using medical images.
Four popular convolutional neural networks (CNNs) are applied to extract deep features from chest X-ray and computed tomography (CT) images.
Extracted features are then processed by different machine learning and statistical modelling techniques to identify COVID-19 cases.
arXiv Detail & Related papers (2020-07-26T20:15:01Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
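Among the related works above, the ordinal conformal prediction sets for disease severity rating admit a compact sketch. The snippet below is a minimal illustration under stated assumptions, not that paper's exact construction: it assumes a threshold `qhat` has already been calibrated on a held-out set (the conformal calibration step is not shown), and grows a contiguous interval of ordered severity grades outward from the most likely grade until the accumulated probability mass reaches `qhat`.

```python
import numpy as np

def ordinal_prediction_set(probs, qhat):
    """Greedy contiguous prediction set over ordered labels: start at the
    most likely severity grade and extend the interval toward whichever
    neighbour carries more probability, until the accumulated mass
    reaches the calibrated threshold qhat."""
    k = int(np.argmax(probs))
    lo = hi = k
    mass = probs[k]
    while mass < qhat:
        left = probs[lo - 1] if lo > 0 else -1.0   # sentinel: no neighbour
        right = probs[hi + 1] if hi < len(probs) - 1 else -1.0
        if left >= right:
            lo -= 1
            mass += left
        else:
            hi += 1
            mass += right
    return list(range(lo, hi + 1))

# Example: 5 hypothetical stenosis grades; the model favours grade 2.
probs = np.array([0.02, 0.08, 0.60, 0.25, 0.05])
print(ordinal_prediction_set(probs, qhat=0.9))  # -> [1, 2, 3]
```

Because the labels are ordinal, restricting the set to a contiguous range keeps it clinically interpretable (a severity interval rather than an arbitrary subset of grades), and the conformal calibration of `qhat` is what yields the coverage guarantee.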
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.