Objective Evaluation of Deep Uncertainty Predictions for COVID-19
Detection
- URL: http://arxiv.org/abs/2012.11840v1
- Date: Tue, 22 Dec 2020 05:43:42 GMT
- Title: Objective Evaluation of Deep Uncertainty Predictions for COVID-19
Detection
- Authors: Hamzeh Asgharnezhad, Afshar Shamsi, Roohallah Alizadehsani, Abbas
Khosravi, Saeid Nahavandi, Zahra Alizadeh Sani, and Dipti Srinivasan
- Abstract summary: Deep neural networks (DNNs) have been widely applied for detecting COVID-19 in medical images.
Here we apply and evaluate three uncertainty quantification techniques for COVID-19 detection using chest X-Ray (CXR) images.
- Score: 15.036447340859546
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural networks (DNNs) have been widely applied for detecting COVID-19
in medical images. Existing studies mainly apply transfer learning and other
data representation strategies to generate accurate point estimates. The
generalization power of these networks is always questionable due to being
developed using small datasets and failing to report their predictive
confidence. Quantifying uncertainties associated with DNN predictions is a
prerequisite for their trusted deployment in medical settings. Here we apply
and evaluate three uncertainty quantification techniques for COVID-19 detection
using chest X-Ray (CXR) images. The novel concept of uncertainty confusion
matrix is proposed and new performance metrics for the objective evaluation of
uncertainty estimates are introduced. Through comprehensive experiments, it is
shown that networks pretrained on CXR images outperform networks pretrained on
natural image datasets such as ImageNet. Qualitative and quantitative
evaluations also reveal that the predictive uncertainty estimates are
statistically higher for erroneous predictions than for correct predictions.
Accordingly, uncertainty quantification methods are capable of flagging risky
predictions with high uncertainty estimates. We also observe that ensemble
methods capture uncertainties more reliably during inference.
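The uncertainty confusion matrix concept can be sketched in a few lines: predictions are cross-tabulated as correct/incorrect against certain/uncertain, and summary metrics are derived from the four cells. The function and metric names below are illustrative assumptions rather than the paper's exact definitions; uncertainty is taken here as the predictive entropy of an ensemble's averaged softmax outputs.

```python
import numpy as np

def predictive_entropy(ensemble_probs):
    """Entropy of the ensemble-averaged softmax output (higher = more uncertain).
    ensemble_probs: array of shape (n_members, n_samples, n_classes)."""
    mean_probs = ensemble_probs.mean(axis=0)
    return -np.sum(mean_probs * np.log(mean_probs + 1e-12), axis=1)

def uncertainty_confusion_matrix(y_true, y_pred, uncertainty, threshold):
    """Cross-tabulate correctness of predictions against (un)certainty."""
    correct = y_pred == y_true
    certain = uncertainty < threshold
    cc = int(np.sum(correct & certain))    # correct and certain (desired)
    cu = int(np.sum(correct & ~certain))   # correct but flagged as uncertain
    ic = int(np.sum(~correct & certain))   # wrong yet confident (risky)
    iu = int(np.sum(~correct & ~certain))  # wrong and flagged as uncertain (desired)
    return cc, cu, ic, iu

def uncertainty_metrics(cc, cu, ic, iu):
    """Accuracy-style summaries of how well uncertainty flags errors."""
    total = max(cc + cu + ic + iu, 1)
    return {
        "uncertainty_accuracy": (cc + iu) / total,
        "uncertainty_sensitivity": iu / max(iu + ic, 1),  # share of errors caught
        "uncertainty_specificity": cc / max(cc + cu, 1),  # correct preds trusted
    }
```

With a threshold at, say, the median predictive entropy, a high uncertainty sensitivity means most erroneous predictions are flagged as risky, which is the behaviour the abstract reports for the evaluated methods.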
Related papers
- SepsisLab: Early Sepsis Prediction with Uncertainty Quantification and Active Sensing [67.8991481023825]
Sepsis is the leading cause of in-hospital mortality in the USA.
Existing predictive models are usually trained on high-quality data with little missing information.
For the potential high-risk patients with low confidence due to limited observations, we propose a robust active sensing algorithm.
arXiv Detail & Related papers (2024-07-24T04:47:36Z)
- Tractable Function-Space Variational Inference in Bayesian Neural Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Adaptive Uncertainty Estimation via High-Dimensional Testing on Latent Representations [28.875819909902244]
Uncertainty estimation aims to evaluate the confidence of a trained deep neural network.
Existing uncertainty estimation approaches rely on low-dimensional distributional assumptions.
We propose a new framework using data-adaptive high-dimensional hypothesis testing for uncertainty estimation.
arXiv Detail & Related papers (2023-10-25T12:22:18Z)
- Improving Trustworthiness of AI Disease Severity Rating in Medical Imaging with Ordinal Conformal Prediction Sets [0.7734726150561088]
A lack of statistically rigorous uncertainty quantification is a significant factor undermining trust in AI results.
Recent developments in distribution-free uncertainty quantification present practical solutions for these issues.
We demonstrate a technique for forming ordinal prediction sets that are guaranteed to contain the correct stenosis severity.
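An ordinal prediction set, as described in this entry, is a contiguous range of ordered severity grades rather than an arbitrary subset of labels. The sketch below is a minimal greedy construction around the most likely grade, assuming calibrated softmax outputs and a coverage target `tau`; it is not the paper's exact conformal procedure, which calibrates the threshold on held-out data to obtain a statistical guarantee.

```python
import numpy as np

def ordinal_prediction_set(probs, tau):
    """Grow a contiguous set of ordinal classes around the most likely one
    until the set's total probability mass reaches tau.
    probs: 1-D softmax output over ordered severity grades."""
    k = len(probs)
    lo = hi = int(np.argmax(probs))
    mass = probs[lo]
    while mass < tau and (lo > 0 or hi < k - 1):
        left = probs[lo - 1] if lo > 0 else -1.0
        right = probs[hi + 1] if hi < k - 1 else -1.0
        if left >= right:  # extend toward the heavier neighbour
            lo -= 1
            mass += probs[lo]
        else:
            hi += 1
            mass += probs[hi]
    return list(range(lo, hi + 1))
```

In split conformal prediction, `tau` would be chosen from a calibration set so that the returned range contains the true grade with the desired frequency; here it is simply a parameter.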
arXiv Detail & Related papers (2022-07-05T18:01:20Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier, based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
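The Nadaraya-Watson estimator this entry builds on can be illustrated with a toy Gaussian-kernel version: the conditional label distribution at a query point is a kernel-weighted vote over training labels, and its entropy serves as an uncertainty score. NUQ itself is considerably more elaborate (e.g. bandwidth selection and asymptotic analysis), so treat this only as a sketch of the underlying estimator.

```python
import numpy as np

def nw_label_distribution(x, X_train, y_train, n_classes, bandwidth=1.0):
    """Nadaraya-Watson estimate of p(y | x) with a Gaussian kernel."""
    d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances to x
    w = np.exp(-d2 / (2 * bandwidth ** 2))          # kernel weights
    probs = np.array([w[y_train == c].sum() for c in range(n_classes)])
    return probs / probs.sum()

def nw_uncertainty(x, X_train, y_train, n_classes, bandwidth=1.0):
    """Entropy of the estimated conditional label distribution."""
    p = nw_label_distribution(x, X_train, y_train, n_classes, bandwidth)
    return -np.sum(p * np.log(p + 1e-12))
```

A query point deep inside one class gets near-zero entropy, while a point equidistant from two classes gets entropy close to log 2, matching the intuition that ambiguous inputs should carry high uncertainty.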
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully/semi/weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Confidence Aware Neural Networks for Skin Cancer Detection [12.300911283520719]
We present three different methods for quantifying uncertainties for skin cancer detection from images.
The obtained results reveal that the predictive uncertainty estimation methods are capable of flagging risky and erroneous predictions.
We also demonstrate that ensemble approaches are more reliable in capturing uncertainties through inference.
arXiv Detail & Related papers (2021-07-19T19:21:57Z)
- Interpreting Uncertainty in Model Predictions For COVID-19 Diagnosis [0.0]
COVID-19 has brought in the need to use assistive tools for faster diagnosis in addition to typical lab swab testing.
Traditional convolutional networks use point estimates for predictions and fail to capture uncertainty.
We develop a visualization framework to address interpretability of uncertainty and its components, with uncertainty in predictions computed with a Bayesian Convolutional Neural Network.
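The decomposition of uncertainty into components mentioned here is commonly done from stochastic forward passes (e.g. MC dropout in a Bayesian CNN): total predictive entropy splits into an aleatoric term (average per-pass entropy) and an epistemic term (the mutual information between predictions and model parameters). This is a generic sketch of that standard decomposition, not the paper's exact framework.

```python
import numpy as np

def decompose_uncertainty(mc_probs):
    """Split predictive uncertainty from T stochastic forward passes into
    aleatoric and epistemic parts.
    mc_probs: array of shape (T, n_samples, n_classes)."""
    eps = 1e-12
    mean_p = mc_probs.mean(axis=0)
    # Total: entropy of the averaged prediction.
    total = -np.sum(mean_p * np.log(mean_p + eps), axis=1)
    # Aleatoric: average entropy of each individual pass (data noise).
    aleatoric = -np.sum(mc_probs * np.log(mc_probs + eps), axis=2).mean(axis=0)
    # Epistemic: disagreement between passes (model uncertainty).
    epistemic = total - aleatoric
    return total, aleatoric, epistemic
```

When all passes agree on a 50/50 prediction, the uncertainty is purely aleatoric; when each pass is confident but they contradict each other, it is purely epistemic.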
arXiv Detail & Related papers (2020-10-26T01:27:29Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Learning to Predict Error for MRI Reconstruction [67.76632988696943]
We demonstrate that predictive uncertainty estimated by the current methods does not highly correlate with prediction error.
We propose a novel method that estimates the target labels and magnitude of the prediction error in two steps.
arXiv Detail & Related papers (2020-02-13T15:55:32Z)
- Validating uncertainty in medical image translation [7.565565370757736]
We investigate using dropout to estimate uncertainty in a CT-to-MR image translation task.
We show that both types of uncertainty are captured, as defined, providing confidence in the output uncertainty estimates.
arXiv Detail & Related papers (2020-02-11T19:06:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.