Uncertainty estimation for out-of-distribution detection in
computational histopathology
- URL: http://arxiv.org/abs/2210.09909v1
- Date: Tue, 18 Oct 2022 14:49:44 GMT
- Title: Uncertainty estimation for out-of-distribution detection in
computational histopathology
- Authors: Lea Goetz
- Abstract summary: We show that a distance-aware uncertainty estimation method outperforms commonly used approaches.
We also investigate the use of uncertainty thresholding to reject out-of-distribution samples for selective prediction.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In computational histopathology, algorithms now outperform humans on a range
of tasks, but to date none are employed for automated diagnoses in the clinic.
Before algorithms can be involved in such high-stakes decisions, they need to
"know when they don't know", i.e., they need to estimate their predictive
uncertainty. This allows them to defer potentially erroneous predictions to a
human pathologist, thus increasing their safety. Here, we evaluate the
predictive performance and calibration of several uncertainty estimation
methods on clinical histopathology data. We show that a distance-aware
uncertainty estimation method outperforms commonly used approaches, such as
Monte Carlo dropout and deep ensembles. However, we observe a drop in
predictive performance and calibration on novel samples across all uncertainty
estimation methods tested. We also investigate the use of uncertainty
thresholding to reject out-of-distribution samples for selective prediction. We
demonstrate the limitations of this approach and suggest areas for future
research.
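The selective-prediction setup described in the abstract can be sketched in a few lines: average several stochastic forward passes (as produced by MC dropout or a deep ensemble), score each sample by the entropy of the mean prediction, and defer high-entropy samples to a pathologist. This is a minimal illustration under assumed names and numbers; the entropy score, the threshold of 0.4, and the toy probabilities are not from the paper.

```python
# Toy sketch of uncertainty thresholding for selective prediction.
# `probs` stands in for T stochastic forward passes (e.g. MC dropout);
# all values and the threshold are illustrative assumptions.
import numpy as np

def predictive_entropy(probs):
    """Entropy of the mean predictive distribution; (T, N, C) -> (N,)."""
    mean = probs.mean(axis=0)                       # average over T passes
    return -(mean * np.log(mean + 1e-12)).sum(axis=-1)

def selective_predict(probs, threshold):
    """Predicted classes, with -1 marking rejected (deferred) samples."""
    entropy = predictive_entropy(probs)
    preds = probs.mean(axis=0).argmax(axis=-1)
    preds[entropy > threshold] = -1                 # defer uncertain cases
    return preds

confident = np.tile([[0.95, 0.05]], (10, 1, 1))             # all passes agree
disagree = np.array([[[0.99, 0.01]], [[0.01, 0.99]]] * 5)   # passes flip class
probs = np.concatenate([confident, disagree], axis=1)       # (10, 2, 2)
print(selective_predict(probs, threshold=0.4))              # second sample deferred
```

The first sample's passes agree, so its mean distribution has low entropy and a class is emitted; the second sample's passes disagree, the mean collapses to (0.5, 0.5), and the prediction is rejected.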
Related papers
- SepsisLab: Early Sepsis Prediction with Uncertainty Quantification and Active Sensing [67.8991481023825]
Sepsis is the leading cause of in-hospital mortality in the USA.
Existing predictive models are usually trained on high-quality data with little missing information.
For potential high-risk patients whose predictions carry low confidence due to limited observations, we propose a robust active sensing algorithm.
arXiv Detail & Related papers (2024-07-24T04:47:36Z)
- Efficient Normalized Conformal Prediction and Uncertainty Quantification for Anti-Cancer Drug Sensitivity Prediction with Deep Regression Forests [0.0]
Conformal Prediction has emerged as a promising method to pair machine learning models with prediction intervals.
We propose a method to estimate the uncertainty of each sample by calculating the variance obtained from a Deep Regression Forest.
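Normalized split conformal prediction of this kind can be sketched as follows: scale each calibration residual by a per-sample difficulty estimate, take the appropriate quantile, and emit intervals that widen where the estimate is large. Here the per-sample estimate is the spread of an ensemble's predictions, a toy stand-in for the variance across trees of a Deep Regression Forest; all names and data are illustrative.

```python
# Hedged sketch of normalized split conformal prediction.
# tree_preds_* are (trees, samples) arrays of per-tree predictions;
# their std-dev plays the role of the per-sample uncertainty.
import numpy as np

def normalized_conformal(tree_preds_cal, y_cal, tree_preds_test, alpha=0.1):
    """Return (lo, hi) intervals with ~(1 - alpha) marginal coverage."""
    mu_cal = tree_preds_cal.mean(axis=0)
    sigma_cal = tree_preds_cal.std(axis=0) + 1e-8           # per-sample difficulty
    scores = np.abs(y_cal - mu_cal) / sigma_cal             # normalized nonconformity
    n = len(y_cal)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)    # finite-sample correction
    q = np.quantile(scores, level)
    mu = tree_preds_test.mean(axis=0)
    sigma = tree_preds_test.std(axis=0) + 1e-8
    return mu - q * sigma, mu + q * sigma                   # wider where trees disagree

rng = np.random.default_rng(1)
y_cal = rng.normal(size=200)
tree_cal = y_cal + rng.normal(scale=0.3, size=(50, 200))    # 50 "trees", calibration set
tree_test = rng.normal(scale=0.3, size=(50, 5))             # test-time tree predictions
lo, hi = normalized_conformal(tree_cal, y_cal, tree_test)
```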
arXiv Detail & Related papers (2024-02-21T19:09:53Z)
- One step closer to unbiased aleatoric uncertainty estimation [71.55174353766289]
We propose a new estimation method by actively de-noising the observed data.
By conducting a broad range of experiments, we demonstrate that our proposed approach provides a much closer approximation to the actual data uncertainty than the standard method.
arXiv Detail & Related papers (2023-12-16T14:59:11Z)
- Quantifying Uncertainty in Deep Learning Classification with Noise in Discrete Inputs for Risk-Based Decision Making [1.529943343419486]
We propose a mathematical framework to quantify prediction uncertainty for Deep Neural Network (DNN) models.
The prediction uncertainty arises from errors in predictors that follow some known finite discrete distribution.
Our proposed framework can support risk-based decision making in applications when discrete errors in predictors are present.
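When the predictor errors follow a known finite discrete distribution, the prediction uncertainty can be obtained by enumeration: evaluate the model at every possible true input (observed value plus each discrete error) and weight the resulting predictions by the error probabilities. The toy classifier, error support, and probabilities below are assumptions for illustration, not the paper's framework.

```python
# Hedged sketch: propagating a known discrete error distribution on the
# inputs through a fixed classifier to get a distribution over classes.
import numpy as np

def model(x):
    """Toy stand-in classifier: class 1 iff the feature sum exceeds 1.0."""
    return int(x.sum() > 1.0)

def predictive_distribution(x_obs, error_support, error_probs):
    """Enumerate possible true inputs and aggregate class probabilities."""
    class_probs = {}
    for delta, p in zip(error_support, error_probs):
        y = model(x_obs + delta)                    # prediction under this error
        class_probs[y] = class_probs.get(y, 0.0) + p
    return class_probs

x_obs = np.array([0.6, 0.3])                        # observed, possibly corrupted
support = [np.array([0.0, 0.0]), np.array([0.2, 0.0]), np.array([-0.2, 0.0])]
probs = [0.6, 0.2, 0.2]                             # known discrete error distribution
print(predictive_distribution(x_obs, support, probs))   # class 0 with prob ~0.8
```

A decision maker can then act on the full class distribution (e.g. escalate when no class dominates) rather than on a single point prediction.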
arXiv Detail & Related papers (2023-10-09T19:26:24Z)
- Ensemble Neural Networks for Remaining Useful Life (RUL) Prediction [0.39287497907611874]
A core part of maintenance planning is a monitoring system that provides a good prognosis on health and degradation.
Here, we propose ensemble neural networks for probabilistic RUL prediction, which consider both types of uncertainty and decouple them.
This method is tested on NASA's turbofan jet engine C-MAPSS dataset.
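The decoupling typically used with ensembles of mean-variance networks can be sketched directly: aleatoric uncertainty is the average of the members' predicted noise variances, and epistemic uncertainty is the disagreement (variance) between the member means. The member outputs below are toy numbers, not C-MAPSS results.

```python
# Hedged sketch of the ensemble variance decomposition into
# aleatoric (data noise) and epistemic (model disagreement) parts.
import numpy as np

def decompose_uncertainty(means, variances):
    """means, variances: (M, N) predictions from M ensemble members."""
    aleatoric = variances.mean(axis=0)              # average predicted noise
    epistemic = means.var(axis=0)                   # disagreement between members
    return aleatoric, epistemic

means = np.array([[10.0, 50.0], [12.0, 50.0], [11.0, 50.0]])   # M=3 members, N=2 engines
variances = np.array([[1.0, 4.0], [1.0, 4.0], [1.0, 4.0]])
ale, epi = decompose_uncertainty(means, variances)
print(ale, epi)   # second engine: members agree, so zero epistemic uncertainty
```

High epistemic uncertainty flags engines outside the training regime; high aleatoric uncertainty reflects sensor noise no amount of extra training data will remove.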
arXiv Detail & Related papers (2023-09-21T19:38:44Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Auditing for Human Expertise [12.967730957018688]
We develop a statistical framework under which we can pose this question as a natural hypothesis test.
We propose a simple procedure which tests whether expert predictions are statistically independent from the outcomes of interest.
A rejection of our test thus suggests that human experts may add value to any algorithm trained on the available data.
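The flavour of such a test can be sketched with a permutation test of whether expert predictions carry more signal about outcomes than chance. Note this toy version is unconditional, whereas the paper's procedure conditions on the features available to the algorithm; the data and function names are illustrative.

```python
# Hedged sketch: permutation test of association between expert
# predictions and outcomes. A small p-value rejects independence,
# suggesting the expert adds information.
import numpy as np

def permutation_pvalue(expert, outcome, n_perm=2000, seed=0):
    """P-value for |corr(expert, outcome)| under the permutation null."""
    rng = np.random.default_rng(seed)
    observed = abs(np.corrcoef(expert, outcome)[0, 1])
    null = [abs(np.corrcoef(rng.permutation(expert), outcome)[0, 1])
            for _ in range(n_perm)]
    return (1 + sum(n >= observed for n in null)) / (n_perm + 1)

expert = np.linspace(0, 1, 20)        # toy expert forecasts
outcome = expert.copy()               # outcomes the expert tracks perfectly
print(permutation_pvalue(expert, outcome))   # small p: reject independence
```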
arXiv Detail & Related papers (2023-06-02T16:15:24Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when using them in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is part of out-of-sample prediction error due to the lack of knowledge of the learner.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty.
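The subtraction recipe can be sketched in miniature: fit an auxiliary "error predictor" on observed out-of-sample losses, then subtract an aleatoric-noise estimate so the remainder reflects epistemic uncertainty. The error predictor below is a trivial nearest-neighbour average and the aleatoric term is assumed known; both are stand-ins for the learned components in DEUP, and all data are toy.

```python
# Hedged sketch of the DEUP-style decomposition:
# epistemic ~ predicted generalization error - aleatoric estimate.
import numpy as np

def deup_epistemic(x_train, losses_train, x_query, aleatoric, k=3):
    """Predict total error at x_query as the mean loss of the k nearest
    training points, then subtract the (assumed known) aleatoric part."""
    dists = np.abs(x_train[None, :] - x_query[:, None])   # (Q, N) pairwise distances
    nearest = np.argsort(dists, axis=1)[:, :k]
    total = losses_train[nearest].mean(axis=1)            # predicted out-of-sample loss
    return np.maximum(total - aleatoric, 0.0)             # leftover = epistemic part

x_train = np.array([0.0, 0.1, 0.2, 2.0, 2.1, 2.2])
losses = np.array([0.1, 0.1, 0.1, 0.9, 1.0, 1.1])         # model struggles far from 0
epi = deup_epistemic(x_train, losses, np.array([0.1, 2.1]), aleatoric=0.1)
print(epi)   # near-zero where the model generalizes well, large where it doesn't
```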
arXiv Detail & Related papers (2021-02-16T23:50:35Z)
- Uncertainty estimation for classification and risk prediction on medical tabular data [0.0]
This work advances the understanding of uncertainty estimation for classification and risk prediction on medical data.
In a data-scarce field such as healthcare, the ability to measure the uncertainty of a model's prediction could potentially lead to improved effectiveness of decision support tools.
arXiv Detail & Related papers (2020-04-13T08:46:41Z)
- Learning to Predict Error for MRI Reconstruction [67.76632988696943]
We demonstrate that predictive uncertainty estimated by the current methods does not highly correlate with prediction error.
We propose a novel method that estimates the target labels and magnitude of the prediction error in two steps.
arXiv Detail & Related papers (2020-02-13T15:55:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.