On the Practicality of Deterministic Epistemic Uncertainty
- URL: http://arxiv.org/abs/2107.00649v1
- Date: Thu, 1 Jul 2021 17:59:07 GMT
- Title: On the Practicality of Deterministic Epistemic Uncertainty
- Authors: Janis Postels, Mattia Segu, Tao Sun, Luc Van Gool, Fisher Yu, Federico Tombari
- Abstract summary: Deterministic uncertainty methods (DUMs) achieve strong performance on detecting out-of-distribution data, but it remains unclear whether they are well calibrated and can seamlessly scale to real-world applications.
- Score: 106.06571981780591
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A set of novel approaches for estimating epistemic uncertainty in deep neural
networks with a single forward pass has recently emerged as a valid alternative
to Bayesian Neural Networks. On the premise of informative representations,
these deterministic uncertainty methods (DUMs) achieve strong performance on
detecting out-of-distribution (OOD) data while adding negligible computational
costs at inference time. However, it remains unclear whether DUMs are well
calibrated and can seamlessly scale to real-world applications - both
prerequisites for their practical deployment. To this end, we first provide a
taxonomy of DUMs, evaluate their calibration under continuous distributional
shifts and their performance on OOD detection for image classification tasks.
Then, we extend the most promising approaches to semantic segmentation. We find
that, while DUMs scale to realistic vision tasks and perform well on OOD
detection, the practicality of current methods is undermined by poor
calibration under realistic distributional shifts.
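As a concrete illustration of the setup the abstract describes, the sketch below shows one common flavour of deterministic uncertainty method together with the calibration metric typically used under distributional shift: class-conditional Gaussians fitted to penultimate-layer features, a Mahalanobis-distance score computed from a single forward pass as the OOD/uncertainty signal, and expected calibration error (ECE) over the classifier's confidences. The function names and NumPy-only structure are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical sketch of a deterministic uncertainty method (DUM): uncertainty
# comes from a single forward pass via feature-space density (class-conditional
# Gaussians with a shared covariance + Mahalanobis distance), not from sampling.
# Names and structure are illustrative, not the paper's implementation.

def fit_class_gaussians(feats, labels, eps=1e-6):
    """Fit per-class means and a shared precision matrix to training features."""
    classes = np.unique(labels)
    means = {c: feats[labels == c].mean(axis=0) for c in classes}
    centered = np.concatenate([feats[labels == c] - means[c] for c in classes])
    cov = np.cov(centered, rowvar=False) + eps * np.eye(feats.shape[1])
    return means, np.linalg.inv(cov)

def mahalanobis_score(feats, means, precision):
    """Epistemic-uncertainty proxy: squared distance to the closest class Gaussian."""
    dists = []
    for mu in means.values():
        diff = feats - mu
        dists.append(np.einsum("nd,de,ne->n", diff, precision, diff))
    return np.min(np.stack(dists, axis=1), axis=1)  # higher = more likely OOD

def expected_calibration_error(confidences, correct, n_bins=15):
    """ECE of softmax confidences, e.g. evaluated under distributional shift."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - confidences[in_bin].mean())
    return ece
```

In an evaluation of the kind described above, the Mahalanobis score would be thresholded (or ranked via AUROC) for OOD detection, while ECE would be tracked across increasing shift severity to probe calibration.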
Related papers
- Adaptive Uncertainty Estimation via High-Dimensional Testing on Latent Representations [28.875819909902244]
Uncertainty estimation aims to evaluate the confidence of a trained deep neural network.
Existing uncertainty estimation approaches rely on low-dimensional distributional assumptions.
We propose a new framework using data-adaptive high-dimensional hypothesis testing for uncertainty estimation.
arXiv Detail & Related papers (2023-10-25T12:22:18Z)
- Gaussian Latent Representations for Uncertainty Estimation using Mahalanobis Distance in Deep Classifiers [1.5088605208312555]
We present a lightweight, fast, and high-performance regularization method for Mahalanobis distance-based uncertainty prediction.
We show the applicability of our method to a real-life computer vision use case on microorganism classification.
arXiv Detail & Related papers (2023-05-23T09:18:47Z)
- Uncertainty Estimation by Fisher Information-based Evidential Deep Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce the Fisher Information Matrix (FIM) to measure the informativeness of the evidence carried by each sample, and use it to dynamically reweight the objective loss terms so that the network focuses on the representation learning of uncertain classes.
arXiv Detail & Related papers (2023-03-03T16:12:59Z)
- Improving Out-of-Distribution Detection via Epistemic Uncertainty Adversarial Training [29.4569172720654]
We develop a simple adversarial training scheme that incorporates an attack on the uncertainty predicted by the dropout ensemble.
We demonstrate that this method improves OOD detection performance on standard data (i.e., not adversarially crafted) and raises the standardized partial AUC from near-random guessing to $\geq 0.75$.
arXiv Detail & Related papers (2022-09-05T14:32:19Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution (a minimal sketch of this estimate follows after this list).
We demonstrate the strong performance of the method on uncertainty estimation tasks across a variety of real-world image datasets.
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning [78.83598532168256]
Marginal-likelihood-based model selection is rarely used in deep learning due to estimation difficulties.
Our work shows that marginal likelihoods can improve generalization and be useful when validation data is unavailable.
arXiv Detail & Related papers (2021-04-11T09:50:24Z)
- Sketching Curvature for Efficient Out-of-Distribution Detection for Deep Neural Networks [32.629801680158685]
Sketching Curvature for OoD Detection (SCOD) is an architecture-agnostic framework for equipping trained deep neural networks with task-relevant uncertainty estimates.
We demonstrate that SCOD achieves comparable or better OoD detection performance with a lower computational burden than existing baselines.
arXiv Detail & Related papers (2021-02-24T21:34:40Z)
- Learn what you can't learn: Regularized Ensembles for Transductive Out-of-distribution Detection [76.39067237772286]
We show that current out-of-distribution (OOD) detection algorithms for neural networks produce unsatisfactory results in a variety of OOD detection scenarios.
This paper studies how such "hard" OOD scenarios can benefit from adjusting the detection method after observing a batch of the test data.
We propose a novel method that uses an artificial labeling scheme for the test data and regularization to obtain ensembles of models that produce contradictory predictions only on the OOD samples in a test batch.
arXiv Detail & Related papers (2020-12-10T16:55:13Z)
- Revisiting One-vs-All Classifiers for Predictive Uncertainty and Out-of-Distribution Detection in Neural Networks [22.34227625637843]
We investigate how the parametrization of the probabilities in discriminative classifiers affects the uncertainty estimates.
We show that one-vs-all formulations can improve calibration on image classification tasks.
arXiv Detail & Related papers (2020-07-10T01:55:02Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
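For the NUQ entry above, here is a minimal sketch of the Nadaraya-Watson estimate of the conditional label distribution that the method builds on. The embedding space, Gaussian kernel, bandwidth, and function name are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Hypothetical sketch of the Nadaraya-Watson estimate of the conditional label
# distribution p(y | x) referenced in the NUQ entry above. The embedding space,
# Gaussian kernel, and bandwidth are illustrative assumptions; NUQ builds its
# uncertainty measure on top of an estimate of this form.

def nadaraya_watson_probs(query, train_embed, train_labels, n_classes, bandwidth=1.0):
    """Kernel-weighted average of one-hot training labels around a query embedding."""
    sq_dists = ((train_embed - query) ** 2).sum(axis=1)        # (N,)
    weights = np.exp(-sq_dists / (2.0 * bandwidth ** 2))       # Gaussian kernel
    onehot = np.eye(n_classes)[train_labels]                   # (N, n_classes)
    return weights @ onehot / (weights.sum() + 1e-12)          # estimated p(y | x)
```

A flat or low-confidence estimate, or a small total kernel mass (few nearby training points), signals high uncertainty at the query point.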
This list is automatically generated from the titles and abstracts of the papers on this site.