Label-wise Aleatoric and Epistemic Uncertainty Quantification
- URL: http://arxiv.org/abs/2406.02354v1
- Date: Tue, 4 Jun 2024 14:33:23 GMT
- Title: Label-wise Aleatoric and Epistemic Uncertainty Quantification
- Authors: Yusuf Sale, Paul Hofman, Timo Löhr, Lisa Wimmer, Thomas Nagler, Eyke Hüllermeier
- Abstract summary: We present a novel approach to uncertainty quantification in classification tasks based on label-wise decomposition of uncertainty measures.
We show that our proposed measures adhere to a number of desirable properties.
- Score: 15.642370299038488
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a novel approach to uncertainty quantification in classification tasks based on a label-wise decomposition of uncertainty measures. This label-wise perspective allows uncertainty to be quantified at the individual class level, thereby improving cost-sensitive decision-making and helping to understand the sources of uncertainty. Furthermore, it allows total, aleatoric, and epistemic uncertainty to be defined on the basis of non-categorical measures such as variance, going beyond common entropy-based measures. In particular, variance-based measures address some of the limitations associated with established methods that have recently been discussed in the literature. We show that our proposed measures adhere to a number of desirable properties. Through empirical evaluation on a variety of benchmark data sets -- including applications in the medical domain, where accurate uncertainty quantification is crucial -- we establish the effectiveness of label-wise uncertainty quantification.
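The label-wise, variance-based decomposition described in the abstract can be illustrated with a minimal sketch. Assuming the measures follow the usual law-of-total-variance decomposition over a second-order (ensemble) distribution — which the paper discusses in detail — the total variance of the indicator for each class splits exactly into an aleatoric and an epistemic term. The function name below is illustrative, not from the paper.

```python
import numpy as np

def labelwise_variance_uncertainty(probs):
    """Label-wise uncertainty decomposition from an ensemble of predictions.

    probs: array of shape (M, K) -- M ensemble members' predictive
    distributions over K classes for a single input.
    Returns (total, aleatoric, epistemic), each of shape (K,).
    """
    probs = np.asarray(probs, dtype=float)
    mean_p = probs.mean(axis=0)                       # E_theta[p_k]
    # Expected Bernoulli variance per class: E_theta[p_k (1 - p_k)]
    aleatoric = (probs * (1.0 - probs)).mean(axis=0)
    # Spread of class probabilities across members: Var_theta(p_k)
    epistemic = probs.var(axis=0)
    # Law of total variance: total = aleatoric + epistemic, exactly.
    total = mean_p * (1.0 - mean_p)
    return total, aleatoric, epistemic
```

Because each class is treated separately, one can inspect which classes the ensemble disagrees on (high epistemic) versus which are inherently ambiguous (high aleatoric), which is what enables the cost-sensitive, class-level decision-making the abstract mentions.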
Related papers
- A Structured Review of Literature on Uncertainty in Machine Learning & Deep Learning [0.8667724053232616]
We focus on a critical concern for the adoption of Machine Learning in risk-sensitive applications, namely understanding and quantifying uncertainty.
Our paper approaches this topic in a structured way, reviewing the literature on the various facets of uncertainty throughout the ML process.
Key contributions in this review are broadening the scope of uncertainty discussion, as well as an updated review of uncertainty quantification methods in Deep Learning.
arXiv Detail & Related papers (2024-06-01T07:17:38Z)
- Second-Order Uncertainty Quantification: Variance-Based Measures [2.3999111269325266]
This paper proposes a novel way to use variance-based measures to quantify uncertainty on the basis of second-order distributions in classification problems.
A distinctive feature of the measures is the ability to reason about uncertainties on a class-based level, which is useful in situations where nuanced decision-making is required.
arXiv Detail & Related papers (2023-12-30T16:30:52Z)
- One step closer to unbiased aleatoric uncertainty estimation [71.55174353766289]
We propose a new estimation method by actively de-noising the observed data.
By conducting a broad range of experiments, we demonstrate that our proposed approach provides a much closer approximation to the actual data uncertainty than the standard method.
arXiv Detail & Related papers (2023-12-16T14:59:11Z)
- Benchmarking Scalable Epistemic Uncertainty Quantification in Organ Segmentation [7.313010190714819]
Quantifying the uncertainty associated with model predictions is crucial in critical clinical applications.
Deep learning based methods for automatic organ segmentation have shown promise in aiding diagnosis and treatment planning.
It is unclear which method is preferred in the medical image analysis setting.
arXiv Detail & Related papers (2023-08-15T00:09:33Z)
- Gradient-based Uncertainty Attribution for Explainable Bayesian Deep Learning [38.34033824352067]
Predictions made by deep learning models are prone to data perturbations, adversarial attacks, and out-of-distribution inputs.
We propose to develop explainable and actionable Bayesian deep learning methods to perform accurate uncertainty quantification.
arXiv Detail & Related papers (2023-04-10T19:14:15Z)
- What is Flagged in Uncertainty Quantification? Latent Density Models for Uncertainty Categorization [68.15353480798244]
Uncertainty Quantification (UQ) is essential for creating trustworthy machine learning models.
Recent years have seen a steep rise in UQ methods that can flag suspicious examples.
We propose a framework for categorizing uncertain examples flagged by UQ methods in classification tasks.
arXiv Detail & Related papers (2022-07-11T19:47:00Z)
- Bayesian autoencoders with uncertainty quantification: Towards trustworthy anomaly detection [78.24964622317634]
In this work, the formulation of Bayesian autoencoders (BAEs) is adopted to quantify the total anomaly uncertainty.
To evaluate the quality of uncertainty, we consider the task of classifying anomalies with the additional option of rejecting predictions of high uncertainty.
Our experiments demonstrate the effectiveness of the BAE and total anomaly uncertainty on a set of benchmark datasets and two real datasets for manufacturing.
arXiv Detail & Related papers (2022-02-25T12:20:04Z)
- Ensemble-based Uncertainty Quantification: Bayesian versus Credal Inference [0.0]
We consider ensemble-based approaches to uncertainty quantification.
We specifically focus on Bayesian methods and approaches based on so-called credal sets.
The effectiveness of corresponding measures is evaluated and compared in an empirical study on classification with a reject option.
arXiv Detail & Related papers (2021-07-21T22:47:24Z)
- DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is the part of out-of-sample prediction error that is due to the learner's lack of knowledge.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty.
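The subtraction idea behind DEUP can be sketched on synthetic data. In the sketch below, a k-nearest-neighbour average of observed squared errors stands in for DEUP's learned generalization-error predictor, and the true noise level is used as the aleatoric estimate; all names and the data-generating setup are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data with input-dependent (heteroscedastic) noise.
x = rng.uniform(-1, 1, size=200)
noise_std = 0.1 + 0.2 * np.abs(x)          # aleatoric noise grows with |x|
y = np.sin(3 * x) + rng.normal(0, noise_std)

# Main predictor: deliberately crude, so it underfits for |x| > 0.5.
y_hat = np.sin(3 * np.clip(x, -0.5, 0.5))

# Error predictor: k-NN average of observed squared errors, a stand-in for
# the secondary model that DEUP trains to predict generalization error.
sq_err = (y - y_hat) ** 2
def predict_error(x0, k=15):
    idx = np.argsort(np.abs(x - x0))[:k]
    return sq_err[idx].mean()

# Epistemic estimate = predicted total error minus the aleatoric variance,
# clipped at zero since uncertainties are non-negative.
epistemic = np.array([max(predict_error(xi) - s**2, 0.0)
                      for xi, s in zip(x, noise_std)])
```

In this toy setup the epistemic estimate is large only where the crude model underfits, matching the intuition that subtracting the aleatoric part isolates the error attributable to the learner's lack of knowledge.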
arXiv Detail & Related papers (2021-02-16T23:50:35Z)
- The Aleatoric Uncertainty Estimation Using a Separate Formulation with Virtual Residuals [51.71066839337174]
Existing methods can quantify the error in the target estimation, but they tend to underestimate it.
We propose a new separable formulation for the estimation of a signal and of its uncertainty, avoiding the effect of overfitting.
We demonstrate that the proposed method outperforms a state-of-the-art technique for signal and uncertainty estimation.
arXiv Detail & Related papers (2020-11-03T12:11:27Z)
- Localization Uncertainty Estimation for Anchor-Free Object Detection [48.931731695431374]
There are several limitations of the existing uncertainty estimation methods for anchor-based object detection.
We propose a new localization uncertainty estimation method called UAD for anchor-free object detection.
Our method captures the uncertainty of the four box-offset directions in a homogeneous way, so it can tell which direction is uncertain.
arXiv Detail & Related papers (2020-06-28T13:49:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.