Second-Order Uncertainty Quantification: A Distance-Based Approach
- URL: http://arxiv.org/abs/2312.00995v1
- Date: Sat, 2 Dec 2023 01:21:41 GMT
- Title: Second-Order Uncertainty Quantification: A Distance-Based Approach
- Authors: Yusuf Sale, Viktor Bengs, Michele Caprio, Eyke Hüllermeier
- Abstract summary: We propose a set of formal criteria that meaningful measures of predictive uncertainty based on second-order distributions should obey.
We provide a general framework for developing uncertainty measures to account for these criteria, and offer an instantiation based on the Wasserstein distance.
- Score: 11.539320505465149
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the past couple of years, various approaches to representing and
quantifying different types of predictive uncertainty in machine learning,
notably in the setting of classification, have been proposed on the basis of
second-order probability distributions, i.e., predictions in the form of
distributions on probability distributions. A completely conclusive solution
has not yet been found, however, as shown by recent criticisms of commonly used
uncertainty measures associated with second-order distributions, identifying
undesirable theoretical properties of these measures. In light of these
criticisms, we propose a set of formal criteria that meaningful uncertainty
measures for predictive uncertainty based on second-order distributions should
obey. Moreover, we provide a general framework for developing uncertainty
measures to account for these criteria, and offer an instantiation based on the
Wasserstein distance, for which we prove that all criteria are satisfied.
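As an illustration of the distance-based idea, here is a minimal sketch under assumed definitions (an illustrative instantiation, not necessarily the paper's exact measures): if Q is a second-order distribution over class-probability vectors, its epistemic spread can be measured as the Wasserstein distance between Q and the Dirac measure at its mean prediction. For a point-mass target this distance reduces to the expected ground distance of samples from Q to that mean, so a Monte Carlo average suffices.

```python
# Hedged sketch (assumed instantiation): the Wasserstein distance between a
# second-order distribution Q and the Dirac measure at its mean prediction.
# For a point-mass target, W_1(Q, delta_m) = E_{p ~ Q}[ d(p, m) ], so a
# Monte Carlo average over samples from Q is enough.
import numpy as np

rng = np.random.default_rng(0)

def distance_to_mean(samples: np.ndarray) -> float:
    """samples: (n, K) array of class-probability vectors drawn from Q.
    Returns W_1 between the empirical Q and the Dirac at its mean
    (Euclidean ground metric on the probability simplex)."""
    mean_prediction = samples.mean(axis=0)
    return float(np.linalg.norm(samples - mean_prediction, axis=1).mean())

# A concentrated second-order prediction (low spread) vs. a diffuse one.
confident = rng.dirichlet(alpha=[50.0, 1.0, 1.0], size=10_000)
uncertain = rng.dirichlet(alpha=[1.0, 1.0, 1.0], size=10_000)

print(distance_to_mean(confident))  # small: Q is tightly concentrated
print(distance_to_mean(uncertain))  # larger: Q is spread over the simplex
```

The snippet only shows how a distance to a reference second-order distribution turns spread on the simplex into a scalar; aleatoric and total components would require the paper's full construction.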
Related papers
- On Information-Theoretic Measures of Predictive Uncertainty [5.8034373350518775]
Despite its significance, a consensus on the correct measurement of predictive uncertainty remains elusive.
Our proposed framework categorizes predictive uncertainty measures according to two factors: (I) the predicting model, and (II) the approximation of the true predictive distribution.
We empirically evaluate these measures in typical uncertainty estimation settings, such as misclassification detection, selective prediction, and out-of-distribution detection.
arXiv Detail & Related papers (2024-10-14T17:52:18Z)
- Calibrated Probabilistic Forecasts for Arbitrary Sequences [58.54729945445505]
Real-world data streams can change unpredictably due to distribution shifts, feedback loops and adversarial actors.
We present a forecasting framework ensuring valid uncertainty estimates regardless of how data evolves.
arXiv Detail & Related papers (2024-09-27T21:46:42Z)
- Quantifying Aleatoric and Epistemic Uncertainty with Proper Scoring Rules [19.221081896134567]
Uncertainty representation and quantification are paramount in machine learning.
We propose measures for the quantification of aleatoric and epistemic uncertainty based on proper scoring rules.
arXiv Detail & Related papers (2024-04-18T14:20:19Z)
- Second-Order Uncertainty Quantification: Variance-Based Measures [2.3999111269325266]
This paper proposes a novel way to use variance-based measures to quantify uncertainty on the basis of second-order distributions in classification problems.
A distinctive feature of the measures is the ability to reason about uncertainties on a class-based level, which is useful in situations where nuanced decision-making is required. (A hedged sketch of such a class-level variance decomposition appears after this list.)
arXiv Detail & Related papers (2023-12-30T16:30:52Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Uncertainty Estimation for Heatmap-based Landmark Localization [4.673063715963989]
We propose Quantile Binning, a data-driven method to categorise predictions by uncertainty with estimated error bounds.
We demonstrate this framework by comparing and contrasting three uncertainty measures.
We conclude by illustrating how filtering out gross mispredictions caught in our Quantile Bins significantly improves the proportion of predictions under an acceptable error threshold.
arXiv Detail & Related papers (2022-03-04T14:40:44Z)
- Dense Uncertainty Estimation via an Ensemble-based Conditional Latent Variable Model [68.34559610536614]
We argue that the aleatoric uncertainty is an inherent attribute of the data and can only be correctly estimated with an unbiased oracle model.
We propose a new sampling and selection strategy at train time to approximate the oracle model for aleatoric uncertainty estimation.
Our results show that our solution achieves both accurate deterministic results and reliable uncertainty estimation.
arXiv Detail & Related papers (2021-11-22T08:54:10Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Multi-label Chaining with Imprecise Probabilities [0.0]
We present two different strategies to extend the classical multi-label chaining approach to handle imprecise probability estimates.
The main reasons for using such estimates are (1) to make cautious predictions when high uncertainty is detected in the chaining and (2) to make better precise predictions by avoiding biases introduced by early decisions in the chaining.
Our experimental results on missing labels, which investigate how reliable these predictions are in both approaches, indicate that our approaches produce appropriately cautious predictions on the hard-to-predict instances where precise models fail.
arXiv Detail & Related papers (2021-07-15T16:43:31Z)
- Distribution-free uncertainty quantification for classification under label shift [105.27463615756733]
We focus on uncertainty quantification (UQ) for classification problems via two avenues.
We first argue that label shift hurts UQ, by showing degradation in coverage and calibration.
We examine these techniques theoretically in a distribution-free framework and demonstrate their excellent practical performance.
arXiv Detail & Related papers (2021-03-04T20:51:03Z)
- DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is part of out-of-sample prediction error due to the lack of knowledge of the learner.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty.
arXiv Detail & Related papers (2021-02-16T23:50:35Z)
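Following up on the class-level idea from the "Second-Order Uncertainty Quantification: Variance-Based Measures" entry above, here is a hedged sketch of one common variance-based reading (an assumed decomposition for illustration, not necessarily that paper's exact definitions): applying the law of total variance to the one-hot outcome indicator of class k under a second-order distribution Q yields an aleatoric term E_{p~Q}[p_k(1-p_k)] and an epistemic term Var_{p~Q}(p_k).

```python
# Hedged illustration (assumed decomposition, not necessarily the cited paper's
# exact definitions): class-level uncertainty from a second-order distribution Q
# via the law of total variance applied to the one-hot outcome indicator Y_k:
#   Var(Y_k) = E_{p~Q}[p_k (1 - p_k)]  (aleatoric)  +  Var_{p~Q}(p_k)  (epistemic)
import numpy as np

rng = np.random.default_rng(0)

def class_level_uncertainty(samples: np.ndarray):
    """samples: (n, K) array of class-probability vectors drawn from Q."""
    aleatoric = (samples * (1.0 - samples)).mean(axis=0)  # E[p_k (1 - p_k)]
    epistemic = samples.var(axis=0)                       # Var(p_k)
    return aleatoric, epistemic, aleatoric + epistemic    # per-class totals

# Example: a Dirichlet second-order prediction over K = 3 classes.
samples = rng.dirichlet(alpha=[20.0, 2.0, 2.0], size=10_000)
au, eu, tu = class_level_uncertainty(samples)
print("aleatoric per class:", au.round(4))
print("epistemic per class:", eu.round(4))
print("total     per class:", tu.round(4))
```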
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.