Credal Concept Bottleneck Models: Structural Separation of Epistemic and Aleatoric Uncertainty
- URL: http://arxiv.org/abs/2602.11219v1
- Date: Wed, 11 Feb 2026 10:54:57 GMT
- Title: Credal Concept Bottleneck Models: Structural Separation of Epistemic and Aleatoric Uncertainty
- Authors: Tanmoy Mukherjee, Marius Kloft, Pierre Marquis, Zied Bouraoui
- Abstract summary: We propose a credal-set formulation in which uncertainty is represented as a set of predictive distributions. We instantiate this idea in a Variational Credal Concept Bottleneck Model with two disjoint uncertainty heads trained by disjoint objectives.
- Score: 38.040646841317965
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Decomposing predictive uncertainty into epistemic (model ignorance) and aleatoric (data ambiguity) components is central to reliable decision making, yet most methods estimate both from the same predictive distribution. Recent empirical and theoretical results show these estimates are typically strongly correlated, so changes in predictive spread simultaneously affect both components and blur their semantics. We propose a credal-set formulation in which uncertainty is represented as a set of predictive distributions, so that epistemic and aleatoric uncertainty correspond to distinct geometric properties: the size of the set versus the noise within its elements. We instantiate this idea in a Variational Credal Concept Bottleneck Model with two disjoint uncertainty heads trained by disjoint objectives and non-overlapping gradient paths, yielding separation by construction rather than post hoc decomposition. Across multi-annotator benchmarks, our approach reduces the correlation between epistemic and aleatoric uncertainty by over an order of magnitude compared to standard methods, while improving the alignment of epistemic uncertainty with prediction error and aleatoric uncertainty with ground-truth ambiguity.
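The geometric separation described in the abstract can be sketched numerically. The following is a minimal, illustrative Python example, not the paper's implementation: a credal set is represented as an ensemble of categorical distributions, epistemic uncertainty is measured as the set's size (here, its total-variation diameter, one possible choice of set-size measure), and aleatoric uncertainty as the mean entropy of the set's elements. The function name and the diameter metric are assumptions made for illustration.

```python
import numpy as np

def credal_uncertainties(dists):
    """Given a credal set as an (m, k) array of m categorical
    distributions over k classes, return (epistemic, aleatoric).

    Epistemic uncertainty: the geometric size of the set, taken here
    as its diameter under total-variation distance.
    Aleatoric uncertainty: the mean Shannon entropy (in nats) of the
    set's individual elements.
    """
    dists = np.asarray(dists, dtype=float)
    # Diameter: maximum pairwise total-variation distance between members.
    diffs = dists[:, None, :] - dists[None, :, :]
    epistemic = float(0.5 * np.abs(diffs).sum(axis=-1).max())
    # Mean entropy of the members (clip to avoid log(0)).
    p = np.clip(dists, 1e-12, 1.0)
    aleatoric = float((-p * np.log(p)).sum(axis=-1).mean())
    return epistemic, aleatoric
```

In this view the two quantities move independently: a tight set of near-uniform distributions has zero epistemic but high aleatoric uncertainty, while a spread-out set of confident distributions has the opposite profile. This independence of the two geometric properties is the separation the paper enforces by construction rather than recovering post hoc.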
Related papers
- Set-based vs. Distribution-based Representations of Epistemic Uncertainty: A Comparative Study [15.533120446404228]
Epistemic uncertainty in neural networks is commonly modeled using two second-order paradigms. We present a comparative, like-for-like evaluation of the two paradigms.
arXiv Detail & Related papers (2026-02-26T08:36:09Z) - Bridging the Gap Between Bayesian Deep Learning and Ensemble Weather Forecasts [100.26854618129039]
Weather forecasting is fundamentally challenged by the chaotic nature of the atmosphere. Recent advances in Bayesian Deep Learning (BDL) offer a promising but often disconnected alternative. We bridge these paradigms through a unified hybrid BDL framework for ensemble weather forecasting.
arXiv Detail & Related papers (2025-11-18T07:49:52Z) - Uncertainty Estimation using Variance-Gated Distributions [0.6340400318304492]
We propose an intuitive framework for uncertainty estimation and decomposition based on the signal-to-noise ratio of class probability distributions. We introduce a variance-gated measure that scales predictions by a confidence factor derived from ensembles.
arXiv Detail & Related papers (2025-09-07T16:19:21Z) - Why Machine Learning Models Fail to Fully Capture Epistemic Uncertainty [3.4970971805884474]
We make use of a more fine-grained taxonomy of epistemic uncertainty sources in machine learning models. We show that high model bias can lead to misleadingly low estimates of epistemic uncertainty. Common second-order uncertainty methods systematically blur bias-induced errors into aleatoric estimates.
arXiv Detail & Related papers (2025-05-29T14:50:46Z) - Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z) - Uncertainty Quantification for Traffic Forecasting: A Unified Approach [21.556559649467328]
Uncertainty is an essential consideration for time series forecasting tasks.
In this work, we focus on quantifying the uncertainty of traffic forecasting.
We develop Deep Spatio-Temporal Uncertainty Quantification (DeepSTUQ), which can estimate both aleatoric and epistemic uncertainty.
arXiv Detail & Related papers (2022-08-11T15:21:53Z) - Dense Uncertainty Estimation via an Ensemble-based Conditional Latent Variable Model [68.34559610536614]
We argue that the aleatoric uncertainty is an inherent attribute of the data and can only be correctly estimated with an unbiased oracle model.
We propose a new sampling and selection strategy at train time to approximate the oracle model for aleatoric uncertainty estimation.
Our results show that our solution achieves both accurate deterministic results and reliable uncertainty estimation.
arXiv Detail & Related papers (2021-11-22T08:54:10Z) - DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is part of out-of-sample prediction error due to the lack of knowledge of the learner.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty.
arXiv Detail & Related papers (2021-02-16T23:50:35Z) - The Hidden Uncertainty in a Neural Network's Activations [105.4223982696279]
The distribution of a neural network's latent representations has been successfully used to detect out-of-distribution (OOD) data.
This work investigates whether this distribution correlates with a model's epistemic uncertainty, thus indicating its ability to generalise to novel inputs.
arXiv Detail & Related papers (2020-12-05T17:30:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.