Epistemic Wrapping for Uncertainty Quantification
- URL: http://arxiv.org/abs/2505.02277v1
- Date: Sun, 04 May 2025 22:15:56 GMT
- Title: Epistemic Wrapping for Uncertainty Quantification
- Authors: Maryam Sultana, Neil Yorke-Smith, Kaizheng Wang, Shireen Kudukkil Manchingal, Muhammad Mubashar, Fabio Cuzzolin
- Abstract summary: Uncertainty estimation is pivotal in machine learning, especially for classification tasks. We introduce a novel `Epistemic Wrapping' methodology aimed at improving uncertainty estimation in classification.
- Score: 6.967551514640606
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Uncertainty estimation is pivotal in machine learning, especially for classification tasks, as it improves the robustness and reliability of models. We introduce a novel `Epistemic Wrapping' methodology aimed at improving uncertainty estimation in classification. Our approach uses Bayesian Neural Networks (BNNs) as a baseline and transforms their outputs into belief function posteriors, effectively capturing epistemic uncertainty and offering an efficient and general methodology for uncertainty quantification. Comprehensive experiments employing a BNN baseline and an Interval Neural Network for inference on the MNIST, Fashion-MNIST, CIFAR-10 and CIFAR-100 datasets demonstrate that our Epistemic Wrapper significantly enhances generalisation and uncertainty quantification.
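The wrapper itself is only described at a high level here. As a rough illustration of the general idea (not the paper's exact construction), one can draw Monte Carlo samples from a BNN's posterior predictive and summarise them as per-class lower and upper probability bounds, a belief/plausibility-style envelope. All function and variable names below are hypothetical.

```python
import numpy as np

def epistemic_envelope(prob_samples):
    """Summarise BNN posterior-predictive samples as per-class probability bounds.

    prob_samples: array of shape (n_samples, n_classes), each row a softmax
    output obtained from one draw of the network weights.
    Returns (lower, upper): element-wise envelopes that play the role of
    belief / plausibility style bounds in this toy sketch.
    """
    lower = prob_samples.min(axis=0)   # pessimistic support for each class
    upper = prob_samples.max(axis=0)   # optimistic support for each class
    return lower, upper

# Toy usage with made-up "BNN" samples: 20 weight draws, 4 classes.
rng = np.random.default_rng(0)
logits = rng.normal(loc=[2.0, 0.5, 0.0, -1.0], scale=0.7, size=(20, 4))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

lower, upper = epistemic_envelope(probs)
epistemic_width = (upper - lower).mean()   # wider envelope -> more epistemic uncertainty
print(lower.round(3), upper.round(3), round(float(epistemic_width), 3))
```

In this toy setting a wide envelope signals high epistemic uncertainty; the paper's transformation of BNN outputs into belief function posteriors is more principled than this min/max summary.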
Related papers
- Enhancing Uncertainty Estimation and Interpretability via Bayesian Non-negative Decision Layer [55.66973223528494]
We develop a Bayesian Non-negative Decision Layer (BNDL), which reformulates deep neural networks as a conditional Bayesian non-negative factor analysis. BNDL can model complex dependencies and provide robust uncertainty estimation. We also offer theoretical guarantees that BNDL can achieve effective disentangled learning.
arXiv Detail & Related papers (2025-05-28T10:23:34Z)
- Quantification of Uncertainties in Probabilistic Deep Neural Network by Implementing Boosting of Variational Inference [0.38366697175402226]
Boosted Bayesian Neural Networks (BBNN) are a novel approach that enhances neural network weight distribution approximations. BBNN achieves 5% higher accuracy compared to conventional neural networks.
arXiv Detail & Related papers (2025-03-18T05:11:21Z)
- Evidential Uncertainty Probes for Graph Neural Networks [3.5169632430086315]
We propose a plug-and-play framework for uncertainty quantification in Graph Neural Networks (GNNs). Our Evidential Probing Network (EPN) uses a lightweight Multi-Layer Perceptron (MLP) head to extract evidence from learned representations. EPN-reg achieves state-of-the-art performance in accurate and efficient uncertainty quantification, making it suitable for real-world deployment.
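The summary does not spell out how the extracted evidence is turned into uncertainty. A common evidential-deep-learning convention (assumed here, not necessarily EPN's exact formulation) maps non-negative per-class evidence to Dirichlet parameters and reads off belief and vacuity:

```python
import numpy as np

def dirichlet_uncertainty(evidence):
    """Map non-negative per-class evidence to Dirichlet parameters and
    subjective-logic style quantities (standard EDL convention, not
    necessarily the exact EPN formulation)."""
    evidence = np.asarray(evidence, dtype=float)
    k = evidence.shape[-1]
    alpha = evidence + 1.0                 # Dirichlet concentration parameters
    strength = alpha.sum(axis=-1, keepdims=True)
    belief = evidence / strength           # per-class belief masses
    vacuity = k / strength.squeeze(-1)     # "I don't know" mass; high when evidence is scarce
    expected_prob = alpha / strength       # mean of the Dirichlet
    return belief, vacuity, expected_prob

# Example: a confident sample vs. an out-of-distribution-looking one.
for e in ([40.0, 1.0, 1.0], [0.2, 0.3, 0.1]):
    belief, vacuity, p = dirichlet_uncertainty(e)
    print(np.round(p, 3), "vacuity:", round(float(vacuity), 3))
```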
arXiv Detail & Related papers (2025-03-11T07:00:54Z)
- Quantifying calibration error in modern neural networks through evidence based theory [0.0]
This paper introduces a novel framework for quantifying the trustworthiness of neural networks by incorporating subjective logic into the evaluation of Expected Calibration Error (ECE). We demonstrate the effectiveness of this approach through experiments on the MNIST and CIFAR-10 datasets, where post-calibration results indicate improved trustworthiness. The proposed framework offers a more interpretable and nuanced assessment of AI models, with potential applications in sensitive domains such as healthcare and autonomous systems.
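For reference, the underlying Expected Calibration Error is the standard binned gap between confidence and accuracy. A minimal implementation is sketched below; the paper's subjective-logic extension is not reproduced.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Standard equal-width-bin ECE: weighted average of |accuracy - confidence|
    per confidence bin. `confidences` are max softmax probabilities, `correct`
    is a boolean array saying whether each prediction was right."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            acc = correct[in_bin].mean()
            conf = confidences[in_bin].mean()
            ece += in_bin.mean() * abs(acc - conf)
    return ece

# Toy example: over-confident predictions yield a nonzero ECE.
conf = np.array([0.95, 0.9, 0.85, 0.99, 0.8, 0.75])
hit = np.array([1, 0, 1, 1, 0, 1])
print(round(expected_calibration_error(conf, hit), 3))
```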
arXiv Detail & Related papers (2024-10-31T23:54:21Z)
- CreINNs: Credal-Set Interval Neural Networks for Uncertainty Estimation in Classification Tasks [4.904199965391026]
This work presents a novel approach, termed Credal-Set Interval Neural Networks (CreINNs), for classification. CreINNs are designed to predict an upper and a lower probability bound for each class, rather than a single probability value. Experiments on standard multiclass and binary classification tasks demonstrate that the proposed CreINNs can achieve superior or comparable quality of uncertainty estimation.
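As an illustration only (not the decision rule from the CreINN paper), per-class probability intervals can be post-processed into a point prediction and a simple uncertainty score:

```python
import numpy as np

def decide_from_intervals(lower, upper):
    """Illustrative post-processing of per-class probability intervals:
    predict the class with the highest interval midpoint and report the
    mean interval width as a crude epistemic-uncertainty score."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    midpoint = (lower + upper) / 2.0
    prediction = int(np.argmax(midpoint))
    uncertainty = float((upper - lower).mean())
    return prediction, uncertainty

print(decide_from_intervals([0.55, 0.10, 0.05], [0.80, 0.30, 0.20]))
```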
arXiv Detail & Related papers (2024-01-10T10:04:49Z)
- Tractable Function-Space Variational Inference in Bayesian Neural Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Uncertainty Estimation by Fisher Information-based Evidential Deep Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce the Fisher Information Matrix (FIM) to measure the informativeness of the evidence carried by each sample, and use it to dynamically reweight the objective loss terms so that the network focuses more on the representation learning of uncertain classes.
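For context, the Fisher Information Matrix of a Dirichlet distribution has a closed form, $\mathrm{diag}(\psi'(\alpha_i)) - \psi'(\alpha_0)\,\mathbf{1}\mathbf{1}^\top$ with $\psi'$ the trigamma function and $\alpha_0 = \sum_i \alpha_i$. The sketch below computes it together with a log-determinant summary as one plausible informativeness score; the exact reweighting used by $\mathcal{I}$-EDL is not reproduced here.

```python
import numpy as np
from scipy.special import polygamma  # polygamma(1, x) is the trigamma function

def dirichlet_fisher_information(alpha):
    """Closed-form Fisher Information Matrix of Dirichlet(alpha):
    diag(trigamma(alpha_i)) - trigamma(alpha_0) * 1 1^T, alpha_0 = sum(alpha)."""
    alpha = np.asarray(alpha, dtype=float)
    trigamma = lambda x: polygamma(1, x)
    return np.diag(trigamma(alpha)) - trigamma(alpha.sum()) * np.ones((alpha.size, alpha.size))

def informativeness(alpha):
    """One possible scalar summary (log-determinant) of how informative the
    evidence behind `alpha` is; illustrative, not the exact I-EDL weighting."""
    _, logdet = np.linalg.slogdet(dirichlet_fisher_information(alpha))
    return logdet

print(round(informativeness([1.1, 1.1, 1.1]), 3))   # weak evidence
print(round(informativeness([20.0, 2.0, 2.0]), 3))  # stronger evidence
```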
arXiv Detail & Related papers (2023-03-03T16:12:59Z)
- Toward Robust Uncertainty Estimation with Random Activation Functions [3.0586855806896045]
We propose a novel approach for uncertainty quantification via ensembles, called Random Activation Functions (RAFs) Ensemble.
RAFs Ensemble outperforms state-of-the-art ensemble uncertainty quantification methods on both synthetic and real-world datasets.
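A toy illustration of the idea, assuming the simplest possible setup (tiny randomly initialised MLPs rather than the trained members used in the paper): each ensemble member draws its hidden activation from a pool, and the spread of member outputs serves as the uncertainty signal.

```python
import numpy as np

rng = np.random.default_rng(1)
ACTIVATIONS = [np.tanh,
               lambda z: np.maximum(z, 0.0),        # ReLU
               lambda z: z / (1.0 + np.exp(-z))]    # SiLU

def random_activation_net(n_in, n_hidden, n_out):
    """One ensemble member: a tiny MLP whose hidden activation is picked at
    random from the pool (illustrative; real members would be trained)."""
    act = ACTIVATIONS[rng.integers(len(ACTIVATIONS))]
    w1, b1 = rng.normal(size=(n_in, n_hidden)), np.zeros(n_hidden)
    w2, b2 = rng.normal(size=(n_hidden, n_out)), np.zeros(n_out)
    return lambda x: act(x @ w1 + b1) @ w2 + b2

ensemble = [random_activation_net(2, 16, 1) for _ in range(10)]
x = rng.normal(size=(5, 2))
preds = np.stack([member(x) for member in ensemble])   # (members, batch, outputs)
mean, std = preds.mean(axis=0), preds.std(axis=0)      # spread across members ~ uncertainty
print(np.round(std.squeeze(), 3))
```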
arXiv Detail & Related papers (2023-02-28T13:17:56Z)
- The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
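For context on what is being critiqued: Deep Evidential Regression predicts Normal-Inverse-Gamma parameters $(\gamma, \nu, \alpha, \beta)$ and reads off aleatoric uncertainty as $\mathbb{E}[\sigma^2] = \beta/(\alpha-1)$ and epistemic uncertainty as $\mathrm{Var}[\mu] = \beta/(\nu(\alpha-1))$. A minimal readout under those standard formulas:

```python
def der_uncertainties(gamma, nu, alpha, beta):
    """Standard Deep Evidential Regression uncertainty decomposition from
    predicted Normal-Inverse-Gamma parameters:
    prediction  E[mu]      = gamma
    aleatoric   E[sigma^2] = beta / (alpha - 1)        (requires alpha > 1)
    epistemic   Var[mu]    = beta / (nu * (alpha - 1))"""
    aleatoric = beta / (alpha - 1.0)
    epistemic = beta / (nu * (alpha - 1.0))
    return gamma, aleatoric, epistemic

print(der_uncertainties(gamma=0.3, nu=2.0, alpha=3.0, beta=1.5))
```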
arXiv Detail & Related papers (2022-05-20T10:10:32Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show a principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
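The core object here is a Nadaraya-Watson kernel estimate of the conditional label distribution $p(y \mid x)$. Below is a minimal numpy sketch with a Gaussian kernel, using predictive entropy as a crude uncertainty proxy; NUQ's actual uncertainty measure is more refined.

```python
import numpy as np

def nadaraya_watson_label_dist(x_query, x_train, y_train, n_classes, bandwidth=1.0):
    """Nadaraya-Watson estimate of the conditional label distribution p(y | x):
    a kernel-weighted average of one-hot training labels around the query point."""
    d2 = np.sum((x_train - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))     # Gaussian kernel weights
    one_hot = np.eye(n_classes)[y_train]
    return w @ one_hot / w.sum()

rng = np.random.default_rng(2)
x_tr = np.vstack([rng.normal(-2, 1, (30, 2)), rng.normal(2, 1, (30, 2))])
y_tr = np.array([0] * 30 + [1] * 30)

for q in (np.array([-2.0, 0.0]), np.array([0.0, 0.0])):   # in-class vs. boundary query
    p = nadaraya_watson_label_dist(q, x_tr, y_tr, n_classes=2)
    entropy = -np.sum(p * np.log(p + 1e-12))               # crude uncertainty proxy
    print(np.round(p, 3), "entropy:", round(float(entropy), 3))
```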
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)