Quantifying Epistemic Predictive Uncertainty in Conformal Prediction
- URL: http://arxiv.org/abs/2602.01667v1
- Date: Mon, 02 Feb 2026 05:38:07 GMT
- Title: Quantifying Epistemic Predictive Uncertainty in Conformal Prediction
- Authors: Siu Lun Chau, Soroush H. Zargarbashi, Yusuf Sale, Michele Caprio,
- Abstract summary: We study the problem of quantifying uncertainty faced at prediction time due to the existence of multiple plausible predictive models. We build on recent results showing that, under a mild assumption, any full conformal prediction procedure induces a set of closed and convex predictive distributions. We propose a computationally efficient and analytically tractable uncertainty measure, based on *Maximum Mean Imprecision*, to quantify the uncertainty.
- Score: 11.09458914721516
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the problem of quantifying epistemic predictive uncertainty (EPU) -- that is, uncertainty faced at prediction time due to the existence of multiple plausible predictive models -- within the framework of conformal prediction (CP). To expose the implicit model multiplicity underlying CP, we build on recent results showing that, under a mild assumption, any full CP procedure induces a set of closed and convex predictive distributions, commonly referred to as a credal set. Importantly, the conformal prediction region (CPR) coincides exactly with the set of labels to which all distributions in the induced credal set assign probability at least $1-\alpha$. As our first contribution, we prove that this characterisation also holds in split CP. Building on this connection, we then propose a computationally efficient and analytically tractable uncertainty measure, based on *Maximum Mean Imprecision*, to quantify the EPU by measuring the degree of conflicting information within the induced credal set. Experiments on active learning and selective classification demonstrate that the quantified EPU provides substantially more informative and fine-grained uncertainty assessments than reliance on CPR size alone. More broadly, this work highlights the potential of CP to serve as a principled basis for decision-making under epistemic uncertainty.
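As a point of reference for the split-CP setting the abstract builds on, the standard split conformal prediction region can be sketched as below. This is a minimal illustration of the generic recipe (finite-sample quantile of calibration nonconformity scores), not the paper's credal-set or Maximum Mean Imprecision machinery; the function names and the score values are illustrative assumptions.

```python
import numpy as np

def conformal_quantile(cal_scores, alpha):
    """Finite-sample conformal quantile: the k-th smallest calibration
    nonconformity score, with k = ceil((n + 1) * (1 - alpha))."""
    n = len(cal_scores)
    k = min(int(np.ceil((n + 1) * (1 - alpha))), n)
    return np.sort(cal_scores)[k - 1]

def prediction_region(cal_scores, test_label_scores, alpha=0.1):
    """Labels whose nonconformity score does not exceed the conformal
    quantile; this is the (1 - alpha) split-CP prediction region."""
    q = conformal_quantile(cal_scores, alpha)
    return [label for label, s in test_label_scores.items() if s <= q]

# Example: 20 calibration scores and per-label scores for one test point.
cal = np.linspace(0.05, 1.0, 20)                      # 0.05, 0.10, ..., 1.00
scores = {"cat": 0.20, "dog": 0.90, "fox": 0.96}
print(prediction_region(cal, scores))                 # -> ['cat', 'dog']
```

Under exchangeability, the region above contains the true label with probability at least 1 - alpha; the paper's contribution is to read this same region off a credal set and to quantify the disagreement inside that set, rather than using only the region's size.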
Related papers
- eCP: Informative uncertainty quantification via Equivariantized Conformal Prediction with pre-trained models [3.1424353049227727]
We study the effect of group symmetrization of pre-trained models on conformal prediction (CP). We propose infusing CP with geometric information via group-averaging of the pre-trained predictor to distribute the non-conformity mass across the orbits. Our approach provably yields contracted non-conformity scores in increasing convex order, implying improved exponential-tail bounds and sharper conformal prediction sets in expectation.
arXiv Detail & Related papers (2026-02-03T20:18:59Z) - Bridging the Gap Between Bayesian Deep Learning and Ensemble Weather Forecasts [100.26854618129039]
Weather forecasting is fundamentally challenged by the chaotic nature of the atmosphere. Recent advances in Bayesian Deep Learning (BDL) offer a promising but often disconnected alternative. We bridge these paradigms through a unified hybrid BDL framework for ensemble weather forecasting.
arXiv Detail & Related papers (2025-11-18T07:49:52Z) - Credal Ensemble Distillation for Uncertainty Quantification [12.36665123584814]
We propose credal ensemble distillation (CED), a framework that compresses a deep ensemble into a single model, CREDIT, for classification tasks. CED achieves superior or comparable uncertainty estimation compared to several existing baselines, while substantially reducing inference overhead compared to deep ensembles.
arXiv Detail & Related papers (2025-11-14T14:53:42Z) - Optimal Conformal Prediction under Epistemic Uncertainty [61.46247583794497]
Conformal prediction (CP) is a popular framework for representing uncertainty. We introduce Bernoulli prediction sets (BPS), which produce the smallest prediction sets that ensure conditional coverage. When given first-order predictions, BPS reduces to the well-known adaptive prediction sets (APS).
arXiv Detail & Related papers (2025-05-25T08:32:44Z) - SConU: Selective Conformal Uncertainty in Large Language Models [59.25881667640868]
We propose a novel approach termed Selective Conformal Uncertainty (SConU). We develop two conformal p-values that are instrumental in determining whether a given sample deviates from the uncertainty distribution of the calibration set at a specific manageable risk level. Our approach not only facilitates rigorous management of miscoverage rates across both single-domain and interdisciplinary contexts, but also enhances the efficiency of predictions.
arXiv Detail & Related papers (2025-04-19T03:01:45Z) - Provably Reliable Conformal Prediction Sets in the Presence of Data Poisoning [53.42244686183879]
Conformal prediction provides model-agnostic and distribution-free uncertainty quantification. Yet, conformal prediction is not reliable under poisoning attacks, where adversaries manipulate both training and calibration data. We propose reliable prediction sets (RPS): the first efficient method for constructing conformal prediction sets with provable reliability guarantees under poisoning.
arXiv Detail & Related papers (2024-10-13T15:37:11Z) - Conformal Depression Prediction [2.097941594997818]
Conformal depression prediction (CDP) is a depression prediction method with uncertainty quantification based on conformal prediction (CP).
CDP is a plug-and-play module that requires neither model retraining nor an assumption about the depression data distribution.
We propose CDP-ACC, an improved conformal prediction with approximate conditional coverage.
arXiv Detail & Related papers (2024-05-29T03:08:30Z) - An Information Theoretic Perspective on Conformal Prediction [15.194199235970242]
Conformal Prediction (CP) constructs prediction sets guaranteed to contain the true answer with a user-specified probability. In this work, we leverage information theory to connect conformal prediction to other notions of uncertainty.
arXiv Detail & Related papers (2024-05-03T14:43:07Z) - Model-free generalized fiducial inference [0.0]
Conformal prediction (CP) was developed to provide finite-sample probabilistic prediction guarantees. While CP algorithms are a relatively general-purpose approach to uncertainty quantification with finite-sample guarantees, they lack versatility. In this paper, tools are offered from imprecise probability theory to build a formal connection between CP and generalized fiducial (GF) inference.
arXiv Detail & Related papers (2023-07-24T01:58:48Z) - Quantifying Deep Learning Model Uncertainty in Conformal Prediction [1.4685355149711297]
Conformal Prediction is a promising framework for representing model uncertainty.
In this paper, we explore state-of-the-art CP methodologies and their theoretical foundations.
arXiv Detail & Related papers (2023-06-01T16:37:50Z) - When in Doubt: Neural Non-Parametric Uncertainty Quantification for Epidemic Forecasting [70.54920804222031]
Most existing forecasting models disregard uncertainty quantification, resulting in mis-calibrated predictions.
Recent works in deep neural models for uncertainty-aware time-series forecasting also have several limitations.
We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP.
arXiv Detail & Related papers (2021-06-07T18:31:47Z) - DEUP: Direct Epistemic Uncertainty Prediction [56.087230230128185]
Epistemic uncertainty is the part of out-of-sample prediction error due to the learner's lack of knowledge.
We propose a principled approach for directly estimating epistemic uncertainty by learning to predict generalization error and subtracting an estimate of aleatoric uncertainty.
arXiv Detail & Related papers (2021-02-16T23:50:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.