Making Conformal Predictors Robust in Healthcare Settings: a Case Study on EEG Classification
- URL: http://arxiv.org/abs/2602.19483v1
- Date: Mon, 23 Feb 2026 03:59:32 GMT
- Title: Making Conformal Predictors Robust in Healthcare Settings: a Case Study on EEG Classification
- Authors: Arjun Chatterjee, Sayeed Sajjad Razin, John Wu, Siddhartha Laghuvarapu, Jathurshan Pradeepkumar, Jimeng Sun
- Abstract summary: Quantifying uncertainty in clinical predictions is critical for high-stakes diagnosis tasks. We evaluate several conformal prediction approaches on EEG seizure classification. We demonstrate that personalized calibration strategies can improve coverage by over 20 percentage points.
- Score: 13.423110268586287
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Quantifying uncertainty in clinical predictions is critical for high-stakes diagnosis tasks. Conformal prediction offers a principled approach by providing prediction sets with theoretical coverage guarantees. However, in practice, patient distribution shifts violate the i.i.d. assumptions underlying standard conformal methods, leading to poor coverage in healthcare settings. In this work, we evaluate several conformal prediction approaches on EEG seizure classification, a task with known distribution shift challenges and label uncertainty. We demonstrate that personalized calibration strategies can improve coverage by over 20 percentage points while maintaining comparable prediction set sizes. Our implementation is available via PyHealth, an open-source healthcare AI framework: https://github.com/sunlabuiuc/PyHealth.
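For readers unfamiliar with the mechanics behind "prediction sets with theoretical coverage guarantees," the following is a minimal, self-contained sketch of split conformal prediction for classification on synthetic softmax outputs. It is illustrative only, not the paper's PyHealth implementation: the nonconformity score (1 minus the true-class probability), the class count, and all data are stand-in assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in softmax outputs for a 3-class problem (not a real EEG model).
n_cal, n_test, n_classes = 500, 200, 3
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(0, n_classes, size=n_cal)
test_probs = rng.dirichlet(np.ones(n_classes), size=n_test)
test_labels = rng.integers(0, n_classes, size=n_test)

alpha = 0.1  # target miscoverage: aim for >= 90% coverage

# Nonconformity score: 1 - softmax probability of the true class.
cal_scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile with the standard finite-sample correction.
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
qhat = np.quantile(cal_scores, min(q_level, 1.0), method="higher")

# Prediction set: every class whose score stays below the threshold.
pred_sets = test_probs >= (1.0 - qhat)  # boolean array, (n_test, n_classes)

# Empirical coverage: fraction of test points whose true label is in the set.
coverage = pred_sets[np.arange(n_test), test_labels].mean()
avg_set_size = pred_sets.sum(axis=1).mean()
```

Because calibration and test scores here are exchangeable, `coverage` lands near the 90% target; the paper's point is that patient distribution shift breaks exactly this exchangeability assumption.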
Related papers
- Conditional Coverage Diagnostics for Conformal Prediction [47.93989136542648]
We show that conditional coverage estimation can be cast as a classification problem. We call the resulting family of metrics the excess risk of the target coverage (ERT). We release an open-source package for ERT as well as previous conditional coverage metrics.
arXiv Detail & Related papers (2025-12-12T18:47:39Z)
- Distribution-informed Online Conformal Prediction [53.674678995825666]
We propose Conformal Optimistic Prediction (COP), an online conformal prediction algorithm that incorporates the underlying data pattern into the update rule. COP produces tighter prediction sets when a predictable pattern exists, while retaining valid coverage guarantees even when estimates are inaccurate. We prove that COP achieves valid coverage and constructs shorter prediction intervals than other baselines.
arXiv Detail & Related papers (2025-12-08T17:51:49Z)
- Towards Trustworthy Vital Sign Forecasting: Leveraging Uncertainty for Prediction Intervals [32.233133404873016]
We present two methods for deriving PIs from the Reconstruction Uncertainty Estimate (RUE), an uncertainty measure well-suited to vital-sign forecasting. We evaluate these methods on two large public datasets with minute- and hour-level sampling, representing high- and low-frequency health signals.
arXiv Detail & Related papers (2025-09-01T10:03:26Z)
- Pitfalls of Conformal Predictions for Medical Image Classification [1.2289361708127877]
Conformal predictions can provide provable calibration guarantees. However, conformal predictions are unreliable under distributional shifts in input and label variables. In classification settings with a small number of classes, conformal predictions have limited practical value.
arXiv Detail & Related papers (2025-06-22T20:33:38Z)
- Optimal Conformal Prediction under Epistemic Uncertainty [61.46247583794497]
Conformal prediction (CP) is a popular framework for representing uncertainty. We introduce Bernoulli prediction sets (BPS), which produce the smallest prediction sets that ensure conditional coverage. When given first-order predictions, BPS reduces to the well-known adaptive prediction sets (APS).
arXiv Detail & Related papers (2025-05-25T08:32:44Z)
- Conformal Prediction Sets with Improved Conditional Coverage using Trust Scores [52.92618442300405]
It is impossible to achieve exact, distribution-free conditional coverage in finite samples. We propose an alternative conformal prediction algorithm that targets coverage where it matters most.
arXiv Detail & Related papers (2025-01-17T12:01:56Z)
- A conformalized learning of a prediction set with applications to medical imaging classification [14.304858613146536]
We present an algorithm that can produce a prediction set containing the true label with a user-specified probability, such as 90%.
We applied the proposed algorithm to several standard medical imaging classification datasets.
arXiv Detail & Related papers (2024-08-09T12:49:04Z)
- Equal Opportunity of Coverage in Fair Regression [50.76908018786335]
We study fair machine learning (ML) under predictive uncertainty to enable reliable and trustworthy decision-making.
We propose Equal Opportunity of Coverage (EOC) that aims to achieve two properties: (1) coverage rates for different groups with similar outcomes are close, and (2) the coverage rate for the entire population remains at a predetermined level.
arXiv Detail & Related papers (2023-11-03T21:19:59Z)
- Distribution-free uncertainty quantification for classification under label shift [105.27463615756733]
We focus on uncertainty quantification (UQ) for classification problems via two avenues.
We first argue that label shift hurts UQ, by showing degradation in coverage and calibration.
We examine these techniques theoretically in a distribution-free framework and demonstrate their excellent practical performance.
arXiv Detail & Related papers (2021-03-04T20:51:03Z)
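The label-shift entry above notes that shifted class priors degrade coverage. A common remedy in the weighted-conformal literature is to reweight calibration scores by the class-prior likelihood ratio; the synthetic sketch below illustrates this with made-up priors and class-conditional scores (all numbers are assumptions, and it omits the test point's own weight used in the full weighted-conformal construction, so it is not the cited paper's exact method):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.1

# Class-conditional nonconformity scores: class 2 is "hard" (higher scores).
def sample_scores(labels):
    means = np.array([0.2, 0.4, 0.7])
    return np.clip(rng.normal(means[labels], 0.1), 0.0, 1.0)

p_cal = np.array([0.6, 0.3, 0.1])   # calibration class priors
p_test = np.array([0.1, 0.3, 0.6])  # shifted test priors (label shift)

y_cal = rng.choice(3, size=2000, p=p_cal)
y_test = rng.choice(3, size=2000, p=p_test)
s_cal, s_test = sample_scores(y_cal), sample_scores(y_test)

# Standard split-conformal threshold: ignores the shift entirely.
n = len(s_cal)
q_level = np.ceil((n + 1) * (1 - alpha)) / n
qhat_std = np.quantile(s_cal, min(q_level, 1.0), method="higher")

# Likelihood-ratio weights w(y) = p_test(y) / p_cal(y) on calibration scores,
# then a weighted (1 - alpha) quantile.
w = (p_test / p_cal)[y_cal]
order = np.argsort(s_cal)
cum = np.cumsum(w[order]) / w.sum()
qhat_w = s_cal[order][np.searchsorted(cum, 1 - alpha)]

cov_std = (s_test <= qhat_std).mean()  # undercovers: hard class is overrepresented
cov_w = (s_test <= qhat_w).mean()      # reweighting restores ~90% coverage
```

Because the test distribution puts most of its mass on the hard class, the unweighted threshold undercovers badly, while the reweighted quantile comes back near the nominal level.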