Enhancing Conformal Prediction via Class Similarity
- URL: http://arxiv.org/abs/2511.19359v1
- Date: Mon, 24 Nov 2025 17:56:42 GMT
- Title: Enhancing Conformal Prediction via Class Similarity
- Authors: Ariel Fargion, Lahav Dabah, Tom Tirer
- Abstract summary: Conformal Prediction (CP) has emerged as a powerful statistical framework for high-stakes classification applications. We propose augmenting the CP score function with a term that penalizes predictions with out-of-group errors. We show mathematically that, for common class partitions, it can also reduce the average set size of any CP score function.
- Score: 13.38174941551702
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conformal Prediction (CP) has emerged as a powerful statistical framework for high-stakes classification applications. Instead of predicting a single class, CP generates a prediction set, guaranteed to include the true label with a pre-specified probability. The performance of different CP methods is typically assessed by their average prediction set size. In setups where the classes can be partitioned into semantic groups, e.g., diseases that require similar treatment, users can benefit from prediction sets that are not only small on average, but also contain a small number of semantically different groups. This paper begins by addressing this problem and ultimately offers a widely applicable tool for boosting any CP method on any dataset. First, given a class partition, we propose augmenting the CP score function with a term that penalizes predictions with out-of-group errors. We theoretically analyze this strategy and prove its advantages for group-related metrics. Surprisingly, we show mathematically that, for common class partitions, it can also reduce the average set size of any CP score function. Our analysis reveals the class similarity factors behind this improvement and motivates us to propose a model-specific variant, which does not require any human semantic partition and can further reduce the prediction set size. Finally, we present an extensive empirical study, encompassing prominent CP methods, multiple models, and several datasets, which demonstrates that our class-similarity-based approach consistently enhances CP methods.
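The abstract's core idea — split conformal prediction with a score function augmented by an out-of-group penalty — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact method: the base nonconformity score is the common `1 - softmax` (LAC/THR) score, the penalty is a simple 0/1 indicator for classes outside the group of the model's top prediction, and the weight `lam` and the `groups` array are hypothetical placeholders for the paper's class-similarity term.

```python
import numpy as np

def conformal_sets_with_group_penalty(cal_scores, cal_labels, test_scores,
                                      groups, lam=0.5, alpha=0.1):
    """Split conformal prediction with a hypothetical group-penalty term.

    cal_scores / test_scores: logits, shape (n, K)
    groups: length-K array assigning each class to a semantic group
    Returns one prediction set (array of class indices) per test point.
    """
    def softmax(z):
        e = np.exp(z - z.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    def augmented(scores):
        p = softmax(scores)
        top_group = groups[p.argmax(axis=1)]            # group of the top-1 class
        # 1 for classes outside the top prediction's group, else 0
        penalty = (groups[None, :] != top_group[:, None]).astype(float)
        return (1.0 - p) + lam * penalty                # higher = less conforming

    n = len(cal_labels)
    cal_s = augmented(cal_scores)[np.arange(n), cal_labels]
    # conformal quantile with the finite-sample (n + 1) correction
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(cal_s, level, method="higher")
    return [np.where(s <= q)[0] for s in augmented(test_scores)]
```

Because the same augmented score is used for calibration and test points, exchangeability — and hence the marginal 1 - alpha coverage guarantee — is preserved; the penalty only reshapes which classes enter the set, discouraging out-of-group labels.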
Related papers
- Provably Minimum-Length Conformal Prediction Sets for Ordinal Classification [17.822517655243427]
We propose an ordinal-CP method that is model-agnostic and provides instance-level optimal prediction intervals. Experiments on four benchmark datasets from diverse domains demonstrate significantly improved predictive efficiency.
arXiv Detail & Related papers (2025-11-20T23:00:15Z) - Hierarchical Conformal Classification [5.964388602612373]
Conformal prediction (CP) is a powerful framework for quantifying uncertainty in machine learning models. Standard CP treats classes as flat and unstructured, ignoring relationships such as semantic or hierarchical structure among class labels. This paper presents hierarchical conformal classification (HCC), an extension of CP that incorporates class hierarchies into both the structure and semantics of prediction sets.
arXiv Detail & Related papers (2025-08-18T18:05:55Z) - One Sample is Enough to Make Conformal Prediction Robust [53.78604391939934]
We show that conformal prediction attains some robustness even with a forward pass on a single randomly perturbed input. Our approach returns robust sets with smaller average set size compared to SOTA methods which use many (e.g. around 100) passes per input.
arXiv Detail & Related papers (2025-06-19T19:14:25Z) - Project-Probe-Aggregate: Efficient Fine-Tuning for Group Robustness [61.45587642780908]
We propose a three-step approach for parameter-efficient fine-tuning of image-text foundation models. Our method improves two key components: minority-sample identification and the robust training algorithm. Our theoretical analysis shows that our PPA enhances minority group identification and is Bayes optimal for minimizing the balanced group error.
arXiv Detail & Related papers (2025-03-12T15:46:12Z) - Probably Approximately Precision and Recall Learning [60.00180898830079]
A key challenge in machine learning is the prevalence of one-sided feedback. We introduce a Probably Approximately Correct (PAC) framework in which hypotheses are set functions that map each input to a set of labels. We develop new algorithms that learn from positive data alone, achieving optimal sample complexity in the realizable case.
arXiv Detail & Related papers (2024-11-20T04:21:07Z) - On Temperature Scaling and Conformal Prediction of Deep Classifiers [9.975341265604577]
Conformal Prediction (CP) produces a prediction set of candidate labels that contains the true label with a user-specified probability. We show that while Temperature Scaling (TS) calibration improves the class-conditional coverage of adaptive CP methods, surprisingly, it negatively affects their prediction set sizes. We propose guidelines for practitioners to effectively combine adaptive CP with calibration, aligned with user-defined goals.
arXiv Detail & Related papers (2024-02-08T16:45:12Z) - RR-CP: Reliable-Region-Based Conformal Prediction for Trustworthy Medical Image Classification [24.52922162675259]
Conformal prediction (CP) generates a set of predictions for a given test sample.
The size of the set indicates how certain the predictions are.
We propose a new method called Reliable-Region-Based Conformal Prediction (RR-CP).
arXiv Detail & Related papers (2023-09-09T11:14:04Z) - Class-Conditional Conformal Prediction with Many Classes [60.8189977620604]
We propose a method called clustered conformal prediction that clusters together classes having "similar" conformal scores.
We find that clustered conformal prediction typically outperforms existing methods in terms of class-conditional coverage and set size metrics.
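The clustered-conformal idea summarized above — grouping classes whose conformal scores look alike and calibrating one quantile per cluster instead of per class — can be sketched as follows. This is a hedged illustration, not the paper's algorithm: a simple 1-D k-means on each class's median calibration score stands in for the paper's clustering step, and `n_clusters` is an assumed parameter.

```python
import numpy as np

def clustered_conformal_quantiles(scores, labels, n_classes,
                                  n_clusters=2, alpha=0.1, n_iter=20):
    """Cluster classes by their nonconformity-score distributions, then
    compute one finite-sample-corrected conformal quantile per cluster.

    scores: nonconformity score of the true label per calibration point
    labels: true class of each calibration point
    Returns (assign, qhat): cluster index per class, quantile per cluster.
    """
    # summary statistic per class: median calibration score
    med = np.array([np.median(scores[labels == k]) for k in range(n_classes)])
    # simple 1-D k-means (stand-in for the paper's clustering of score quantiles)
    centers = np.quantile(med, np.linspace(0.0, 1.0, n_clusters))
    for _ in range(n_iter):
        assign = np.argmin(np.abs(med[:, None] - centers[None, :]), axis=1)
        for c in range(n_clusters):
            if np.any(assign == c):
                centers[c] = med[assign == c].mean()
    # one conformal quantile per cluster, pooling its classes' scores
    qhat = np.zeros(n_clusters)
    for c in range(n_clusters):
        s = scores[np.isin(labels, np.where(assign == c)[0])]
        n = len(s)
        level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
        qhat[c] = np.quantile(s, level, method="higher")
    return assign, qhat
```

Pooling scores within a cluster gives each quantile far more calibration data than strict class-conditional calibration would, which is the efficiency gain the summary refers to when the number of classes is large.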
arXiv Detail & Related papers (2023-06-15T17:59:02Z) - A Cross-Conformal Predictor for Multi-label Classification [0.0]
In multi-label learning each instance is associated with multiple classes simultaneously.
This work examines the application of a recently developed framework called Conformal Prediction to the multi-label learning setting.
arXiv Detail & Related papers (2022-11-29T14:21:49Z) - Efficient and Differentiable Conformal Prediction with General Function Classes [96.74055810115456]
We propose a generalization of conformal prediction to multiple learnable parameters.
We show that it achieves approximate valid population coverage and near-optimal efficiency within the function class.
Experiments show that our algorithm is able to learn valid prediction sets and improve the efficiency significantly.
arXiv Detail & Related papers (2022-02-22T18:37:23Z) - Selective Classification via One-Sided Prediction [54.05407231648068]
A one-sided prediction (OSP) based relaxation yields a selective classification (SC) scheme that attains near-optimal coverage in the practically relevant high target accuracy regime. We theoretically derive generalization bounds for SC and OSP, and empirically show that our scheme strongly outperforms state-of-the-art methods in coverage at small error levels.
arXiv Detail & Related papers (2020-10-15T16:14:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.