Adaptive Temperature Scaling with Conformal Prediction
- URL: http://arxiv.org/abs/2505.15437v1
- Date: Wed, 21 May 2025 12:18:15 GMT
- Title: Adaptive Temperature Scaling with Conformal Prediction
- Authors: Nikita Kotelevskii, Mohsen Guizani, Eric Moulines, Maxim Panov
- Abstract summary: We propose, to the best of our knowledge, the first method for assigning calibrated probabilities to elements of a conformal prediction set. Our approach frames this as an adaptive calibration problem, selecting an input-specific temperature parameter to match the desired coverage level.
- Score: 47.51764759462074
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Conformal prediction enables the construction of high-coverage prediction sets for any pre-trained model, guaranteeing that the true label lies within the set with a specified probability. However, these sets do not provide probability estimates for individual labels, limiting their practical use. In this paper, we propose, to the best of our knowledge, the first method for assigning calibrated probabilities to elements of a conformal prediction set. Our approach frames this as an adaptive calibration problem, selecting an input-specific temperature parameter to match the desired coverage level. Experiments on several challenging image classification datasets demonstrate that our method maintains coverage guarantees while significantly reducing expected calibration error.
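To make the idea concrete, here is a minimal, self-contained sketch (not the authors' implementation) of a split-conformal prediction set followed by a grid search for an input-specific temperature whose probability mass on the set matches the target coverage level. The nonconformity score (one minus the softmax probability of the true label), the temperature grid, and the helper names `conformal_threshold`, `prediction_set`, and `find_temperature` are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax with a max-shift for numerical stability.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def conformal_threshold(cal_logits, cal_labels, alpha=0.1):
    """Split-conformal quantile of the score 1 - p(true label)."""
    probs = softmax(cal_logits)
    scores = 1.0 - probs[np.arange(len(cal_labels)), cal_labels]
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

def prediction_set(logits, q):
    """All labels whose softmax probability is at least 1 - q."""
    return np.where(softmax(logits) >= 1.0 - q)[0]

def find_temperature(logits, pred_set, target=0.9, grid=np.linspace(0.1, 10.0, 200)):
    """Pick the temperature whose probability mass on the set is closest to the target."""
    gaps = [abs(softmax(logits, T)[pred_set].sum() - target) for T in grid]
    return grid[int(np.argmin(gaps))]

rng = np.random.default_rng(0)
cal_logits = rng.normal(size=(500, 10))   # stand-ins for a classifier's calibration logits
cal_labels = rng.integers(0, 10, size=500)
q = conformal_threshold(cal_logits, cal_labels, alpha=0.1)

test_logits = rng.normal(size=10)
S = prediction_set(test_logits, q)
T = find_temperature(test_logits, S, target=0.9)
print("prediction set:", S, "| input-specific temperature:", round(float(T), 2))
```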
Related papers
- When Can We Reuse a Calibration Set for Multiple Conformal Predictions? [0.0]
We show how e-conformal prediction, in conjunction with Hoeffding's inequality, can enable the repeated use of a single calibration set. We train a deep neural network and utilise a calibration set to estimate a Hoeffding correction. This correction allows us to apply a modified Markov's inequality, leading to the construction of prediction sets with quantifiable confidence.
arXiv Detail & Related papers (2025-06-24T14:57:25Z) - Semi-Supervised Conformal Prediction With Unlabeled Nonconformity Score [19.15617038007535]
Conformal prediction (CP) is a powerful framework for uncertainty quantification. In real-world applications where labeled data is often limited, standard CP can deviate from the target coverage and produce overly large prediction sets. We propose SemiCP, which leverages both labeled and unlabeled data for calibration.
arXiv Detail & Related papers (2025-05-27T12:57:44Z) - Sparse Activations as Conformal Predictors [19.298282860984116]
We find a novel connection between conformal prediction and sparse softmax-like transformations. We introduce new non-conformity scores for classification that make the calibration process correspond to the widely used temperature scaling method. We show that the proposed method achieves competitive results in terms of coverage, efficiency, and adaptiveness.
arXiv Detail & Related papers (2025-02-20T17:53:41Z) - Conformal Prediction Sets with Improved Conditional Coverage using Trust Scores [52.92618442300405]
It is impossible to achieve exact, distribution-free conditional coverage in finite samples. We propose an alternative conformal prediction algorithm that targets coverage where it matters most.
arXiv Detail & Related papers (2025-01-17T12:01:56Z) - Provably Reliable Conformal Prediction Sets in the Presence of Data Poisoning [53.42244686183879]
Conformal prediction provides model-agnostic and distribution-free uncertainty quantification. Yet, conformal prediction is not reliable under poisoning attacks where adversaries manipulate both training and calibration data. We propose reliable prediction sets (RPS): the first efficient method for constructing conformal prediction sets with provable reliability guarantees under poisoning.
arXiv Detail & Related papers (2024-10-13T15:37:11Z) - Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering [55.15192437680943]
Generative models lack rigorous statistical guarantees for their outputs. We propose a sequential conformal prediction method producing prediction sets that satisfy a rigorous statistical guarantee. This guarantee states that with high probability, the prediction sets contain at least one admissible (or valid) example.
arXiv Detail & Related papers (2024-10-02T15:26:52Z) - A conformalized learning of a prediction set with applications to medical imaging classification [14.304858613146536]
We present an algorithm that can produce a prediction set containing the true label with a user-specified probability, such as 90%.
We applied the proposed algorithm to several standard medical imaging classification datasets.
arXiv Detail & Related papers (2024-08-09T12:49:04Z) - Does confidence calibration improve conformal prediction? [10.340903334800787]
We show that current confidence calibration methods lead to larger prediction sets in adaptive conformal prediction. By investigating the role of the temperature value, we observe that high-confidence predictions can enhance the efficiency of adaptive conformal prediction. We propose Conformal Temperature Scaling (ConfTS), a variant of temperature scaling with a novel loss function designed to enhance the efficiency of prediction sets.
arXiv Detail & Related papers (2024-02-06T19:27:48Z) - Test-time Recalibration of Conformal Predictors Under Distribution Shift Based on Unlabeled Examples [30.61588337557343]
Conformal predictors provide uncertainty estimates by computing a set of classes with a user-specified probability.
We propose a method that provides excellent uncertainty estimates under natural distribution shifts.
arXiv Detail & Related papers (2022-10-09T04:46:00Z) - Sample-dependent Adaptive Temperature Scaling for Improved Calibration [95.7477042886242]
A post-hoc approach to compensate for miscalibrated neural network predictions is temperature scaling.
We propose to predict a different temperature value for each input, allowing us to adjust the mismatch between confidence and accuracy.
We test our method on the ResNet50 and WideResNet28-10 architectures using the CIFAR10/100 and Tiny-ImageNet datasets; a minimal sketch of the per-input temperature idea appears after this list.
arXiv Detail & Related papers (2022-07-13T14:13:49Z) - Private Prediction Sets [72.75711776601973]
Machine learning systems need reliable uncertainty quantification and protection of individuals' privacy.
We present a framework that treats these two desiderata jointly.
We evaluate the method on large-scale computer vision datasets.
arXiv Detail & Related papers (2021-02-11T18:59:11Z)
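As a companion to the "Sample-dependent Adaptive Temperature Scaling" entry above, the following is a hedged sketch of the general idea: a small trainable head maps each example's logits to its own positive temperature, fitted on a held-out calibration split while the base classifier stays frozen. The tiny-MLP architecture, the softplus parameterisation, and the `PerSampleTemperature` name are assumptions for illustration, not the paper's code.

```python
import torch
import torch.nn as nn

class PerSampleTemperature(nn.Module):
    """Predicts one positive temperature per input and rescales its logits."""
    def __init__(self, num_classes: int, hidden: int = 32):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(num_classes, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        # Softplus keeps the temperature strictly positive; it broadcasts over classes.
        temperature = nn.functional.softplus(self.head(logits)) + 1e-3  # (batch, 1)
        return logits / temperature

# Fit the head on a held-out calibration split with the usual cross-entropy,
# keeping the base classifier (which produced `logits`) frozen.
scaler = PerSampleTemperature(num_classes=10)
logits = torch.randn(8, 10)                 # stand-in for frozen classifier outputs
labels = torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(scaler(logits), labels)
loss.backward()
```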