Optimized conformal classification using gradient descent approximation
- URL: http://arxiv.org/abs/2105.11255v1
- Date: Mon, 24 May 2021 13:14:41 GMT
- Title: Optimized conformal classification using gradient descent approximation
- Authors: Anthony Bellotti
- Abstract summary: Conformal predictors allow predictions to be made with a user-defined confidence level.
We consider an approach to train the conformal predictor directly with maximum predictive efficiency.
We test the method on several real-world data sets and find it promising.
- Score: 0.2538209532048866
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Conformal predictors are an important class of algorithms that allow
predictions to be made with a user-defined confidence level. They are able to
do this by outputting prediction sets, rather than simple point predictions.
The conformal predictor is valid in the sense that the accuracy of its
predictions is guaranteed to meet the confidence level, only assuming
exchangeability in the data. Since accuracy is guaranteed, the performance of a
conformal predictor is measured through the efficiency of the prediction sets.
Typically, a conformal predictor is built on an underlying machine learning
algorithm and hence its predictive power is inherited from this algorithm.
However, since the underlying machine learning algorithm is not trained with
the objective of maximizing predictive efficiency, the resulting conformal
predictor may be sub-optimal and insufficiently aligned with this objective.
Hence, in this study we consider an approach to train the conformal
predictor directly with maximum predictive efficiency as the optimization
objective, and we focus specifically on the inductive conformal predictor for
classification. To do this, the conformal predictor is approximated by a
differentiable objective function, and gradient descent is used to optimize it. The
resulting parameter estimates are then passed to a proper inductive conformal
predictor to give valid prediction sets. We test the method on several
real-world data sets and find that it is promising, in most cases improving
predictive efficiency over a baseline conformal predictor.
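Below is a minimal, self-contained sketch of this general recipe, not the paper's exact formulation: an inductive conformal classifier, a sigmoid-smoothed surrogate for average prediction-set size, a few gradient-descent steps on that surrogate, and a final pass through a proper ICP so that validity is restored. The linear model, the score 1 - p(y|x), the plug-in threshold, and the finite-difference gradients are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def icp_sets(probs_cal, y_cal, probs_test, alpha=0.1):
    """Proper inductive conformal classifier: prediction sets with
    coverage >= 1 - alpha under exchangeability."""
    n = len(y_cal)
    scores = 1.0 - probs_cal[np.arange(n), y_cal]         # nonconformity of true labels
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)  # finite-sample correction
    q = np.quantile(scores, level, method="higher")
    return (1.0 - probs_test) <= q                        # include label iff score <= q

def soft_size(W, X, q, tau=0.05):
    """Differentiable surrogate for average set size: the hard inclusion
    indicator [1 - p <= q] is smoothed by a sigmoid with temperature tau."""
    probs = softmax(X @ W)
    incl = 1.0 / (1.0 + np.exp(-(q - (1.0 - probs)) / tau))
    return incl.sum(axis=1).mean()

def num_grad(f, W, eps=1e-5):
    """Central finite differences; a stand-in for analytic/autodiff gradients."""
    g = np.zeros_like(W)
    for i in np.ndindex(*W.shape):
        Wp, Wm = W.copy(), W.copy()
        Wp[i] += eps
        Wm[i] -= eps
        g[i] = (f(Wp) - f(Wm)) / (2.0 * eps)
    return g

# Toy 3-class problem; the linear model is trained only to shrink the surrogate.
W_true = rng.normal(size=(5, 3))
X = rng.normal(size=(400, 5))
y = np.argmax(X @ W_true + 0.5 * rng.normal(size=(400, 3)), axis=1)
X_tr, y_tr = X[:200], y[:200]
X_cal, y_cal = X[200:300], y[200:300]
X_te = X[300:]

alpha, lr = 0.1, 0.5
W = 0.1 * rng.normal(size=(5, 3))
for _ in range(50):
    s = 1.0 - softmax(X_tr @ W)[np.arange(len(y_tr)), y_tr]
    # Plug-in threshold from current true-label scores; a full treatment would
    # also account for the threshold's own dependence on the parameters.
    q = np.quantile(s, 1 - alpha)
    W -= lr * num_grad(lambda V: soft_size(V, X_tr, q), W)

# The learned parameters are finally passed to a proper ICP, restoring validity.
sets = icp_sets(softmax(X_cal @ W), y_cal, softmax(X_te @ W), alpha)
print("average prediction-set size:", sets.sum(axis=1).mean())
```

As tau shrinks, the surrogate approaches the true set size but its gradients vanish away from the threshold, so the smoothing temperature trades fidelity against optimizability.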
Related papers
- Provably Reliable Conformal Prediction Sets in the Presence of Data Poisoning [53.42244686183879]
Conformal prediction provides model-agnostic and distribution-free uncertainty quantification.
Yet, conformal prediction is not reliable under poisoning attacks where adversaries manipulate both training and calibration data.
We propose reliable prediction sets (RPS): the first efficient method for constructing conformal prediction sets with provable reliability guarantees under poisoning.
arXiv Detail & Related papers (2024-10-13T15:37:11Z)
- Beyond Conformal Predictors: Adaptive Conformal Inference with Confidence Predictors [0.0]
Conformal prediction requires exchangeable data to ensure valid prediction sets at a user-specified significance level.
Adaptive conformal inference (ACI) was introduced to address this limitation.
We show that ACI does not require the use of conformal predictors; instead, it can be implemented with the more general confidence predictors.
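For context, the core of ACI is a one-line online update of the working miscoverage level (Gibbs and Candès, 2021); the point here is that the sets fed into this loop can come from any confidence predictor, not just a conformal one. A minimal sketch, with `covered` assumed to come from whichever predictor is in use:

```python
def aci_update(alpha_t, target_alpha, covered, gamma=0.005):
    """One ACI step: alpha_{t+1} = alpha_t + gamma * (target_alpha - err_t),
    with err_t = 1 if the last prediction set missed the true label.
    The long-run error rate tracks target_alpha even without exchangeability."""
    err_t = 0.0 if covered else 1.0
    return alpha_t + gamma * (target_alpha - err_t)
```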
arXiv Detail & Related papers (2024-09-23T21:02:33Z)
- Towards Human-AI Complementarity with Prediction Sets [14.071862670474832]
Decision support systems based on prediction sets have proven to be effective at helping human experts solve classification tasks.
We show that the prediction sets constructed using conformal prediction are, in general, suboptimal in terms of average accuracy.
We introduce a greedy algorithm that, for a large class of expert models and nonconformity scores, is guaranteed to find prediction sets that offer equal or greater performance.
arXiv Detail & Related papers (2024-05-27T18:00:00Z)
- Delving into temperature scaling for adaptive conformal prediction [10.340903334800787]
Conformal prediction, as an emerging uncertainty quantification technique, constructs prediction sets that are guaranteed to contain the true label with a pre-defined probability.
We show that current confidence calibration methods (e.g., temperature scaling) normally lead to larger prediction sets in adaptive conformal prediction.
We propose Conformal Temperature Scaling (ConfTS), a variant of temperature scaling that aims to improve the efficiency of adaptive conformal prediction.
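For reference, the mechanism being tuned here is ordinary temperature scaling, sketched below; how ConfTS actually selects T is specific to the paper, so T is left as a free parameter in this sketch.

```python
import numpy as np

def temperature_scaled_probs(logits, T):
    """Temperature scaling: one scalar T rescales logits before the softmax.
    T > 1 flattens the distribution; T < 1 sharpens it. Downstream conformal
    scores (and hence set sizes) change with T even though argmax does not."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```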
arXiv Detail & Related papers (2024-02-06T19:27:48Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
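One concrete way such a feature can enter a nonconformity score, sketched under illustrative assumptions (the summary does not pin down the exact score; the normalized-residual form below is a common choice rather than necessarily the authors'):

```python
import numpy as np

def nonconformity(y, y_hat, aux_err, eps=1e-6):
    """Residual rescaled by a difficulty proxy built from the self-supervised
    (pretext-task) error: examples the auxiliary model finds hard receive
    proportionally wider intervals at test time."""
    return np.abs(y - y_hat) / (eps + aux_err)

# Test-time interval for x: y_hat(x) +/- q * (eps + aux_err(x)),
# where q is the conformal quantile of the calibration scores above.
```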
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
- Predictive Inference with Feature Conformal Prediction [80.77443423828315]
We propose feature conformal prediction, which extends the scope of conformal prediction to semantic feature spaces.
From a theoretical perspective, we demonstrate that feature conformal prediction provably outperforms regular conformal prediction under mild assumptions.
Our approach can be combined not only with vanilla conformal prediction but also with other adaptive conformal prediction methods.
arXiv Detail & Related papers (2022-10-01T02:57:37Z)
- Efficient and Differentiable Conformal Prediction with General Function Classes [96.74055810115456]
We propose a generalization of conformal prediction to multiple learnable parameters.
We show that it achieves approximately valid population coverage and near-optimal efficiency within the given function class.
Experiments show that our algorithm learns valid prediction sets and significantly improves efficiency.
arXiv Detail & Related papers (2022-02-22T18:37:23Z)
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
- Private Prediction Sets [72.75711776601973]
Machine learning systems need reliable uncertainty quantification and protection of individuals' privacy.
We present a framework that treats these two desiderata jointly.
We evaluate the method on large-scale computer vision datasets.
arXiv Detail & Related papers (2021-02-11T18:59:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.