Approximation to Object Conditional Validity with Conformal Predictors
- URL: http://arxiv.org/abs/2102.07436v1
- Date: Mon, 15 Feb 2021 10:14:44 GMT
- Title: Approximation to Object Conditional Validity with Conformal Predictors
- Authors: Anthony Bellotti
- Abstract summary: Conformal predictors are machine learning algorithms that output prediction intervals that have a guarantee of marginal validity for finite samples.
It has been shown that such conditional validity is impossible to guarantee for non-trivial prediction problems for finite samples.
In this article, instead of trying to achieve a strong conditional validity result, the weaker goal of achieving an approximation to conditional validity is considered.
- Score: 0.2538209532048866
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Conformal predictors are machine learning algorithms that output prediction
intervals that have a guarantee of marginal validity for finite samples with
minimal distributional assumptions. This is a property that makes conformal
predictors useful for machine learning tasks where we require reliable
predictions. It would also be desirable to achieve conditional validity in the
same setting, in the sense that the prediction intervals remain valid
regardless of conditioning on any property of the object of the
prediction. Unfortunately, it has been shown that such conditional validity is
impossible to guarantee for non-trivial prediction problems for finite samples.
In this article, instead of trying to achieve a strong conditional validity
result, the weaker goal of achieving an approximation to conditional validity
is considered. A new algorithm is introduced to do this by iteratively
adjusting a conformity measure to deviations from object conditional validity
measured in the training data. Along with some theoretical results,
experimental results are provided for three data sets that demonstrate (1) in
real world machine learning tasks, lack of conditional validity is a measurable
problem and (2) that the proposed algorithm is effective at alleviating this
problem.
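The abstract's point (1) — that marginal coverage can hold while object-conditional coverage drifts — can be reproduced with a minimal split-conformal sketch. This is not the paper's algorithm; the identity model, the heteroscedastic noise form, and all constants below are assumptions chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heteroscedastic data: noise grows with x, so a constant-width
# conformal interval over-covers small x and under-covers large x.
n = 4000
x = rng.uniform(0.0, 1.0, n)
y = x + rng.normal(0.0, 0.1 + 0.9 * x, n)

# Split conformal: the "trained" predictor is simply y_hat = x here;
# the second half of the data calibrates the interval width.
alpha = 0.1
cal = slice(n // 2, None)
scores = np.abs(y[cal] - x[cal])                  # residual nonconformity
k = int(np.ceil((1 - alpha) * (scores.size + 1))) # conformal quantile rank
q = np.sort(scores)[k - 1]

# Fresh test points: marginal coverage holds, conditional does not.
x_t = rng.uniform(0.0, 1.0, 4000)
y_t = x_t + rng.normal(0.0, 0.1 + 0.9 * x_t, 4000)
covered = np.abs(y_t - x_t) <= q
print(round(covered.mean(), 3))              # marginal coverage ~ 0.90
print(round(covered[x_t < 0.25].mean(), 3))  # low-noise slice over-covers
print(round(covered[x_t > 0.75].mean(), 3))  # high-noise slice under-covers
```

The marginal guarantee is exact in expectation, yet the coverage gap between the two slices is large — exactly the kind of measurable conditional-validity deficit the paper's iterative conformity-measure adjustment targets.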
Related papers
- Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering [55.15192437680943]
Generative models lack rigorous statistical guarantees for their outputs.
We propose a sequential conformal prediction method producing prediction sets that satisfy a rigorous statistical guarantee.
This guarantee states that with high probability, the prediction sets contain at least one admissible (or valid) example.
arXiv Detail & Related papers (2024-10-02T15:26:52Z)
- Beyond Conformal Predictors: Adaptive Conformal Inference with Confidence Predictors [0.0]

Conformal prediction requires exchangeable data to ensure valid prediction sets at a user-specified significance level.
Adaptive conformal inference (ACI) was introduced to address this limitation.
We show that ACI does not require the use of conformal predictors; instead, it can be implemented with the more general confidence predictors.
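The ACI scheme mentioned above is usually stated as the online update α(t+1) = α(t) + γ·(α − err(t)), where err(t) flags a miscovered point. A minimal self-contained sketch under assumed settings (synthetic exchangeable scores, step size γ = 0.01, empirical-quantile thresholds — none of these come from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha_target = 0.1
gamma = 0.01            # ACI step size (assumed)
alpha_t = alpha_target  # running significance level

errs = []
history = []            # past nonconformity scores
for t in range(5000):
    s = abs(rng.normal())            # today's nonconformity score
    if len(history) >= 50:           # warm-up before thresholding
        a = min(max(alpha_t, 1e-3), 1 - 1e-3)  # clip into (0, 1)
        thresh = np.quantile(history, 1 - a)
        err = float(s > thresh)      # 1 if the point is miscovered
        errs.append(err)
        alpha_t += gamma * (alpha_target - err)  # ACI update
    history.append(s)

print(round(float(np.mean(errs)), 3))  # long-run miscoverage near 0.10
```

The telescoping argument behind ACI makes the long-run error rate track the target level regardless of the score distribution, which is what lets the method run on top of general confidence predictors.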
arXiv Detail & Related papers (2024-09-23T21:02:33Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Length Optimization in Conformal Prediction [22.733758606168873]
We develop Conformal Prediction with Length-Optimization (CPL) as a principled framework for conformal prediction.
CPL constructs prediction sets with (near-) optimal length while ensuring conditional validity.
Our empirical evaluations demonstrate the superior prediction set size performance of CPL compared to state-of-the-art methods.
arXiv Detail & Related papers (2024-06-27T01:08:04Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
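The general idea of feeding a per-example difficulty signal into the nonconformity score can be sketched via score normalization. This is only an analogy to the paper's approach: the oracle `sigma_hat` below stands in for whatever auxiliary signal (such as a self-supervised error) a real system would estimate, and the data-generating process is assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Same heteroscedastic toy setup: noise scale grows with x.
n = 4000
x = rng.uniform(0.0, 1.0, n)
y = x + rng.normal(0.0, 0.1 + 0.9 * x, n)
sigma_hat = 0.1 + 0.9 * x   # assumed oracle difficulty estimate

# Normalized nonconformity: residual divided by estimated difficulty.
alpha = 0.1
cal = slice(n // 2, None)
scores = np.abs(y[cal] - x[cal]) / sigma_hat[cal]
k = int(np.ceil((1 - alpha) * (scores.size + 1)))
q = np.sort(scores)[k - 1]

# Interval half-width now scales with difficulty, so per-slice
# coverage is close to the nominal level everywhere.
x_t = rng.uniform(0.0, 1.0, 4000)
y_t = x_t + rng.normal(0.0, 0.1 + 0.9 * x_t, 4000)
half_width = q * (0.1 + 0.9 * x_t)
covered = np.abs(y_t - x_t) <= half_width
print(round(covered.mean(), 3))              # ~ 0.90 marginally
print(round(covered[x_t < 0.25].mean(), 3))  # ~ 0.90 in the easy slice
print(round(covered[x_t > 0.75].mean(), 3))  # ~ 0.90 in the hard slice
```

With a good difficulty estimate the intervals adapt their width instead of their coverage, which is the efficiency benefit the paper measures.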
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
- Predictive Inference with Feature Conformal Prediction [80.77443423828315]
We propose feature conformal prediction, which extends the scope of conformal prediction to semantic feature spaces.
From a theoretical perspective, we demonstrate that feature conformal prediction provably outperforms regular conformal prediction under mild assumptions.
Our approach could be combined with not only vanilla conformal prediction, but also other adaptive conformal prediction methods.
arXiv Detail & Related papers (2022-10-01T02:57:37Z)
- Practical Adversarial Multivalid Conformal Prediction [27.179891682629183]
We give a generic conformal prediction method for sequential prediction.
It achieves target empirical coverage guarantees against adversarially chosen data.
It is computationally lightweight -- comparable to split conformal prediction.
arXiv Detail & Related papers (2022-06-02T14:33:00Z)
- Efficient and Differentiable Conformal Prediction with General Function Classes [96.74055810115456]
We propose a generalization of conformal prediction to multiple learnable parameters.
We show that it achieves approximate valid population coverage and near-optimal efficiency within class.
Experiments show that our algorithm is able to learn valid prediction sets and improve the efficiency significantly.
arXiv Detail & Related papers (2022-02-22T18:37:23Z)
- Conformal prediction for the design problem [72.14982816083297]
In many real-world deployments of machine learning, we use a prediction algorithm to choose what data to test next.
In such settings, there is a distinct type of distribution shift between the training and test data.
We introduce a method to quantify predictive uncertainty in such settings.
arXiv Detail & Related papers (2022-02-08T02:59:12Z)
- Optimized conformal classification using gradient descent approximation [0.2538209532048866]
Conformal predictors allow predictions to be made with a user-defined confidence level.
We consider an approach to train the conformal predictor directly with maximum predictive efficiency.
We test the method on several real world data sets and find that the method is promising.
arXiv Detail & Related papers (2021-05-24T13:14:41Z)
- Robust Validation: Confident Predictions Even When Distributions Shift [19.327409270934474]
We describe procedures for robust predictive inference, where a model provides uncertainty estimates on its predictions rather than point predictions.
We present a method that produces prediction sets (almost exactly) giving the right coverage level for any test distribution in an $f$-divergence ball around the training population.
An essential component of our methodology is to estimate the amount of expected future data shift and build robustness to it.
arXiv Detail & Related papers (2020-08-10T17:09:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.