Practical Adversarial Multivalid Conformal Prediction
- URL: http://arxiv.org/abs/2206.01067v1
- Date: Thu, 2 Jun 2022 14:33:00 GMT
- Title: Practical Adversarial Multivalid Conformal Prediction
- Authors: Osbert Bastani, Varun Gupta, Christopher Jung, Georgy Noarov, Ramya
Ramalingam, Aaron Roth
- Abstract summary: We give a generic conformal prediction method for sequential prediction.
It achieves target empirical coverage guarantees against adversarially chosen data.
It is computationally lightweight -- comparable to split conformal prediction.
- Score: 27.179891682629183
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We give a simple, generic conformal prediction method for sequential
prediction that achieves target empirical coverage guarantees against
adversarially chosen data. It is computationally lightweight -- comparable to
split conformal prediction -- but does not require having a held-out validation
set, and so all data can be used for training models from which to derive a
conformal score. It gives stronger than marginal coverage guarantees in two
ways. First, it gives threshold calibrated prediction sets that have correct
empirical coverage even conditional on the threshold used to form the
prediction set from the conformal score. Second, the user can specify an
arbitrary collection of subsets of the feature space -- possibly intersecting
-- and the coverage guarantees also hold conditional on membership in each of
these subsets. We call our algorithm MVP, short for MultiValid Prediction. We
give both theory and an extensive set of empirical evaluations.
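To make the sequential setting concrete, below is a minimal Python sketch of online threshold calibration for prediction sets, in the spirit of adaptive conformal methods. It is not the authors' MVP algorithm (MVP additionally calibrates conditional on the threshold and on membership in user-specified subsets); the score function, step size, and toy data stream are illustrative placeholders.
```python
# A minimal sketch (not the paper's MVP algorithm) of online threshold
# calibration for sequential prediction sets: after each round, the
# conformal-score threshold is nudged so that empirical coverage tracks
# the target level 1 - alpha, even on adversarially chosen streams.
# `score_fn`, `eta`, and the toy data stream are illustrative placeholders.
import random


def online_prediction_sets(stream, score_fn, labels, alpha=0.1, eta=0.05, tau0=1.0):
    """Yield (prediction set, threshold, running coverage) for each round.

    stream   -- iterable of (x, y_true) pairs arriving sequentially
    score_fn -- maps (x, candidate label) to a nonconformity score (lower = better fit)
    labels   -- finite label universe used to enumerate candidate sets
    alpha    -- target miscoverage rate (aim for 1 - alpha empirical coverage)
    eta      -- step size of the threshold update
    """
    tau = tau0
    covered = 0
    for t, (x, y_true) in enumerate(stream, start=1):
        # Prediction set: every label whose score clears the current threshold.
        pred_set = {y for y in labels if score_fn(x, y) <= tau}

        # Observe the truth, then nudge the threshold:
        # a miss widens future sets (raises tau), a hit shrinks them.
        miss = y_true not in pred_set
        tau += eta * (miss - alpha)

        covered += not miss
        yield pred_set, tau, covered / t


if __name__ == "__main__":
    random.seed(0)
    labels = list(range(5))
    # Toy stream: a noisy real-valued observation of the true class index.
    stream = [((t % 5) + random.gauss(0.0, 0.7), t % 5) for t in range(2000)]
    score_fn = lambda x, y: abs(x - y)  # distance of observation to candidate label

    for pred_set, tau, cov in online_prediction_sets(stream, score_fn, labels):
        pass
    print(f"final threshold {tau:.3f}, empirical coverage {cov:.3f} (target 0.90)")
```
For bounded scores, a telescoping argument shows that the running coverage of such an update approaches 1 - alpha for arbitrary data sequences; MVP strengthens this style of guarantee so that it also holds conditionally on the threshold used and on membership in user-specified, possibly intersecting subsets of the feature space.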
Related papers
- Provably Reliable Conformal Prediction Sets in the Presence of Data Poisoning [53.42244686183879]
Conformal prediction provides model-agnostic and distribution-free uncertainty quantification.
Yet, conformal prediction is not reliable under poisoning attacks where adversaries manipulate both training and calibration data.
We propose reliable prediction sets (RPS): the first efficient method for constructing conformal prediction sets with provable reliability guarantees under poisoning.
arXiv Detail & Related papers (2024-10-13T15:37:11Z)
- Robust Yet Efficient Conformal Prediction Sets [53.78604391939934]
Conformal prediction (CP) can convert any model's output into prediction sets guaranteed to include the true label.
We derive provably robust sets by bounding the worst-case change in conformity scores.
arXiv Detail & Related papers (2024-07-12T10:59:44Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Distribution-free Conformal Prediction for Ordinal Classification [0.0]
Ordinal classification is common in real applications where the target variable has natural ordering among the class labels.
New conformal prediction methods are developed for constructing contiguous and non-contiguous prediction sets.
arXiv Detail & Related papers (2024-04-25T13:49:59Z)
- PAC Prediction Sets Under Label Shift [52.30074177997787]
Prediction sets capture uncertainty by predicting sets of labels rather than individual labels.
We propose a novel algorithm for constructing prediction sets with PAC guarantees in the label shift setting.
We evaluate our approach on five datasets.
arXiv Detail & Related papers (2023-10-19T17:57:57Z)
- Class-Conditional Conformal Prediction with Many Classes [60.8189977620604]
We propose a method called clustered conformal prediction that clusters together classes having "similar" conformal scores.
We find that clustered conformal prediction typically outperforms existing methods in terms of class-conditional coverage and set size metrics.
arXiv Detail & Related papers (2023-06-15T17:59:02Z)
- Post-selection Inference for Conformal Prediction: Trading off Coverage for Precision [0.0]
Traditionally, conformal prediction inference requires a data-independent specification of miscoverage level.
We develop simultaneous conformal inference to account for data-dependent miscoverage levels.
arXiv Detail & Related papers (2023-04-12T20:56:43Z)
- Approximate Conditional Coverage via Neural Model Approximations [0.030458514384586396]
We analyze a data-driven procedure for obtaining empirically reliable approximate conditional coverage.
We demonstrate the potential for substantial (and otherwise unknowable) under-coverage of split-conformal alternatives that provide only marginal coverage guarantees.
arXiv Detail & Related papers (2022-05-28T02:59:05Z)
- Conformal Prediction Sets with Limited False Positives [43.596058175459746]
We develop a new approach to multi-label conformal prediction in which we aim to output a precise set of promising prediction candidates with a bounded number of incorrect answers.
We demonstrate the effectiveness of this approach across a number of classification tasks in natural language processing, computer vision, and computational chemistry.
arXiv Detail & Related papers (2022-02-15T18:52:33Z)
- Private Prediction Sets [72.75711776601973]
Machine learning systems need reliable uncertainty quantification and protection of individuals' privacy.
We present a framework that treats these two desiderata jointly.
We evaluate the method on large-scale computer vision datasets.
arXiv Detail & Related papers (2021-02-11T18:59:11Z)
- Efficient Conformal Prediction via Cascaded Inference with Expanded Admission [43.596058175459746]
We present a novel approach for conformal prediction (CP).
We aim to identify a set of promising prediction candidates -- in place of a single prediction.
This set is guaranteed to contain a correct answer with high probability.
arXiv Detail & Related papers (2020-07-06T23:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented here (including all listed papers) and is not responsible for any consequences of its use.