Probabilistically robust conformal prediction
- URL: http://arxiv.org/abs/2307.16360v1
- Date: Mon, 31 Jul 2023 01:32:06 GMT
- Title: Probabilistically robust conformal prediction
- Authors: Subhankar Ghosh, Yuanjie Shi, Taha Belkhouja, Yan Yan, Jana Doppa,
Brian Jones
- Abstract summary: Conformal prediction (CP) is a framework to quantify uncertainty of machine learning classifiers including deep neural networks.
Almost all existing work on CP assumes clean testing data, and little is known about the robustness of CP algorithms.
This paper studies the problem of probabilistically robust conformal prediction (PRCP) which ensures robustness to most perturbations.
- Score: 9.401004747930974
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Conformal prediction (CP) is a framework to quantify uncertainty of machine
learning classifiers including deep neural networks. Given a testing example
and a trained classifier, CP produces a prediction set of candidate labels with
a user-specified coverage (i.e., the true class label is contained with high
probability). Almost all existing work on CP assumes clean testing data, and
little is known about the robustness of CP algorithms w.r.t.
natural/adversarial perturbations to testing examples. This paper studies the
problem of probabilistically robust conformal prediction (PRCP) which ensures
robustness to most perturbations around clean input examples. PRCP generalizes
the standard CP (which cannot handle perturbations) and adversarially robust CP
(which ensures robustness w.r.t. worst-case perturbations) to achieve better
trade-offs between nominal performance and robustness. We propose a novel
adaptive PRCP (aPRCP) algorithm to achieve probabilistically robust coverage.
The key idea behind aPRCP is to determine two parallel thresholds, one for data
samples and another for perturbations of the data (aka a
"quantile-of-quantile" design). We provide theoretical analysis to show that
the aPRCP algorithm achieves robust coverage. Our experiments on CIFAR-10,
CIFAR-100, and ImageNet datasets using deep neural networks demonstrate that
aPRCP achieves better trade-offs than state-of-the-art CP and adversarially
robust CP algorithms.
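To make the "quantile-of-quantile" idea concrete, below is a minimal Python sketch of how such a two-level calibration could look: an inner quantile over sampled perturbations of each calibration example, followed by an outer quantile over the calibration set. The `model` interface, the uniform perturbation sampling, and the exact quantile levels are illustrative assumptions rather than the paper's precise aPRCP algorithm.

```python
import numpy as np

def aprcp_style_threshold(model, x_cal, y_cal, alpha=0.1, alpha_tilde=0.1,
                          n_perturb=50, eps=0.03, rng=None):
    """Hedged sketch of a quantile-of-quantile calibration.

    For each calibration example we sample random perturbations inside an
    eps-ball, score each perturbed copy, and take an inner quantile over
    perturbations; the returned threshold is an outer (CP-style) quantile
    of those per-example values.  `model(x)` is assumed to return a vector
    of class probabilities (hypothetical interface).
    """
    rng = np.random.default_rng(rng)
    per_example = []
    for x, y in zip(x_cal, y_cal):
        x = np.asarray(x, dtype=float)
        deltas = rng.uniform(-eps, eps, size=(n_perturb,) + x.shape)
        probs = np.stack([model(np.clip(x + d, 0.0, 1.0)) for d in deltas])
        scores = 1.0 - probs[:, y]          # nonconformity: 1 - p(true class)
        per_example.append(np.quantile(scores, 1.0 - alpha_tilde))
    n = len(per_example)
    level = min(np.ceil((n + 1) * (1.0 - alpha)) / n, 1.0)
    return np.quantile(per_example, level)

def prediction_set(model, x_test, tau, num_classes):
    """Include every label whose nonconformity score is below the threshold."""
    probs = model(np.asarray(x_test, dtype=float))
    return [k for k in range(num_classes) if 1.0 - probs[k] <= tau]
```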
Related papers
- Robust Yet Efficient Conformal Prediction Sets [53.78604391939934]
Conformal prediction (CP) can convert any model's output into prediction sets guaranteed to include the true label with a user-specified probability.
We derive provably robust sets by bounding the worst-case change in conformity scores.
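As a rough illustration of that recipe, the sketch below assumes we already have a bound on how much an admissible perturbation can change any conformity score, and simply keeps every label that could still conform under that worst case; the bound itself (e.g. from a Lipschitz or smoothing argument) is left abstract and is not the paper's specific construction.

```python
import numpy as np

def robust_prediction_set(scores, tau, score_bound):
    """Keep label y whenever its worst-case score could still pass the
    calibrated threshold: s(x, y) - score_bound <= tau.  `scores` holds the
    nonconformity score of each label on the clean input, `tau` is the usual
    conformal threshold, and `score_bound` is an assumed bound on the change
    of any score under an admissible perturbation.
    """
    scores = np.asarray(scores, dtype=float)
    return np.flatnonzero(scores - score_bound <= tau)

# Hypothetical usage with an L-Lipschitz score and eps-bounded perturbations:
# robust_set = robust_prediction_set(scores, tau, score_bound=L * eps)
```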
arXiv Detail & Related papers (2024-07-12T10:59:44Z)
- Verifiably Robust Conformal Prediction [1.391198481393699]
This paper introduces VRCP (Verifiably Robust Conformal Prediction), a new framework that leverages neural network verification methods to recover coverage guarantees under adversarial attacks.
Our method is the first to support perturbations bounded by arbitrary norms including $\ell_1$, $\ell_2$, and $\ell_\infty$, as well as regression tasks.
In every case, VRCP achieves coverage above the nominal level and yields significantly more efficient and informative prediction regions than the state of the art (SotA).
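Schematically, the VRCP idea amounts to replacing the clean conformity score with a verified worst-case score at inference time. The sketch below is only a conceptual outline; `verified_score_upper_bound` is a hypothetical stand-in for a neural-network verifier (e.g. bound propagation), not VRCP's actual interface.

```python
def verification_backed_set(x_test, tau, num_classes,
                            verified_score_upper_bound, eps):
    """Conceptual outline: a label stays in the prediction set only if even
    its verified worst-case nonconformity score over the eps-ball conforms.

    `verified_score_upper_bound(x, y, eps)` is a hypothetical function that
    returns a sound upper bound on s(x + delta, y) over all ||delta|| <= eps.
    """
    return [y for y in range(num_classes)
            if verified_score_upper_bound(x_test, y, eps) <= tau]
```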
arXiv Detail & Related papers (2024-05-29T09:50:43Z)
- Efficient Conformal Prediction under Data Heterogeneity [79.35418041861327]
Conformal Prediction (CP) stands out as a robust framework for uncertainty quantification.
Existing approaches for tackling non-exchangeability lead to methods that are not computable beyond the simplest examples.
This work introduces a new efficient approach to CP that produces provably valid confidence sets for fairly general non-exchangeable data distributions.
arXiv Detail & Related papers (2023-12-25T20:02:51Z)
- PAC Prediction Sets Under Label Shift [52.30074177997787]
Prediction sets capture uncertainty by predicting sets of labels rather than individual labels.
We propose a novel algorithm for constructing prediction sets with PAC guarantees in the label shift setting.
We evaluate our approach on five datasets.
arXiv Detail & Related papers (2023-10-19T17:57:57Z)
- RR-CP: Reliable-Region-Based Conformal Prediction for Trustworthy Medical Image Classification [24.52922162675259]
Conformal prediction (CP) generates a set of predictions for a given test sample.
The size of the set indicates how certain the predictions are.
We propose a new method called Reliable-Region-Based Conformal Prediction (RR-CP).
arXiv Detail & Related papers (2023-09-09T11:14:04Z)
- Improving Uncertainty Quantification of Deep Classifiers via Neighborhood Conformal Prediction: Novel Algorithm and Theoretical Analysis [30.0231328500976]
Conformal prediction (CP) is a principled framework for uncertainty quantification of deep models.
This paper proposes a novel algorithm referred to as Neighborhood Conformal Prediction (NCP) to improve the efficiency of uncertainty quantification.
We show that NCP leads to a significant reduction in prediction set size over prior CP methods.
arXiv Detail & Related papers (2023-03-19T15:56:50Z)
- Probabilistic Conformal Prediction Using Conditional Random Samples [73.26753677005331]
Probabilistic conformal prediction (PCP) is a predictive inference algorithm that estimates a target variable by a discontinuous predictive set.
It is efficient and compatible with either explicit or implicit conditional generative models.
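A minimal sketch of such a sample-based construction, assuming vector-valued targets and a hypothetical `sampler(x, k, rng)` interface to the conditional generative model: the nonconformity score is the distance from the true target to its nearest generated sample, and the prediction set is a union of balls around fresh samples. This follows a reading of the summary above, not the paper's exact procedure.

```python
import numpy as np

def pcp_style_radius(sampler, x_cal, y_cal, alpha=0.1, n_samples=40, rng=None):
    """Calibrate the ball radius as a conformal quantile of nearest-sample
    distances on held-out data.  Targets are assumed to be vectors."""
    rng = np.random.default_rng(rng)
    scores = []
    for x, y in zip(x_cal, y_cal):
        draws = np.asarray(sampler(x, n_samples, rng))      # shape (k, dim)
        scores.append(np.min(np.linalg.norm(draws - np.asarray(y), axis=-1)))
    n = len(scores)
    level = min(np.ceil((n + 1) * (1.0 - alpha)) / n, 1.0)
    return np.quantile(scores, level)

def pcp_style_set(sampler, x_test, radius, n_samples=40, rng=None):
    """The (possibly disconnected) predictive set is the union of balls of the
    calibrated radius around samples drawn for the test input."""
    rng = np.random.default_rng(rng)
    centers = np.asarray(sampler(x_test, n_samples, rng))
    return centers, radius
```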
arXiv Detail & Related papers (2022-06-14T03:58:03Z)
- A Confidence Machine for Sparse High-Order Interaction Model [16.780058676633914]
Conformal prediction (CP) is a promising approach for obtaining the confidence of prediction results with fewer theoretical assumptions.
We develop a full-CP for the sparse high-order interaction model (SHIM), which is sufficiently flexible as it can take into account high-order interactions among variables.
arXiv Detail & Related papers (2022-05-28T03:23:56Z)
- Approximating Full Conformal Prediction at Scale via Influence Functions [30.391742057634264]
Conformal prediction (CP) is a wrapper around traditional machine learning models.
In this paper, we use influence functions to efficiently approximate full CP.
arXiv Detail & Related papers (2022-02-02T22:38:40Z)
- Selective Classification via One-Sided Prediction [54.05407231648068]
A one-sided prediction (OSP) based relaxation yields a selective classification (SC) scheme that attains near-optimal coverage in the practically relevant high target accuracy regime.
We theoretically derive generalization bounds for SC and OSP, and empirically show that our scheme strongly outperforms state-of-the-art methods in coverage at small error levels.
arXiv Detail & Related papers (2020-10-15T16:14:27Z)
- Evaluating probabilistic classifiers: Reliability diagrams and score decompositions revisited [68.8204255655161]
We introduce the CORP approach, which generates provably statistically Consistent, Optimally binned, and Reproducible reliability diagrams in an automated way.
CORP is based on non-parametric isotonic regression and is implemented via the pool-adjacent-violators (PAV) algorithm.
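For intuition, here is a small sketch of an isotonic-regression reliability curve in the spirit of CORP, using scikit-learn's `IsotonicRegression` (which solves the problem with PAV); the exact plotting conventions and any score decompositions from the paper are not reproduced here.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression  # solved via PAV internally

def isotonic_reliability_curve(pred_probs, outcomes):
    """Regress binary outcomes on forecast probabilities with isotonic
    regression, so the 'binning' is chosen by PAV instead of fixed by hand.
    Returns sorted forecasts and the recalibrated event frequencies."""
    x = np.asarray(pred_probs, dtype=float)
    y = np.asarray(outcomes, dtype=float)
    order = np.argsort(x)
    x, y = x[order], y[order]
    iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
    return x, iso.fit_transform(x, y)

# Example usage (synthetic data):
# forecasts = np.random.rand(500)
# labels = (np.random.rand(500) < forecasts).astype(float)
# xs, ys = isotonic_reliability_curve(forecasts, labels)
# A reliability diagram plots ys against xs, with the diagonal as reference.
```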
arXiv Detail & Related papers (2020-08-07T08:22:26Z)