Approximate Conditional Coverage via Neural Model Approximations
- URL: http://arxiv.org/abs/2205.14310v1
- Date: Sat, 28 May 2022 02:59:05 GMT
- Title: Approximate Conditional Coverage via Neural Model Approximations
- Authors: Allen Schmaltz and Danielle Rasooly
- Abstract summary: We analyze a data-driven procedure for obtaining empirically reliable approximate conditional coverage.
We demonstrate the potential for substantial (and otherwise unknowable) under-coverage with split-conformal alternatives that offer only marginal coverage guarantees.
- Score: 0.030458514384586396
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Constructing reliable prediction sets is an obstacle for applications of
neural models: Distribution-free conditional coverage is theoretically
impossible, and the exchangeability assumption underpinning the coverage
guarantees of standard split-conformal approaches is violated on domain shifts.
Given these challenges, we propose and analyze a data-driven procedure for
obtaining empirically reliable approximate conditional coverage, calculating
unique quantile thresholds for each label for each test point. We achieve this
via the strong signals for prediction reliability from KNN-based model
approximations over the training set and approximations over constrained
samples from the held-out calibration set. We demonstrate the potential for
substantial (and otherwise unknowable) under-coverage with split-conformal
alternatives that offer only marginal coverage guarantees when these distances
and constraints are not taken into account. We evaluate on protein secondary
structure prediction, grammatical error detection, sentiment classification,
and fact verification, covering supervised sequence labeling, zero-shot
sequence labeling (i.e., feature detection), document classification (with
sparsity/interpretability constraints), and retrieval-classification,
including class-imbalanced and domain-shifted settings.
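For context, the marginal-coverage baseline the abstract contrasts against, standard split-conformal prediction, computes a single quantile threshold from held-out calibration scores and applies it to every test point. A minimal sketch in Python (an illustrative baseline, not the authors' KNN-based per-label procedure; the `1 - prob` nonconformity score is an assumed common choice):

```python
import numpy as np

def split_conformal_threshold(cal_scores, alpha=0.1):
    """Finite-sample-corrected (1 - alpha) quantile of calibration
    nonconformity scores (higher score = less conforming)."""
    n = len(cal_scores)
    # The (n + 1) ceiling correction yields the marginal guarantee
    # P(y in prediction set) >= 1 - alpha under exchangeability.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(cal_scores, min(q_level, 1.0), method="higher")

def prediction_set(test_probs, threshold):
    """Include every label whose nonconformity score (1 - prob)
    falls at or below the calibration threshold."""
    return [k for k, p in enumerate(test_probs) if 1.0 - p <= threshold]

rng = np.random.default_rng(0)
cal_scores = rng.uniform(size=1000)  # stand-in nonconformity scores
tau = split_conformal_threshold(cal_scores, alpha=0.1)
print(prediction_set([0.7, 0.2, 0.1], tau))
```

Because the guarantee is marginal (averaged over the population), coverage can silently fail for particular labels or shifted subpopulations, which is the under-coverage the paper measures.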
Related papers
- Conditionally valid Probabilistic Conformal Prediction [57.80927226809277]
We develop a new method for creating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
We demonstrate the effectiveness of our approach through extensive simulations, showing that it outperforms existing methods in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Domain-adaptive and Subgroup-specific Cascaded Temperature Regression for Out-of-distribution Calibration [16.930766717110053]
We propose a novel meta-set-based cascaded temperature regression method for post-hoc calibration.
We partition each meta-set into subgroups based on predicted category and confidence level, capturing diverse uncertainties.
A regression network is then trained to derive category-specific and confidence-level-specific scaling, achieving calibration across meta-sets.
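The base operation that the cascaded regression above builds on is temperature scaling: rescaling logits by a scalar before the softmax. A minimal sketch of plain temperature scaling (not the paper's subgroup-specific variant):

```python
import numpy as np

def softmax(z):
    z = z - np.max(z, axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def temperature_scale(logits, T):
    """Divide logits by temperature T > 0 before the softmax.
    T > 1 softens (reduces confidence); T < 1 sharpens it.
    The argmax prediction is unchanged for any T > 0."""
    return softmax(np.asarray(logits) / T)

p_raw = temperature_scale([2.0, 1.0, 0.0], T=1.0)
p_soft = temperature_scale([2.0, 1.0, 0.0], T=2.0)
```

In post-hoc calibration, T is typically fit on a held-out set by minimizing negative log-likelihood; the cascaded method above instead learns category- and confidence-specific scalings.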
arXiv Detail & Related papers (2024-02-14T14:35:57Z)
- When Does Confidence-Based Cascade Deferral Suffice? [69.28314307469381]
Cascades are a classical strategy to enable inference cost to vary adaptively across samples.
A deferral rule determines whether to invoke the next classifier in the sequence, or to terminate prediction.
Despite being oblivious to the structure of the cascade, confidence-based deferral often works remarkably well in practice.
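The confidence-based deferral rule described above amounts to a simple threshold check. A minimal sketch, assuming a two-model cascade with max-softmax confidence (the threshold and model interfaces are illustrative):

```python
import numpy as np

def cascade_predict(x, small_model, large_model, threshold=0.8):
    """Run the cheap model first; invoke the expensive model only
    when the cheap model's max softmax probability is below threshold."""
    probs = small_model(x)
    if np.max(probs) >= threshold:
        return int(np.argmax(probs)), "small"
    probs = large_model(x)
    return int(np.argmax(probs)), "large"
```

The rule never inspects the downstream model, which is what the paper means by deferral being "oblivious to the structure of the cascade".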
arXiv Detail & Related papers (2023-07-06T04:13:57Z)
- Conformal Prediction with Missing Values [19.18178194789968]
We first show that the marginal coverage guarantee of conformal prediction holds on imputed data for any missingness distribution.
We then show that a universally consistent quantile regression algorithm trained on the imputed data is Bayes optimal for the pinball risk.
arXiv Detail & Related papers (2023-06-05T09:28:03Z)
- Practical Adversarial Multivalid Conformal Prediction [27.179891682629183]
We give a generic conformal prediction method for sequential prediction.
It achieves target empirical coverage guarantees against adversarially chosen data.
It is computationally lightweight -- comparable to split conformal prediction.
arXiv Detail & Related papers (2022-06-02T14:33:00Z)
- Predictive Inference with Weak Supervision [3.1925030748447747]
We bridge the gap between partial supervision and validation by developing a conformal prediction framework.
We introduce a new notion of coverage and predictive validity, then develop several application scenarios.
We corroborate the hypothesis that the new coverage definition allows for tighter and more informative (but valid) confidence sets.
arXiv Detail & Related papers (2022-01-20T17:26:52Z)
- Training on Test Data with Bayesian Adaptation for Covariate Shift [96.3250517412545]
Deep neural networks often make inaccurate predictions with unreliable uncertainty estimates.
We derive a Bayesian model that provides for a well-defined relationship between unlabeled inputs under distributional shift and model parameters.
We show that our method improves both accuracy and uncertainty estimation.
arXiv Detail & Related papers (2021-09-27T01:09:08Z)
- Distribution-free uncertainty quantification for classification under label shift [105.27463615756733]
We focus on uncertainty quantification (UQ) for classification problems via two avenues.
We first argue that label shift hurts UQ, by showing degradation in coverage and calibration.
We examine these techniques theoretically in a distribution-free framework and demonstrate their excellent practical performance.
arXiv Detail & Related papers (2021-03-04T20:51:03Z)
- Estimation and Applications of Quantiles in Deep Binary Classification [0.0]
Quantile regression, based on check loss, is a widely used inferential paradigm in Statistics.
We consider the analogue of check loss in the binary classification setting.
We develop individualized confidence scores that can be used to decide whether a prediction is reliable.
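The check (pinball) loss underlying quantile regression, referenced in the entry above, can be written in a few lines. A minimal sketch, assuming NumPy arrays of targets and predictions:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Check loss for quantile level tau in (0, 1): penalizes
    under-prediction with weight tau and over-prediction with
    weight 1 - tau. Minimizing its expectation yields the tau-th
    conditional quantile of y given x."""
    diff = np.asarray(y_true) - np.asarray(y_pred)
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))
```

At tau = 0.5 this reduces to half the mean absolute error, recovering the conditional median.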
arXiv Detail & Related papers (2021-02-09T07:07:42Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Distribution-free binary classification: prediction sets, confidence intervals and calibration [106.50279469344937]
We study three notions of uncertainty quantification -- calibration, confidence intervals and prediction sets -- for binary classification in the distribution-free setting.
We derive confidence intervals for binned probabilities for both fixed-width and uniform-mass binning.
As a consequence of our 'tripod' theorems, these confidence intervals for binned probabilities lead to distribution-free calibration.
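Uniform-mass (equal-frequency) binning, one of the two binning schemes mentioned above, can be sketched as histogram-binning calibration: split scores into equal-mass bins by empirical quantiles, then map each bin to its observed label frequency. This is an illustrative construction, not the paper's exact estimator or its confidence intervals:

```python
import numpy as np

def uniform_mass_binning(scores, labels, n_bins=10):
    """Fit a histogram-binning calibrator with equal-mass bins.
    Returns a function mapping new scores to calibrated probabilities."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=float)
    # Bin edges at empirical quantiles -> roughly equal counts per bin.
    edges = np.quantile(scores, np.linspace(0, 1, n_bins + 1))
    edges[0], edges[-1] = -np.inf, np.inf
    bin_ids = np.searchsorted(edges, scores, side="right") - 1
    # Each bin's calibrated probability is its mean observed label.
    bin_means = np.array([
        labels[bin_ids == b].mean() if np.any(bin_ids == b) else 0.5
        for b in range(n_bins)
    ])

    def calibrate(new_scores):
        ids = np.searchsorted(edges, np.asarray(new_scores), side="right") - 1
        return bin_means[np.clip(ids, 0, n_bins - 1)]

    return calibrate
```

Equal-mass bins keep per-bin sample counts balanced, which is what makes the distribution-free guarantees for binned probabilities tractable compared with fixed-width bins.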
arXiv Detail & Related papers (2020-06-18T14:17:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the listed information and is not responsible for any consequences of its use.