Adaptive Conformal Regression with Jackknife+ Rescaled Scores
- URL: http://arxiv.org/abs/2305.19901v1
- Date: Wed, 31 May 2023 14:32:26 GMT
- Title: Adaptive Conformal Regression with Jackknife+ Rescaled Scores
- Authors: Nicolas Deutschmann, Mattia Rigotti, Maria Rodriguez Martinez
- Abstract summary: Conformal regression provides prediction intervals with global coverage guarantees, but often fails to capture local error distributions.
We address this with a new adaptive method based on rescaling conformal scores with an estimate of the local score distribution.
Our approach ensures formal global coverage guarantees and is supported by new theoretical results on local coverage, including an a posteriori bound on any calibration score.
- Score: 7.176758110912026
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Conformal regression provides prediction intervals with global coverage
guarantees, but often fails to capture local error distributions, leading to
non-homogeneous coverage. We address this with a new adaptive method based on
rescaling conformal scores with an estimate of the local score distribution,
inspired by the Jackknife+ method, which enables the use of calibration data in
conformal scores without breaking calibration-test exchangeability. Our
approach ensures formal global coverage guarantees and is supported by new
theoretical results on local coverage, including an a posteriori bound on any
calibration score. The strength of our approach lies in achieving local
coverage without sacrificing calibration set size, improving the applicability
of conformal prediction intervals in various settings. As a result, our method
provides prediction intervals that outperform previous methods, particularly in
the low-data regime, making it especially relevant for real-world applications
in healthcare and biomedical domains, where uncertainty must be quantified
accurately despite limited sample sizes.
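The method as described in the abstract lends itself to a short illustration. Below is a minimal sketch of split conformal regression with locally rescaled scores, assuming a leave-one-out k-nearest-neighbour average of calibration scores as the local score-distribution estimate; the function name, the k-NN estimator, and the scikit-learn dependency are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch: split conformal regression with Jackknife+-style
# locally rescaled scores. The k-NN scale estimator and all names are
# illustrative assumptions; `model` is any fitted sklearn-like regressor.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def rescaled_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1, k=10):
    # Conformal scores: absolute residuals on the calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))
    n = len(scores)

    # Leave-one-out local scale: for each calibration point, average the
    # scores of its k nearest *other* calibration points, so a point's own
    # score never enters its rescaling factor (this is what preserves
    # calibration-test exchangeability).
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X_cal)
    _, idx = nn.kneighbors(X_cal)            # column 0 is the point itself
    scale_cal = scores[idx[:, 1:]].mean(axis=1)

    # Conformal quantile of the rescaled scores, with the standard
    # finite-sample (n + 1) correction; conservative by construction.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores / scale_cal, level, method="higher")

    # Local scale at each test point from its k nearest calibration points.
    _, idx_test = nn.kneighbors(X_test, n_neighbors=k)
    scale_test = scores[idx_test].mean(axis=1)

    pred = model.predict(X_test)
    half_width = q * scale_test
    return pred - half_width, pred + half_width
```

The leave-one-out indexing is the Jackknife+-flavoured step: every calibration score still contributes to both the quantile and the scale estimates, so local adaptivity comes at no cost in calibration set size, which is the property the abstract emphasises for the low-data regime.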
Related papers
- Sparse Activations as Conformal Predictors [19.298282860984116]
We find a novel connection between conformal prediction and sparse softmax-like transformations.
We introduce new non-conformity scores for classification that make the calibration process correspond to the widely used temperature scaling method.
We show that the proposed method achieves competitive results in terms of coverage, efficiency, and adaptiveness.
arXiv Detail & Related papers (2025-02-20T17:53:41Z)
- Noise-Adaptive Conformal Classification with Marginal Coverage [53.74125453366155]
We introduce an adaptive conformal inference method capable of efficiently handling deviations from exchangeability caused by random label noise.
We validate our method through extensive numerical experiments demonstrating its effectiveness on synthetic and real data sets.
arXiv Detail & Related papers (2025-01-29T23:55:23Z)
- Adjusting Regression Models for Conditional Uncertainty Calibration [46.69079637538012]
We propose a novel algorithm to train a regression function to improve the conditional coverage after applying the split conformal prediction procedure.
We establish an upper bound for the miscoverage gap between the conditional coverage and the nominal coverage rate and propose an end-to-end algorithm to control this upper bound.
arXiv Detail & Related papers (2024-09-26T01:55:45Z)
- Conformal Thresholded Intervals for Efficient Regression [9.559062601251464]
Conformal Thresholded Intervals (CTI) is a novel conformal regression method that aims to produce the smallest possible prediction set with guaranteed coverage.
CTI constructs prediction sets by thresholding the estimated conditional interquantile intervals based on their length.
CTI achieves superior performance compared to state-of-the-art conformal regression methods across various datasets.
arXiv Detail & Related papers (2024-07-19T17:47:08Z)
- Domain-adaptive and Subgroup-specific Cascaded Temperature Regression for Out-of-distribution Calibration [16.930766717110053]
We propose a novel meta-set-based cascaded temperature regression method for post-hoc calibration.
We partition each meta-set into subgroups based on predicted category and confidence level, capturing diverse uncertainties.
A regression network is then trained to derive category-specific and confidence-level-specific scaling, achieving calibration across meta-sets.
arXiv Detail & Related papers (2024-02-14T14:35:57Z)
- Calibration by Distribution Matching: Trainable Kernel Calibration Metrics [56.629245030893685]
We introduce kernel-based calibration metrics that unify and generalize popular forms of calibration for both classification and regression.
These metrics admit differentiable sample estimates, making it easy to incorporate a calibration objective into empirical risk minimization.
We provide intuitive mechanisms to tailor calibration metrics to a decision task, and enforce accurate loss estimation and no-regret decisions.
arXiv Detail & Related papers (2023-10-31T06:19:40Z)
- Parametric and Multivariate Uncertainty Calibration for Regression and Object Detection [4.630093015127541]
We show that common detection models overestimate the spatial uncertainty in comparison to the observed error.
Our experiments show that the simple Isotonic Regression recalibration method is sufficient to achieve well-calibrated uncertainty.
In contrast, if normal distributions are required for subsequent processes, our GP-Normal recalibration method yields the best results.
arXiv Detail & Related papers (2022-07-04T08:00:20Z)
- Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning [78.83598532168256]
Marginal-likelihood based model-selection is rarely used in deep learning due to estimation difficulties.
Our work shows that marginal likelihoods can improve generalization and be useful when validation data is unavailable.
arXiv Detail & Related papers (2021-04-11T09:50:24Z)
- Localized Calibration: Metrics and Recalibration [133.07044916594361]
We propose a fine-grained calibration metric that spans the gap between fully global and fully individualized calibration.
We then introduce a localized recalibration method, LoRe, that improves the LCE more effectively than existing recalibration methods.
arXiv Detail & Related papers (2021-02-22T07:22:12Z)
- Privacy Preserving Recalibration under Domain Shift [119.21243107946555]
We introduce a framework that abstracts out the properties of recalibration problems under differential privacy constraints.
We also design a novel recalibration algorithm, accuracy temperature scaling, that outperforms prior work on private datasets.
arXiv Detail & Related papers (2020-08-21T18:43:37Z)
- Calibration of Neural Networks using Splines [51.42640515410253]
Measuring calibration error amounts to comparing two empirical distributions.
We introduce a binning-free calibration measure inspired by the classical Kolmogorov-Smirnov (KS) statistical test.
Our method consistently outperforms existing methods on KS error as well as other commonly used calibration measures.
arXiv Detail & Related papers (2020-06-23T07:18:05Z)