Distribution-Free Model-Agnostic Regression Calibration via
Nonparametric Methods
- URL: http://arxiv.org/abs/2305.12283v2
- Date: Thu, 26 Oct 2023 01:59:37 GMT
- Title: Distribution-Free Model-Agnostic Regression Calibration via
Nonparametric Methods
- Authors: Shang Liu, Zhongze Cai, Xiaocheng Li
- Abstract summary: We consider an individual calibration objective for characterizing the quantiles of the prediction model.
Existing methods have been largely heuristic and lack statistical guarantees in terms of individual calibration.
We propose simple nonparametric calibration methods that are agnostic of the underlying prediction model.
- Score: 9.662269016653296
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we consider the uncertainty quantification problem for
regression models. Specifically, we consider an individual calibration
objective for characterizing the quantiles of the prediction model. While such
an objective is well-motivated from downstream tasks such as newsvendor cost,
the existing methods have been largely heuristic and lack statistical
guarantees in terms of individual calibration. We show via simple examples that
the existing methods focusing on population-level calibration guarantees such
as average calibration or sharpness can lead to harmful and unexpected results.
We propose simple nonparametric calibration methods that are agnostic of the
underlying prediction model and enjoy both computational efficiency and
statistical consistency. Our approach enables a better understanding of the
possibility of individual calibration, and we establish matching upper and
lower bounds for the calibration error of our proposed methods. Technically,
our analysis combines nonparametric techniques with a covering-number
argument from parametric analysis, which advances the existing theoretical
analyses in the literature on nonparametric density estimation and quantile
bandit problems. Importantly, the nonparametric perspective sheds new
theoretical insights into regression calibration in terms of the curse of
dimensionality and reconciles the existing results on the impossibility of
individual calibration. To our knowledge, we make the first effort to reach
both individual calibration and finite-sample guarantees with minimal
assumptions in terms of conformal prediction. Numerical experiments show the
advantage of such a simple approach under various metrics, and also under
covariate shift. We hope our work provides a simple benchmark and a
theoretical starting point for future research on regression calibration.
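To make the setup concrete, below is a minimal Python sketch of one way a model-agnostic nonparametric calibration step can look: estimate the conditional tau-quantile at a test point from the model's residuals among its k nearest calibration points. This is an illustration under assumed design choices (the kNN estimator, the name knn_quantile_calibrate, and the defaults tau=0.9, k=50 are ours), not necessarily the authors' exact procedure.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_quantile_calibrate(X_cal, y_cal, predict, X_test, tau=0.9, k=50):
    """Model-agnostic conditional-quantile estimate via local residuals.

    For each test point, take the empirical tau-quantile of the prediction
    residuals among its k nearest calibration points, then shift the point
    prediction by that amount. `predict` can be any black-box regressor.
    """
    residuals = y_cal - predict(X_cal)                  # uses predictions only
    nn = NearestNeighbors(n_neighbors=k).fit(X_cal)
    _, idx = nn.kneighbors(X_test)                      # (n_test, k) indices
    local_q = np.quantile(residuals[idx], tau, axis=1)  # local residual quantile
    return predict(X_test) + local_q
```

Because the sketch touches the model only through `predict`, it applies to any black-box regressor; the choice of k trades the locality of the quantile estimate against its variance, echoing the curse-of-dimensionality discussion above.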
Related papers
- Probabilistic Calibration by Design for Neural Network Regression [2.3020018305241337]
We introduce a novel end-to-end model training procedure called Quantile Recalibration Training.
We demonstrate the performance of our method in a large-scale experiment involving 57 regression datasets.
arXiv Detail & Related papers (2024-03-18T17:04:33Z)
- Calibration by Distribution Matching: Trainable Kernel Calibration Metrics [56.629245030893685]
We introduce kernel-based calibration metrics that unify and generalize popular forms of calibration for both classification and regression.
These metrics admit differentiable sample estimates, making it easy to incorporate a calibration objective into empirical risk minimization; a minimal distribution-matching sketch appears after this list.
We provide intuitive mechanisms to tailor calibration metrics to a decision task, and to enforce accurate loss estimation and no-regret decisions.
arXiv Detail & Related papers (2023-10-31T06:19:40Z)
- Selective Nonparametric Regression via Testing [54.20569354303575]
We develop an abstention procedure by testing a hypothesis on the value of the conditional variance at a given point.
Unlike existing methods, the proposed procedure accounts not only for the value of the variance itself but also for the uncertainty of the corresponding variance predictor.
arXiv Detail & Related papers (2023-09-28T13:04:11Z)
- Calibration of Neural Networks [77.34726150561087]
This paper presents a survey of confidence calibration problems in the context of neural networks.
We analyze the problem statement, calibration definitions, and different approaches to evaluation.
Empirical experiments cover various datasets and models, comparing calibration methods according to different criteria.
arXiv Detail & Related papers (2023-03-19T20:27:51Z)
- Sharp Calibrated Gaussian Processes [58.94710279601622]
State-of-the-art approaches for designing calibrated models rely on inflating the Gaussian process posterior variance.
We present a calibration approach that generates predictive quantiles using a computation inspired by the vanilla Gaussian process posterior variance.
Our approach is shown to yield a calibrated model under reasonable assumptions.
arXiv Detail & Related papers (2023-02-23T12:17:36Z)
- On Calibrating Semantic Segmentation Models: Analyses and An Algorithm [51.85289816613351]
We study the problem of semantic segmentation calibration.
Model capacity, crop size, multi-scale testing, and prediction correctness all affect calibration.
We propose a simple, unifying, and effective approach, namely selective scaling.
arXiv Detail & Related papers (2022-12-22T22:05:16Z)
- Calibration tests beyond classification [30.616624345970973]
Most supervised machine learning tasks are subject to irreducible prediction errors.
Probabilistic predictive models address this limitation by providing probability distributions that represent a belief over plausible targets.
Calibrated models guarantee that the predictions are neither over- nor under-confident.
arXiv Detail & Related papers (2022-10-21T09:49:57Z)
- Parametric and Multivariate Uncertainty Calibration for Regression and Object Detection [4.630093015127541]
We show that common detection models overestimate the spatial uncertainty in comparison to the observed error.
Our experiments show that the simple Isotonic Regression recalibration method is sufficient to achieve well-calibrated uncertainty; a sketch of isotonic quantile recalibration appears after this list.
In contrast, if normal distributions are required for subsequent processes, our GP-Normal recalibration method yields the best results.
arXiv Detail & Related papers (2022-07-04T08:00:20Z)
- Unsupervised Calibration under Covariate Shift [92.02278658443166]
We introduce the problem of calibration under domain shift and propose an importance-sampling-based approach to address it; a sketch of the weighting step appears after this list.
We evaluate and discuss the efficacy of our method on both real-world datasets and synthetic datasets.
arXiv Detail & Related papers (2020-06-29T21:50:07Z)
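For the "Calibration by Distribution Matching" entry above, a hedged numpy sketch of a kernel distribution-matching statistic: the squared MMD between model-simulated (x, y) pairs and observed pairs. The paper's trainable metrics are richer; this plain-numpy version (the names rbf and mmd2_calibration are ours) only illustrates the matching idea, and one would port it to an autodiff framework to use it as a differentiable training objective.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """RBF kernel matrix between row-stacked samples a (n, d) and b (m, d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2_calibration(z_model, z_data, gamma=1.0):
    """Squared MMD between model-simulated pairs and observed pairs.

    z_model: rows (x_i, y_i_sim) with y_i_sim drawn from the predictive
    distribution at x_i; z_data: rows (x_i, y_i) with observed targets.
    The statistic is near zero when the two joint distributions match.
    """
    kxx = rbf(z_model, z_model, gamma)
    kyy = rbf(z_data, z_data, gamma)
    kxy = rbf(z_model, z_data, gamma)
    return kxx.mean() + kyy.mean() - 2.0 * kxy.mean()
```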
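For the "Parametric and Multivariate Uncertainty Calibration" entry, a minimal sketch of quantile recalibration with scikit-learn's IsotonicRegression, assuming the model exposes its predictive CDF evaluated at each observed target (PIT values); the helper name fit_quantile_recalibrator is ours.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def fit_quantile_recalibrator(pit_values):
    """Fit an isotonic map from nominal CDF levels to observed frequencies.

    pit_values[i] = F_i(y_i), the model's predictive CDF at the observed
    target of calibration example i. The fitted map sends a nominal level
    q to the empirical frequency with which PIT values fall below q.
    """
    p = np.sort(np.asarray(pit_values))
    emp = np.searchsorted(p, p, side="right") / len(p)   # empirical CDF at p
    iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
    iso.fit(p, emp)
    return iso   # iso.predict([q]) gives the recalibrated level for q
```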
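For the "Unsupervised Calibration under Covariate Shift" entry, one common way to obtain importance weights is a probabilistic domain classifier. The sketch below uses the standard density-ratio trick (the function name importance_weights is ours, and this is not necessarily the paper's exact estimator) to reweight labeled source points so that calibration statistics approximate their values under the target covariate distribution.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def importance_weights(X_source, X_target):
    """Estimate w(x) proportional to p_target(x) / p_source(x).

    Train a classifier to separate source (label 0) from target (label 1)
    samples, then convert its predicted odds on the source points into
    normalized importance weights.
    """
    X = np.vstack([X_source, X_target])
    d = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
    clf = LogisticRegression(max_iter=1000).fit(X, d)
    p = np.clip(clf.predict_proba(X_source)[:, 1], 1e-6, 1 - 1e-6)
    w = p / (1.0 - p) * (len(X_source) / len(X_target))
    return w / w.mean()

# Example: weighted coverage of an upper prediction bound on labeled source
# data, approximating coverage under the target covariate distribution:
# w = importance_weights(X_src, X_tgt)
# coverage = np.average(y_src <= upper_src, weights=w)
```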
This list is automatically generated from the titles and abstracts of the papers on this site.