Variable-Based Calibration for Machine Learning Classifiers
- URL: http://arxiv.org/abs/2209.15154v3
- Date: Wed, 5 Apr 2023 21:15:41 GMT
- Title: Variable-Based Calibration for Machine Learning Classifiers
- Authors: Markelle Kelly and Padhraic Smyth
- Abstract summary: We introduce the notion of variable-based calibration to characterize the calibration properties of a model with respect to a variable of interest.
We find that models with near-perfect expected calibration error can exhibit significant miscalibration as a function of features of the data.
- Score: 11.9995808096481
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The deployment of machine learning classifiers in high-stakes domains
requires well-calibrated confidence scores for model predictions. In this paper
we introduce the notion of variable-based calibration to characterize
calibration properties of a model with respect to a variable of interest,
generalizing traditional score-based metrics such as expected calibration error
(ECE). In particular, we find that models with near-perfect ECE can exhibit
significant miscalibration as a function of features of the data. We
demonstrate this phenomenon both theoretically and in practice on multiple
well-known datasets, and show that it can persist after the application of
existing calibration methods. To mitigate this issue, we propose strategies for
detection, visualization, and quantification of variable-based calibration
error. We then examine the limitations of current score-based calibration
methods and explore potential modifications. Finally, we discuss the
implications of these findings, emphasizing that an understanding of
calibration beyond simple aggregate measures is crucial for endeavors such as
fairness and model interpretability.
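The abstract's central observation is that aggregate ECE can look near-perfect while calibration varies sharply across values of a feature. A minimal sketch of this idea (assumed helper names, not the authors' code): standard ECE bins predictions by confidence, while a variable-based analogue bins by a feature of interest and compares accuracy to confidence within each feature bin.

```python
# Hypothetical sketch: standard ECE vs. a variable-based calibration error.
# Function names and binning scheme are illustrative assumptions, not the
# paper's implementation.

def ece(confidences, correct, n_bins=10):
    """Standard expected calibration error: bin predictions by confidence,
    then take the bin-size-weighted average of |accuracy - avg confidence|."""
    bins = [[] for _ in range(n_bins)]
    for c, y in zip(confidences, correct):
        idx = min(int(c * n_bins), n_bins - 1)
        bins[idx].append((c, y))
    total = len(confidences)
    err = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        acc = sum(y for _, y in b) / len(b)
        err += (len(b) / total) * abs(acc - avg_conf)
    return err

def variable_based_ce(feature, confidences, correct, n_bins=10):
    """Variable-based analogue: bin by a feature of interest (e.g. age),
    then compare accuracy and average confidence within each feature bin."""
    lo, hi = min(feature), max(feature)
    width = (hi - lo) / n_bins or 1.0  # guard against a constant feature
    bins = [[] for _ in range(n_bins)]
    for x, c, y in zip(feature, confidences, correct):
        idx = min(int((x - lo) / width), n_bins - 1)
        bins[idx].append((c, y))
    total = len(feature)
    err = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        acc = sum(y for _, y in b) / len(b)
        err += (len(b) / total) * abs(acc - avg_conf)
    return err
```

To see the phenomenon the abstract describes: give two feature groups the same confidence (so they share one confidence bin), with one group overconfident and the other underconfident. The errors cancel in aggregate ECE, yet each feature bin is miscalibrated, so the variable-based error is large.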