Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification
- URL: http://arxiv.org/abs/2011.09588v4
- Date: Thu, 9 Dec 2021 16:06:00 GMT
- Title: Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification
- Authors: Youngseog Chung, Willie Neiswanger, Ian Char, Jeff Schneider
- Abstract summary: A model that predicts the true conditional quantiles for each input, at all quantile levels, presents a correct and efficient representation of the underlying uncertainty.
Current quantile-based methods focus on optimizing the so-called pinball loss.
We develop new quantile methods that address these shortcomings.
- Score: 15.94100899123465
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Among the many ways of quantifying uncertainty in a regression setting,
specifying the full quantile function is attractive, as quantiles are amenable
to interpretation and evaluation. A model that predicts the true conditional
quantiles for each input, at all quantile levels, presents a correct and
efficient representation of the underlying uncertainty. To achieve this, many
current quantile-based methods focus on optimizing the so-called pinball loss.
However, this loss restricts the scope of applicable regression models, limits
the ability to target many desirable properties (e.g. calibration, sharpness,
centered intervals), and may produce poor conditional quantiles. In this work,
we develop new quantile methods that address these shortcomings. In particular,
we propose methods that can apply to any class of regression model, allow for
selecting a trade-off between calibration and sharpness, optimize for
calibration of centered intervals, and produce more accurate conditional
quantiles. We provide a thorough experimental evaluation of our methods, which
includes a high dimensional uncertainty quantification task in nuclear fusion.
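As background for the abstract above, the pinball loss and the average-calibration criterion it is contrasted with can be written down directly. The sketch below is a minimal illustration only (array shapes and function names are assumed, not the authors' code):

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Pinball (quantile) loss at level tau: mean of (y - q) * (tau - 1{y < q})."""
    diff = y - q
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

def average_calibration_error(y, q_preds, taus):
    """Average calibration: the share of observations at or below the predicted
    tau-quantile should match tau, for every level tau.
    y: (n,) targets; q_preds: (n, len(taus)) predicted quantiles."""
    emp_coverage = np.mean(y[:, None] <= q_preds, axis=0)
    return np.mean(np.abs(emp_coverage - np.asarray(taus)))
```

Sharpness (e.g. the width of centered prediction intervals) is measured separately; the methods proposed in the paper select a trade-off between sharpness and calibration rather than optimizing the pinball loss alone.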
Related papers
- Conformalized High-Density Quantile Regression via Dynamic Prototypes-based Probability Density Estimation [2.526146573337397]
We introduce a conformalized high-density quantile regression approach with a dynamically adaptive set of prototypes.
Our method optimizes the set of prototypes by adaptively adding, deleting, and relocating quantization bins.
Experiments across diverse datasets and dimensionalities confirm that our method consistently achieves high-quality prediction regions.
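For background, the split-conformal adjustment that underlies conformalized quantile regression in general (not the prototype-based variant summarized above) can be sketched as follows; the calibration/test split and variable names are assumed for illustration:

```python
import numpy as np

def conformalize_interval(y_cal, q_lo_cal, q_hi_cal, q_lo_test, q_hi_test, alpha=0.1):
    """Split-conformal adjustment of quantile-regression intervals (CQR-style).
    The conformity score measures how far each calibration point falls outside
    its predicted interval; the test intervals are widened (or shrunk) by the
    finite-sample-corrected (1 - alpha) quantile of those scores."""
    scores = np.maximum(q_lo_cal - y_cal, y_cal - q_hi_cal)
    n = len(y_cal)
    k = int(np.ceil((n + 1) * (1 - alpha)))   # finite-sample correction
    q_hat = np.sort(scores)[min(k, n) - 1]
    return q_lo_test - q_hat, q_hi_test + q_hat
```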
arXiv Detail & Related papers (2024-11-02T14:36:12Z)
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining prediction intervals via empirical estimation of quantiles of the output distribution.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile-regression-based interval construction that removes the constraint of pinning the interval endpoints to fixed, symmetric quantile levels.
We demonstrate that this added flexibility yields intervals with improved desirable qualities.
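For context, the conventional construction that RQR relaxes pins a (1 - alpha) interval to the fixed, symmetric levels alpha/2 and 1 - alpha/2; a minimal sketch of that baseline objective (illustrative only, not the RQR method):

```python
import numpy as np

def pinball(y, q, tau):
    diff = y - q
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

def centered_interval_objective(y, q_lo, q_hi, alpha=0.1):
    """Conventional (1 - alpha) interval: estimate the fixed quantile levels
    alpha/2 and 1 - alpha/2 with the pinball loss. RQR instead lets the two
    endpoints float rather than pinning them to these symmetric levels."""
    return pinball(y, q_lo, alpha / 2.0) + pinball(y, q_hi, 1.0 - alpha / 2.0)
```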
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Bayesian Quantile Regression with Subset Selection: A Posterior Summarization Perspective [0.0]
Quantile regression is a powerful tool in epidemiological studies where interest lies in inferring how different exposures affect specific percentiles of the distribution of a health or life outcome.
Existing methods either estimate conditional quantiles separately for each quantile of interest or estimate the entire conditional distribution using semi- or non-parametric models.
We pose the fundamental problems of linear quantile estimation, uncertainty quantification, and subset selection from a Bayesian decision analysis perspective.
Our approach introduces a quantile-focused squared error loss, which enables efficient, closed-form computing and maintains a close relationship with Wasserstein-based density estimation.
arXiv Detail & Related papers (2023-11-03T17:19:31Z)
- QuantProb: Generalizing Probabilities along with Predictions for a Pre-trained Classifier [1.8488661947561271]
We argue that deep networks are unreliable because, as neural networks are currently trained, their predicted probabilities do not generalize across small distortions.
We propose an innovative approach that decouples the construction of quantile representations from the loss function, allowing us to compute quantile-based probabilities without disturbing the original network.
arXiv Detail & Related papers (2023-04-25T12:39:45Z)
- Sharp Calibrated Gaussian Processes [58.94710279601622]
State-of-the-art approaches for designing calibrated models rely on inflating the Gaussian process posterior variance.
We present a calibration approach that generates predictive quantiles using a computation inspired by the vanilla Gaussian process posterior variance.
Our approach is shown to yield a calibrated model under reasonable assumptions.
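For intuition, when the predictive distribution is Gaussian, each quantile follows directly from the posterior mean and standard deviation, so inflating the variance widens every quantile uniformly; a minimal sketch (assumes SciPy; names are illustrative, not the paper's method):

```python
from scipy.stats import norm

def gaussian_predictive_quantile(mu, sigma, tau):
    """tau-quantile of a Gaussian predictive distribution: mu + sigma * Phi^{-1}(tau).
    Inflating the posterior variance widens these quantiles at every level, which is
    the calibration strategy this paper seeks to sharpen."""
    return mu + sigma * norm.ppf(tau)
```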
arXiv Detail & Related papers (2023-02-23T12:17:36Z)
- Deep Non-Crossing Quantiles through the Partial Derivative [0.6299766708197883]
Quantile Regression provides a way to approximate a single conditional quantile.
When several quantiles are estimated jointly, minimisation of the QR-loss function does not guarantee that they are non-crossing.
We propose a generic deep learning algorithm for predicting an arbitrary number of quantiles.
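A common generic device for guaranteeing non-crossing when many quantiles are predicted jointly is to output a base quantile plus cumulative nonnegative increments. The sketch below illustrates that device (in PyTorch, with assumed names); it is not the paper's partial-derivative construction:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MonotoneQuantileHead(nn.Module):
    """Outputs n_quantiles values that are non-decreasing by construction:
    a free base quantile plus cumulative nonnegative increments."""
    def __init__(self, in_features, n_quantiles):
        super().__init__()
        self.linear = nn.Linear(in_features, n_quantiles)

    def forward(self, x):
        raw = self.linear(x)
        base = raw[..., :1]              # lowest quantile, unconstrained
        gaps = F.softplus(raw[..., 1:])  # nonnegative gaps between adjacent levels
        return torch.cat([base, base + torch.cumsum(gaps, dim=-1)], dim=-1)
```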
arXiv Detail & Related papers (2022-01-30T15:35:21Z)
- Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting [12.269597033369557]
We propose the Incremental (Spline) Quantile Functions I(S)QF, a flexible and efficient distribution-free quantile estimation framework.
We also provide a generalization error analysis of our proposed approaches under the sequence-to-sequence setting.
arXiv Detail & Related papers (2021-11-12T06:54:48Z)
- Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss [61.26793005355441]
Cluster-Promoting Quantization (CPQ) finds the optimal quantization grids for neural networks.
DropBits is a new bit-drop technique that revises the standard dropout regularization to randomly drop bits instead of neurons.
We experimentally validate our method on various benchmark datasets and network architectures.
arXiv Detail & Related papers (2021-09-05T15:15:07Z)
- Understanding the Under-Coverage Bias in Uncertainty Estimation [58.03725169462616]
Quantile regression tends to under-cover relative to the desired coverage level in practice.
We prove that quantile regression suffers from an inherent under-coverage bias.
Our theory reveals that this under-coverage bias stems from a certain high-dimensional parameter estimation error.
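Under-coverage means the empirical coverage of a learned quantile falls below its nominal level; a minimal sketch of how that gap is measured on held-out data (illustrative names, not the paper's experimental setup):

```python
import numpy as np

def coverage_gap(y_test, q_tau_pred, tau=0.9):
    """Empirical coverage of a learned tau-quantile minus the nominal level tau.
    A negative value indicates under-coverage."""
    return np.mean(y_test <= q_tau_pred) - tau
```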
arXiv Detail & Related papers (2021-06-10T06:11:55Z)
- Flexible Model Aggregation for Quantile Regression [92.63075261170302]
Quantile regression is a fundamental problem in statistical learning motivated by a need to quantify uncertainty in predictions.
We investigate methods for aggregating any number of conditional quantile models.
All of the models we consider in this paper can be fit using modern deep learning toolkits.
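One simple instance of such aggregation is a weighted average of the models' quantile predictions at each level; the sketch below shows only that scheme (assumed shapes and names), while the paper studies a broader family of aggregations:

```python
import numpy as np

def aggregate_quantiles(quantile_preds, weights=None):
    """Weighted average of conditional quantile predictions from several models.
    quantile_preds: array of shape (n_models, n_samples, n_levels).
    Averaging quantile functions is one aggregation scheme; aggregating CDFs is another."""
    preds = np.asarray(quantile_preds, dtype=float)
    if weights is None:
        weights = np.full(preds.shape[0], 1.0 / preds.shape[0])
    return np.tensordot(weights, preds, axes=1)
```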
arXiv Detail & Related papers (2021-02-26T23:21:16Z)
- Instability, Computational Efficiency and Statistical Accuracy [101.32305022521024]
We develop a framework that characterizes statistical accuracy through the interplay between the deterministic convergence rate of the algorithm at the population level and its degree of (in)stability when applied to an empirical object based on $n$ samples.
We provide applications of our general results to several concrete classes of models, including Gaussian mixture estimation, non-linear regression models, and informative non-response models.
arXiv Detail & Related papers (2020-05-22T22:30:52Z)