Regularization Strategies for Quantile Regression
- URL: http://arxiv.org/abs/2102.05135v1
- Date: Tue, 9 Feb 2021 21:10:35 GMT
- Title: Regularization Strategies for Quantile Regression
- Authors: Taman Narayan, Serena Wang, Kevin Canini, Maya Gupta
- Abstract summary: We show that minimizing an expected pinball loss over a continuous distribution of quantiles is a good regularizer even when only predicting a specific quantile.
We show that lattice models enable regularizing the predicted distribution to a location-scale family.
- Score: 8.232258589877942
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate different methods for regularizing quantile regression when
predicting either a subset of quantiles or the full inverse CDF. We show that
minimizing an expected pinball loss over a continuous distribution of quantiles
is a good regularizer even when only predicting a specific quantile. For
predicting multiple quantiles, we propose achieving the classic goal of
non-crossing quantiles by using deep lattice networks that treat the quantile
as a monotonic input feature, and we discuss why monotonicity on other features
is an apt regularizer for quantile regression. We show that lattice models
enable regularizing the predicted distribution to a location-scale family.
Lastly, we propose applying rate constraints to improve the calibration of the
quantile predictions on specific subsets of interest and improve fairness
metrics. We demonstrate our contributions on simulations, benchmark datasets,
and real quantile regression problems.
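The expected-pinball-loss regularizer described in the abstract can be sketched in a few lines. The function names and the Monte Carlo approximation of the expectation below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Pinball (quantile) loss for quantile level tau in (0, 1).

    Penalizes under-prediction with weight tau and over-prediction
    with weight (1 - tau), so it is minimized in expectation by the
    tau-quantile of the target distribution.
    """
    diff = y_true - y_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

def expected_pinball_loss(y_true, predict_fn, rng, n_samples=64):
    """Monte Carlo estimate of the pinball loss averaged over a
    continuous distribution of quantile levels (here Uniform(0, 1)).

    predict_fn(tau) is a hypothetical model that returns predictions
    for quantile level tau; averaging the loss over sampled taus is
    the regularization idea from the abstract, even if only one
    specific quantile is ultimately of interest.
    """
    taus = rng.uniform(0.0, 1.0, size=n_samples)
    return float(np.mean([pinball_loss(y_true, predict_fn(tau), tau)
                          for tau in taus]))
```

As a usage sketch, `predict_fn` could be the empirical quantile of a training sample, e.g. `predict_fn = lambda tau: np.quantile(train_y, tau)`; the averaged loss then scores the whole predicted inverse CDF rather than a single quantile.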
Related papers
- Semiparametric conformal prediction [79.6147286161434]
Risk-sensitive applications require well-calibrated prediction sets over multiple, potentially correlated target variables.
We treat the scores as random vectors and aim to construct the prediction set accounting for their joint correlation structure.
We report desired coverage and competitive efficiency on a range of real-world regression problems.
arXiv Detail & Related papers (2024-11-04T14:29:02Z)
- Quantile Regression using Random Forest Proximities [0.9423257767158634]
Quantile regression forests estimate the entire conditional distribution of the target variable with a single model.
We show that quantile regression using Random Forest proximities outperforms the original version of QRF in approximating conditional target distributions and prediction intervals.
arXiv Detail & Related papers (2024-08-05T10:02:33Z)
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile regression based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility yields intervals with improved desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Deep Non-Crossing Quantiles through the Partial Derivative [0.6299766708197883]
Quantile Regression provides a way to approximate a single conditional quantile.
Minimisation of the QR-loss function does not guarantee non-crossing quantiles.
We propose a generic deep learning algorithm for predicting an arbitrary number of quantiles.
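The crossing problem mentioned above and in the main abstract can be illustrated concretely. The snippet below is a hypothetical sketch, not either paper's method: it shows independently predicted quantiles that cross, and a simple post-hoc sort that restores monotonicity (treating the quantile level as a monotonic input, as the main paper proposes, rules crossing out by construction instead):

```python
import numpy as np

def enforce_non_crossing(preds_by_tau):
    """preds_by_tau: array of shape (n_quantiles, n_examples), rows
    ordered by increasing quantile level. Sorting each column
    guarantees monotonicity across quantile levels, a simple
    illustrative fix for crossing predictions."""
    return np.sort(preds_by_tau, axis=0)

# Example: for the first column, the 0.5-quantile prediction (0.5)
# sits below the 0.1-quantile prediction (1.0) -- a crossing.
crossed = np.array([[1.0, 2.0],   # predictions at tau = 0.1
                    [0.5, 3.0]])  # predictions at tau = 0.5
fixed = enforce_non_crossing(crossed)
# fixed[0] <= fixed[1] holds element-wise after sorting.
```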
arXiv Detail & Related papers (2022-01-30T15:35:21Z)
- Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting [12.269597033369557]
We propose the Incremental (Spline) Quantile Functions I(S)QF, a flexible and efficient distribution-free quantile estimation framework.
We also provide a generalization error analysis of our proposed approaches under the sequence-to-sequence setting.
arXiv Detail & Related papers (2021-11-12T06:54:48Z)
- Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss [61.26793005355441]
Cluster-Promoting Quantization (CPQ) finds the optimal quantization grids for neural networks.
DropBits is a new bit-drop technique that revises the standard dropout regularization to randomly drop bits instead of neurons.
We experimentally validate our method on various benchmark datasets and network architectures.
arXiv Detail & Related papers (2021-09-05T15:15:07Z)
- Understanding the Under-Coverage Bias in Uncertainty Estimation [58.03725169462616]
Quantile regression tends to under-cover relative to the desired coverage level in practice.
We prove that quantile regression suffers from an inherent under-coverage bias.
Our theory reveals that this under-coverage bias stems from a certain high-dimensional parameter estimation error.
arXiv Detail & Related papers (2021-06-10T06:11:55Z)
- Flexible Model Aggregation for Quantile Regression [92.63075261170302]
Quantile regression is a fundamental problem in statistical learning motivated by a need to quantify uncertainty in predictions.
We investigate methods for aggregating any number of conditional quantile models.
All of the models we consider in this paper can be fit using modern deep learning toolkits.
arXiv Detail & Related papers (2021-02-26T23:21:16Z)
- Censored Quantile Regression Forest [81.9098291337097]
We develop a new estimating equation that adapts to censoring and reduces to the quantile score whenever the data do not exhibit censoring.
The proposed procedure, named censored quantile regression forest, allows us to estimate quantiles of time-to-event data without any parametric modeling assumption.
arXiv Detail & Related papers (2020-01-08T23:20:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.