Flexible Model Aggregation for Quantile Regression
- URL: http://arxiv.org/abs/2103.00083v5
- Date: Sat, 15 Apr 2023 08:40:57 GMT
- Title: Flexible Model Aggregation for Quantile Regression
- Authors: Rasool Fakoor, Taesup Kim, Jonas Mueller, Alexander J. Smola, Ryan J. Tibshirani
- Abstract summary: Quantile regression is a fundamental problem in statistical learning motivated by a need to quantify uncertainty in predictions.
We investigate methods for aggregating any number of conditional quantile models.
All of the models we consider in this paper can be fit using modern deep learning toolkits.
- Score: 92.63075261170302
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantile regression is a fundamental problem in statistical learning
motivated by a need to quantify uncertainty in predictions, or to model a
diverse population without being overly reductive. For instance,
epidemiological forecasts, cost estimates, and revenue predictions all benefit
from being able to quantify the range of possible values accurately. As such,
many models have been developed for this problem over many years of research in
statistics, machine learning, and related fields. Rather than proposing yet
another (new) algorithm for quantile regression, we adopt a meta viewpoint: we
investigate methods for aggregating any number of conditional quantile models,
in order to improve accuracy and robustness. We consider weighted ensembles
where weights may vary over not only individual models, but also over quantile
levels, and feature values. All of the models we consider in this paper can be
fit using modern deep learning toolkits, and hence are widely accessible (from
an implementation point of view) and scalable. To improve the accuracy of the
predicted quantiles (or equivalently, prediction intervals), we develop tools
for ensuring that quantiles remain monotonically ordered, and apply conformal
calibration methods. These can be used without any modification of the original
library of base models. We also review some basic theory surrounding quantile
aggregation and related scoring rules, and contribute a few new results to this
literature (for example, the fact that post-sorting or post-isotonic regression
can only improve the weighted interval score). Finally, we provide an extensive
suite of empirical comparisons across 34 data sets from two different benchmark
repositories.
Related papers
- Quantile deep learning models for multi-step ahead time series prediction [0.15833270109954137]
We present a novel quantile regression deep learning framework for multi-step time series prediction.
We provide an implementation of deep learning models for multi-step ahead time series prediction.
We evaluate their performance under high volatility and extreme conditions.
arXiv Detail & Related papers (2024-11-24T00:00:10Z)
- Semiparametric conformal prediction [79.6147286161434]
Risk-sensitive applications require well-calibrated prediction sets over multiple, potentially correlated target variables.
We treat the scores as random vectors and aim to construct the prediction set accounting for their joint correlation structure.
We report desired coverage and competitive efficiency on a range of real-world regression problems.
arXiv Detail & Related papers (2024-11-04T14:29:02Z)
- Uncertainty estimation in satellite precipitation spatial prediction by combining distributional regression algorithms [3.8623569699070353]
We introduce the concept of distributional regression for the engineering task of creating precipitation datasets through data merging.
We propose new ensemble learning methods that can be valuable not only for spatial prediction but also for prediction problems in general.
arXiv Detail & Related papers (2024-06-29T05:58:00Z)
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining prediction intervals via empirical estimation of quantiles of the output distribution.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile-regression-based interval construction that removes the arbitrary constraint tying interval endpoints to fixed quantile levels.
We demonstrate that this added flexibility yields intervals with improved desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Bayesian Quantile Regression with Subset Selection: A Decision Analysis Perspective [0.0]
Quantile regression is a powerful tool for inferring how covariates affect specific percentiles of the response distribution.
Existing methods estimate conditional quantiles separately for each quantile of interest or estimate the entire conditional distribution using semi- or non-parametric models.
We pose the fundamental problems of linear quantile estimation, uncertainty quantification, and subset selection from a Bayesian decision analysis perspective.
arXiv Detail & Related papers (2023-11-03T17:19:31Z)
- Engression: Extrapolation through the Lens of Distributional Regression [2.519266955671697]
We propose a neural network-based distributional regression methodology called "engression".
An engression model is generative in the sense that we can sample from the fitted conditional distribution and is also suitable for high-dimensional outcomes.
We show that engression can successfully perform extrapolation under some assumptions such as monotonicity, whereas traditional regression approaches such as least-squares or quantile regression fall short under the same assumptions.
arXiv Detail & Related papers (2023-07-03T08:19:00Z)
- X-model: Improving Data Efficiency in Deep Learning with A Minimax Model [78.55482897452417]
We aim to improve data efficiency for both classification and regression setups in deep learning.
To harness the power of both worlds, we propose a novel X-model.
X-model plays a minimax game between the feature extractor and task-specific heads.
arXiv Detail & Related papers (2021-10-09T13:56:48Z)
- Understanding the Under-Coverage Bias in Uncertainty Estimation [58.03725169462616]
Quantile regression tends to under-cover, achieving less than the desired coverage level in practice.
We prove that quantile regression suffers from an inherent under-coverage bias.
Our theory reveals that this under-coverage bias stems from a certain high-dimensional parameter estimation error.
arXiv Detail & Related papers (2021-06-10T06:11:55Z)
- Regression Bugs Are In Your Model! Measuring, Reducing and Analyzing Regressions In NLP Model Updates [68.09049111171862]
This work focuses on quantifying, reducing, and analyzing regression errors in NLP model updates.
We formulate the regression-free model updates into a constrained optimization problem.
We empirically analyze how model ensemble reduces regression.
arXiv Detail & Related papers (2021-05-07T03:33:00Z)
- Regularization Strategies for Quantile Regression [8.232258589877942]
We show that minimizing an expected pinball loss over a continuous distribution of quantiles is a good regularizer even when only predicting a specific quantile.
We show that lattice models enable regularizing the predicted distribution to a location-scale family.
arXiv Detail & Related papers (2021-02-09T21:10:35Z)