Deep Non-Crossing Quantiles through the Partial Derivative
- URL: http://arxiv.org/abs/2201.12848v1
- Date: Sun, 30 Jan 2022 15:35:21 GMT
- Title: Deep Non-Crossing Quantiles through the Partial Derivative
- Authors: Axel Brando, Joan Gimeno, Jose A. Rodríguez-Serrano, Jordi Vitrià
- Abstract summary: Quantile Regression provides a way to approximate a single conditional quantile.
Minimisation of the QR-loss function does not guarantee non-crossing quantiles.
We propose a generic deep learning algorithm for predicting an arbitrary number of quantiles.
- Score: 0.6299766708197883
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Quantile Regression (QR) provides a way to approximate a single conditional
quantile. To have a more informative description of the conditional
distribution, QR can be merged with deep learning techniques to simultaneously
estimate multiple quantiles. However, the minimisation of the QR-loss function
does not guarantee non-crossing quantiles, which affects the validity of such
predictions and introduces a critical issue in certain scenarios. In this
article, we propose a generic deep learning algorithm for predicting an
arbitrary number of quantiles that ensures the quantile monotonicity constraint
up to the machine precision and maintains its modelling performance with
respect to alternative models. The presented method is evaluated on several
real-world datasets, obtaining state-of-the-art results and showing that it
scales to large datasets.
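For context, the QR-loss the abstract refers to is the standard pinball loss. A minimal numpy sketch (names and data here are illustrative, not from the paper) shows that minimising it over a constant prediction recovers the empirical quantile at each level, while nothing couples the minimisers across levels, which is exactly why independently fitted quantiles can cross:

```python
import numpy as np

def pinball_loss(y, q_hat, tau):
    """QR (pinball) loss at level tau: under-predictions are weighted
    by tau, over-predictions by 1 - tau."""
    diff = y - q_hat
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# Minimising the loss over a constant prediction recovers the empirical
# tau-quantile; each level is optimised in isolation.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
grid = np.linspace(-3.0, 3.0, 601)
q90 = grid[np.argmin([pinball_loss(y, q, 0.9) for q in grid])]
# q90 lands near the theoretical 0.9-quantile of N(0, 1), about 1.28
```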
Related papers
- Semiparametric conformal prediction [79.6147286161434]
Risk-sensitive applications require well-calibrated prediction sets over multiple, potentially correlated target variables.
We treat the scores as random vectors and aim to construct the prediction set accounting for their joint correlation structure.
We report desired coverage and competitive efficiency on a range of real-world regression problems.
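The semiparametric construction itself is not reproduced here, but the split-conformal baseline such methods build on can be sketched in a few lines (a generic recipe with illustrative names, assuming scalar absolute residuals rather than the paper's vector scores):

```python
import numpy as np

def split_conformal_halfwidth(cal_residuals, alpha=0.1):
    """Split conformal: the ceil((n+1)(1-alpha))-th smallest absolute
    calibration residual is an interval half-width with marginal
    coverage at least 1 - alpha on exchangeable data."""
    r = np.sort(np.abs(cal_residuals))
    n = len(r)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    return r[min(k, n) - 1]

rng = np.random.default_rng(1)
halfwidth = split_conformal_halfwidth(rng.normal(size=999), alpha=0.1)
# approximates the 0.9-quantile of |N(0, 1)|, roughly 1.64
```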
arXiv Detail & Related papers (2024-11-04T14:29:02Z)
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile regression based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Bayesian Quantile Regression with Subset Selection: A Decision Analysis Perspective [0.0]
Quantile regression is a powerful tool for inferring how covariates affect specific percentiles of the response distribution.
Existing methods estimate conditional quantiles separately for each quantile of interest or estimate the entire conditional distribution using semi- or non-parametric models.
We pose the fundamental problems of linear quantile estimation, uncertainty quantification, and subset selection from a Bayesian decision analysis perspective.
arXiv Detail & Related papers (2023-11-03T17:19:31Z)
- Neural Spline Search for Quantile Probabilistic Modeling [35.914279831992964]
We propose a non-parametric and data-driven approach, Neural Spline Search (NSS), to represent the observed data distribution without parametric assumptions.
We demonstrate that NSS outperforms previous methods on synthetic, real-world regression and time-series forecasting tasks.
arXiv Detail & Related papers (2023-01-12T07:45:28Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
We introduce the qDrift protocol, which builds random product formulas by sampling from the Hamiltonian according to the coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
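The core of the sampling stage can be sketched as follows: a qDrift-style protocol draws Hamiltonian term indices with probability proportional to the coefficient magnitudes (the subsequent fixed-weight application of each sampled term is noted only in the comment; this is an illustrative sketch, not the paper's cost-aware variant):

```python
import numpy as np

def qdrift_sample(coeffs, n_samples, rng):
    """qDrift-style term selection: draw term indices with probability
    proportional to |h_j|; the protocol then applies each sampled term
    with a fixed evolution weight lam / n_samples, lam = sum |h_j|."""
    p = np.abs(coeffs) / np.abs(coeffs).sum()
    return rng.choice(len(coeffs), size=n_samples, p=p)

rng = np.random.default_rng(2)
idx = qdrift_sample(np.array([0.7, 0.2, 0.1]), 100_000, rng)
freq = np.bincount(idx, minlength=3) / len(idx)
# empirical frequencies track the normalised coefficient magnitudes
```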
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
- Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization [61.26619639722804]
We propose a conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms.
The proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques.
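The conditional-gradient (Frank-Wolfe) primitive that such methods accelerate can be sketched on the probability simplex (an illustrative deterministic step with the classic 2/(t+2) schedule, not the paper's one-sample stochastic variant):

```python
import numpy as np

def frank_wolfe_step(x, grad, t):
    """One conditional-gradient step on the probability simplex: the
    linear minimisation oracle returns the vertex with the smallest
    gradient coordinate; step size follows the classic 2/(t+2) rule."""
    s = np.zeros_like(x)
    s[np.argmin(grad)] = 1.0
    gamma = 2.0 / (t + 2.0)
    return (1.0 - gamma) * x + gamma * s

# Minimise f(x) = ||x - c||^2 over the simplex for an interior target c.
c = np.array([0.2, 0.3, 0.5])
x = np.ones(3) / 3
for t in range(2_000):
    x = frank_wolfe_step(x, 2.0 * (x - c), t)
```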
arXiv Detail & Related papers (2022-02-26T19:10:48Z)
- Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting [12.269597033369557]
We propose the Incremental (Spline) Quantile Functions I(S)QF, a flexible and efficient distribution-free quantile estimation framework.
We also provide a generalization error analysis of our proposed approaches under the sequence-to-sequence setting.
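The "incremental" idea suggests the standard non-crossing construction: predict the lowest quantile directly and add non-negative increments, so monotonicity holds by construction. A generic sketch (not necessarily the exact I(S)QF parameterisation; all names are illustrative):

```python
import numpy as np

def non_crossing_quantiles(base, raw_increments):
    """Monotone quantile vector: start from a base quantile and add
    softplus-transformed increments, which are always >= 0, so the
    output is non-decreasing regardless of the raw parameters."""
    inc = np.logaddexp(0.0, raw_increments)  # softplus(x) = log(1 + e^x)
    return np.concatenate(([base], base + np.cumsum(inc)))

# Even with negative raw parameters, the quantiles never cross.
q = non_crossing_quantiles(-1.3, np.array([0.4, -2.0, 1.1, 0.0]))
```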
arXiv Detail & Related papers (2021-11-12T06:54:48Z)
- Understanding the Under-Coverage Bias in Uncertainty Estimation [58.03725169462616]
Quantile regression tends to under-cover relative to the desired coverage level in practice.
We prove that quantile regression suffers from an inherent under-coverage bias.
Our theory reveals that this under-coverage bias stems from a certain high-dimensional parameter estimation error.
arXiv Detail & Related papers (2021-06-10T06:11:55Z)
- Flexible Model Aggregation for Quantile Regression [92.63075261170302]
Quantile regression is a fundamental problem in statistical learning motivated by a need to quantify uncertainty in predictions.
We investigate methods for aggregating any number of conditional quantile models.
All of the models we consider in this paper can be fit using modern deep learning toolkits.
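As a minimal illustration of aggregating conditional quantile models, a generic linear pool of per-model predictions (only one of many schemes such work considers; names are illustrative):

```python
import numpy as np

def aggregate_quantiles(preds, weights=None):
    """Linear pooling: a convex combination of per-model tau-quantile
    predictions. preds has shape (n_models, n_points); uniform weights
    are used when none are given."""
    preds = np.asarray(preds, dtype=float)
    if weights is None:
        weights = np.full(preds.shape[0], 1.0 / preds.shape[0])
    return weights @ preds

agg = aggregate_quantiles([[1.0, 2.0], [3.0, 4.0]],
                          weights=np.array([0.25, 0.75]))
# 0.25 * [1, 2] + 0.75 * [3, 4] = [2.5, 3.5]
```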
arXiv Detail & Related papers (2021-02-26T23:21:16Z)
- Regularization Strategies for Quantile Regression [8.232258589877942]
We show that minimizing an expected pinball loss over a continuous distribution of quantiles is a good regularizer even when only predicting a specific quantile.
We show that lattice models enable regularizing the predicted distribution to a location-scale family.
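The expected pinball loss over a continuous distribution of quantile levels can be estimated by Monte Carlo: sample tau uniformly and average the per-level losses. A sketch under illustrative names (the quantile predictor is a function of tau, standing in for a model that takes tau as input):

```python
import numpy as np

def expected_pinball_loss(y, quantile_fn, rng, n_tau=256):
    """Monte-Carlo estimate of the pinball loss averaged over
    tau ~ U(0, 1); training on this average regularises the whole
    predicted distribution rather than one fixed level."""
    losses = []
    for tau in rng.uniform(size=n_tau):
        diff = y - quantile_fn(tau)
        losses.append(np.mean(np.maximum(tau * diff, (tau - 1.0) * diff)))
    return float(np.mean(losses))

rng = np.random.default_rng(3)
y = rng.normal(size=5_000)
# The empirical quantile function of y scores better than a constant
# predictor that ignores tau (same sampled taus for a fair comparison).
loss_qf = expected_pinball_loss(y, lambda t: np.quantile(y, t),
                                np.random.default_rng(4))
loss_c0 = expected_pinball_loss(y, lambda t: 0.0,
                                np.random.default_rng(4))
```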
arXiv Detail & Related papers (2021-02-09T21:10:35Z)
- Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification [15.94100899123465]
A model that predicts the true conditional quantiles for each input, at all quantile levels, presents a correct and efficient representation of the underlying uncertainty.
Current quantile-based methods focus on optimizing the so-called pinball loss.
We develop new quantile methods that address these shortcomings.
arXiv Detail & Related papers (2020-11-18T23:51:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.