Learning Quantile Functions without Quantile Crossing for
Distribution-free Time Series Forecasting
- URL: http://arxiv.org/abs/2111.06581v1
- Date: Fri, 12 Nov 2021 06:54:48 GMT
- Title: Learning Quantile Functions without Quantile Crossing for
Distribution-free Time Series Forecasting
- Authors: Youngsuk Park, Danielle Maddix, François-Xavier Aubet, Kelvin Kan,
Jan Gasthaus, Yuyang Wang
- Abstract summary: We propose the Incremental (Spline) Quantile Functions I(S)QF, a flexible and efficient distribution-free quantile estimation framework.
We also provide a generalization error analysis of our proposed approaches under the sequence-to-sequence setting.
- Score: 12.269597033369557
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Quantile regression is an effective technique for quantifying
uncertainty, fitting challenging underlying distributions, and often providing
full probabilistic predictions through joint learning over multiple quantile
levels. A common drawback of such joint quantile regression, however, is
quantile crossing, which violates the monotonicity of the conditional quantile
function. In this work, we propose the Incremental (Spline) Quantile Functions
I(S)QF, a flexible and efficient distribution-free quantile estimation
framework that resolves quantile crossing with a simple neural network layer.
Moreover, I(S)QF interpolates and extrapolates to predict arbitrary quantile
levels that differ from those used in training. Equipped with an analytical
evaluation of the continuous ranked probability score (CRPS) for I(S)QF
representations, we apply our methods to NN-based time series forecasting,
where the savings from avoiding expensive re-training for non-trained quantile
levels are particularly significant. We also provide a generalization error
analysis of our proposed approaches under the sequence-to-sequence setting.
Lastly, extensive experiments demonstrate improvements in consistency and
accuracy over other baselines.
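The "simple neural network layer" admits a compact illustration. Below is a minimal sketch, not the authors' implementation: non-crossing is enforced by predicting a base quantile plus non-negative increments and cumulatively summing them. All names (NonCrossingQuantileHead, hidden_dim, the decile grid) are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): a head that outputs quantiles at
# fixed levels without crossing, via non-negative increments + cumulative sum.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NonCrossingQuantileHead(nn.Module):
    def __init__(self, hidden_dim: int, num_levels: int):
        super().__init__()
        self.base = nn.Linear(hidden_dim, 1)                 # quantile at the lowest level
        self.increments = nn.Linear(hidden_dim, num_levels - 1)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # softplus makes every increment >= 0, so the cumulative sum is
        # non-decreasing across levels -- no quantile crossing by construction.
        deltas = F.softplus(self.increments(h))
        q0 = self.base(h)
        return torch.cat([q0, q0 + torch.cumsum(deltas, dim=-1)], dim=-1)

head = NonCrossingQuantileHead(hidden_dim=64, num_levels=9)  # e.g. deciles 0.1..0.9
quantiles = head(torch.randn(32, 64))                        # (batch, 9), monotone per row
```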
Related papers
- Semiparametric conformal prediction [79.6147286161434]
Risk-sensitive applications require well-calibrated prediction sets over multiple, potentially correlated target variables.
We treat the scores as random vectors and aim to construct the prediction set accounting for their joint correlation structure.
We report desired coverage and competitive efficiency on a range of real-world regression problems.
arXiv Detail & Related papers (2024-11-04T14:29:02Z)
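For context on the conformal machinery the entry above builds on, here is a hedged sketch of plain split conformal prediction for a single scalar target; the paper generalizes this to vector-valued, correlated scores. The function name and shapes are illustrative.

```python
# Hedged sketch of split conformal prediction for one target (not the
# paper's multivariate method). Conformity score: absolute residual.
import numpy as np

def split_conformal_interval(residuals_cal: np.ndarray, y_hat: float, alpha: float = 0.1):
    """Build a (1 - alpha) prediction interval from calibration residuals."""
    n = len(residuals_cal)
    scores = np.abs(residuals_cal)              # conformity score |y - y_hat|
    k = int(np.ceil((n + 1) * (1 - alpha)))     # finite-sample corrected rank
    q = np.sort(scores)[min(k, n) - 1]          # empirical quantile of the scores
    return y_hat - q, y_hat + q

lo, hi = split_conformal_interval(np.random.randn(500), y_hat=2.0, alpha=0.1)
```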
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile-regression-based interval construction that removes the constraint of tying the interval endpoints to fixed quantile levels.
We demonstrate that this added flexibility yields intervals with improved desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Bayesian Quantile Regression with Subset Selection: A Posterior Summarization Perspective [0.0]
Quantile regression is a powerful tool in epidemiological studies where interest lies in inferring how different exposures affect specific percentiles of the distribution of a health or life outcome.
Existing methods either estimate conditional quantiles separately for each quantile of interest or estimate the entire conditional distribution using semi- or non-parametric models.
We pose the fundamental problems of linear quantile estimation, uncertainty quantification, and subset selection from a Bayesian decision analysis perspective.
Our approach introduces a quantile-focused squared error loss, which enables efficient, closed-form computation and maintains a close relationship with Wasserstein-based density estimation.
arXiv Detail & Related papers (2023-11-03T17:19:31Z)
- Distribution-Flexible Subset Quantization for Post-Quantizing Super-Resolution Networks [68.83451203841624]
This paper introduces Distribution-Flexible Subset Quantization (DFSQ), a post-training quantization method for super-resolution networks.
DFSQ conducts channel-wise normalization of the activations and then applies distribution-flexible subset quantization (SQ) to the normalized values.
It achieves comparable performance to full-precision counterparts on 6- and 8-bit quantization, and incurs only a 0.1 dB PSNR drop on 4-bit quantization.
arXiv Detail & Related papers (2023-05-10T04:19:11Z)
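The channel-wise normalization step can be sketched as plain per-channel standardization, as below; this is an assumption about the entry above, not the paper's code, and the subset-quantization step itself is omitted.

```python
# Illustrative per-channel activation normalization (the DFSQ pre-step as
# described above); shapes and epsilon are assumptions.
import torch

def channelwise_normalize(x: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    # x: (batch, channels, height, width); make activation statistics
    # comparable across channels before quantizing.
    mean = x.mean(dim=(0, 2, 3), keepdim=True)
    std = x.std(dim=(0, 2, 3), keepdim=True)
    return (x - mean) / (std + eps)

x_norm = channelwise_normalize(torch.randn(8, 64, 32, 32))
```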
- Neural Spline Search for Quantile Probabilistic Modeling [35.914279831992964]
We propose a non-parametric and data-driven approach, Neural Spline Search (NSS), to represent the observed data distribution without parametric assumptions.
We demonstrate that NSS outperforms previous methods on synthetic, real-world regression and time-series forecasting tasks.
arXiv Detail & Related papers (2023-01-12T07:45:28Z)
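To make spline-based quantile representations concrete, here is a minimal sketch of evaluating a piecewise-linear quantile function at arbitrary levels, in the spirit of NSS and of I(S)QF's interpolation. The knot values are made up, and np.interp clamps rather than extrapolates outside the knot range, which a real implementation would handle with explicit tail terms.

```python
# Sketch: a piecewise-linear quantile function over trained knot levels,
# queried at arbitrary (untrained) levels. Knots here are illustrative.
import numpy as np

knot_levels = np.array([0.1, 0.25, 0.5, 0.75, 0.9])   # trained quantile levels
knot_values = np.array([-1.3, -0.6, 0.0, 0.7, 1.4])   # monotone predicted quantiles

def quantile_at(tau: np.ndarray) -> np.ndarray:
    # Linear interpolation between knots; constant beyond the knot range.
    return np.interp(tau, knot_levels, knot_values)

print(quantile_at(np.array([0.05, 0.33, 0.8])))
```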
- Deep Non-Crossing Quantiles through the Partial Derivative [0.6299766708197883]
Quantile Regression provides a way to approximate a single conditional quantile.
Minimisation of the QR-loss function does not guarantee non-crossing quantiles.
We propose a generic deep learning algorithm for predicting an arbitrary number of quantiles.
arXiv Detail & Related papers (2022-01-30T15:35:21Z)
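One hedged reading of "through the partial derivative": feed the quantile level tau to the network as an input and penalize negative dq/dtau during training. The sketch below illustrates that idea with an illustrative architecture; it is not the paper's algorithm.

```python
# Sketch of a non-crossing penalty on the partial derivative dq/dtau,
# computed with autograd. Architecture and shapes are assumptions.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))  # input: (x, tau)

x = torch.randn(128, 1)
tau = torch.rand(128, 1, requires_grad=True)
q = net(torch.cat([x, tau], dim=-1))

# dq/dtau per sample; create_graph=True so the penalty itself is trainable.
dq_dtau, = torch.autograd.grad(q.sum(), tau, create_graph=True)
crossing_penalty = torch.relu(-dq_dtau).mean()  # positive only where quantiles would cross
```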
- Bias-Variance Tradeoffs in Single-Sample Binary Gradient Estimators [100.58924375509659]
The straight-through (ST) estimator gained popularity due to its simplicity and efficiency.
Several techniques have been proposed to improve over ST while keeping the same low computational complexity.
We conduct a theoretical analysis of the bias and variance of these methods in order to understand the tradeoffs and verify the originally claimed properties.
arXiv Detail & Related papers (2021-10-07T15:16:07Z)
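The ST estimator the entry above analyzes is standard enough to sketch: binarize on the forward pass, pass the gradient through unchanged on the backward pass. This is the textbook form, not code from the paper.

```python
# Classic straight-through (ST) estimator: hard threshold forward,
# identity gradient backward.
import torch

class BinarizeST(torch.autograd.Function):
    @staticmethod
    def forward(ctx, logits):
        return (logits > 0).float()   # hard 0/1 binarization (non-differentiable)

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output            # identity: gradient goes "straight through"

logits = torch.randn(4, requires_grad=True)
y = BinarizeST.apply(logits)
y.sum().backward()                    # logits.grad is all ones despite the hard step
```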
- Cluster-Promoting Quantization with Bit-Drop for Minimizing Network Quantization Loss [61.26793005355441]
Cluster-Promoting Quantization (CPQ) finds the optimal quantization grids for neural networks.
DropBits is a new bit-drop technique that revises the standard dropout regularization to randomly drop bits instead of neurons.
We experimentally validate our method on various benchmark datasets and network architectures.
arXiv Detail & Related papers (2021-09-05T15:15:07Z)
- Understanding the Under-Coverage Bias in Uncertainty Estimation [58.03725169462616]
Quantile regression tends to under-cover, i.e., fall short of the desired coverage level, in practice.
We prove that quantile regression suffers from an inherent under-coverage bias.
Our theory reveals that this under-coverage bias stems from a certain high-dimensional parameter estimation error.
arXiv Detail & Related papers (2021-06-10T06:11:55Z)
- Regularization Strategies for Quantile Regression [8.232258589877942]
We show that minimizing an expected pinball loss over a continuous distribution of quantiles is a good regularizer even when only predicting a specific quantile.
We show that lattice models enable regularizing the predicted distribution to a location-scale family.
arXiv Detail & Related papers (2021-02-09T21:10:35Z)
- Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification [15.94100899123465]
A model that predicts the true conditional quantiles for each input, at all quantile levels, presents a correct and efficient representation of the underlying uncertainty.
Current quantile-based methods focus on optimizing the so-called pinball loss.
We develop new quantile methods that address the shortcomings of pinball-loss training; a minimal sketch of the pinball loss itself follows this entry.
arXiv Detail & Related papers (2020-11-18T23:51:23Z)
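Since several entries above revolve around the pinball loss, here is a minimal sketch of it, with the quantile level sampled per example in the spirit of the "expected pinball loss over a continuous distribution of quantiles" idea from the Regularization Strategies entry. Shapes and names are assumptions.

```python
# Pinball (quantile) loss: L_tau(y, q) = max(tau*(y - q), (tau - 1)*(y - q)).
# tau drawn uniformly per example, approximating an expected pinball loss.
import torch

def pinball_loss(y: torch.Tensor, q_hat: torch.Tensor, tau: torch.Tensor) -> torch.Tensor:
    err = y - q_hat
    return torch.maximum(tau * err, (tau - 1.0) * err).mean()

y = torch.randn(64, 1)
q_hat = torch.randn(64, 1)
tau = torch.rand(64, 1)          # a fresh quantile level per example
loss = pinball_loss(y, q_hat, tau)
```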
This list is automatically generated from the titles and abstracts of the papers on this site.