Fast Nonlinear Vector Quantile Regression
- URL: http://arxiv.org/abs/2205.14977v3
- Date: Fri, 2 Jun 2023 13:04:32 GMT
- Title: Fast Nonlinear Vector Quantile Regression
- Authors: Aviv A. Rosenberg, Sanketh Vedula, Yaniv Romano, Alex M. Bronstein
- Abstract summary: Quantile regression (QR) is a powerful tool for estimating one or more conditional quantiles of a target variable.
Vector quantile regression (VQR) was proposed as an extension of QR for vector-valued target variables.
- Score: 13.606557840299036
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantile regression (QR) is a powerful tool for estimating one or more
conditional quantiles of a target variable $\mathrm{Y}$ given explanatory
features $\boldsymbol{\mathrm{X}}$. A limitation of QR is that it is only
defined for scalar target variables, due to the formulation of its objective
function, and since the notion of quantiles has no standard definition for
multivariate distributions. Recently, vector quantile regression (VQR) was
proposed as an extension of QR for vector-valued target variables, thanks to a
meaningful generalization of the notion of quantiles to multivariate
distributions via optimal transport. Despite its elegance, VQR is arguably not
applicable in practice due to several limitations: (i) it assumes a linear
model for the quantiles of the target $\boldsymbol{\mathrm{Y}}$ given the
features $\boldsymbol{\mathrm{X}}$; (ii) its exact formulation is intractable
even for modestly-sized problems in terms of target dimensions, number of
regressed quantile levels, or number of features, and its relaxed dual
formulation may violate the monotonicity of the estimated quantiles; (iii) no
fast or scalable solvers for VQR currently exist. In this work we fully address
these limitations, namely: (i) We extend VQR to the non-linear case, showing
substantial improvement over linear VQR; (ii) We propose {vector monotone
rearrangement}, a method which ensures the quantile functions estimated by VQR
are monotone functions; (iii) We provide fast, GPU-accelerated solvers for
linear and nonlinear VQR which maintain a fixed memory footprint, and
demonstrate that they scale to millions of samples and thousands of quantile
levels; (iv) We release an optimized python package of our solvers to promote
the widespread use of VQR in real-world applications.
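As background for the abstract above: scalar QR at level $\tau$ minimizes the asymmetric pinball (check) loss, whose minimizer over constants is the empirical $\tau$-quantile; it is this objective that has no standard multivariate analogue. A minimal NumPy sketch of the loss (illustrative only, not taken from the paper's released package):

```python
import numpy as np

def pinball_loss(y, y_hat, tau):
    """Pinball (quantile) loss at level tau in (0, 1).

    Asymmetric: under-predictions (y > y_hat) are weighted by tau,
    over-predictions by (1 - tau)."""
    diff = y - y_hat
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

# The empirical tau-quantile minimizes this loss over constants.
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)
q90 = np.quantile(y, 0.9)
# Loss at the empirical 0.9-quantile is no larger than at a shifted value.
assert pinball_loss(y, q90, 0.9) <= pinball_loss(y, q90 + 0.5, 0.9)
```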
Related papers
- Refined Risk Bounds for Unbounded Losses via Transductive Priors [58.967816314671296]
We revisit the sequential variants of linear regression with the squared loss, classification problems with hinge loss, and logistic regression.
Our key tools are based on the exponential weights algorithm with carefully chosen transductive priors.
arXiv Detail & Related papers (2024-10-29T00:01:04Z)
- fastkqr: A Fast Algorithm for Kernel Quantile Regression [6.850636409964172]
We introduce fastkqr, which significantly advances the computation of quantile regression in reproducing kernel Hilbert spaces.
The core of fastkqr is a finite smoothing algorithm that magically produces exact regression quantiles, rather than approximations.
In addition, we extend fastkqr to accommodate a flexible kernel quantile regression with a data-driven crossing penalty.
arXiv Detail & Related papers (2024-08-10T00:18:56Z)
- Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and, in particular, do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z)
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile regression based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- HyperVQ: MLR-based Vector Quantization in Hyperbolic Space [56.4245885674567]
We study the use of hyperbolic spaces for vector quantization (HyperVQ).
We show that HyperVQ performs comparably in reconstruction and generative tasks while outperforming VQ in discriminative tasks and learning a highly disentangled latent space.
arXiv Detail & Related papers (2024-03-18T03:17:08Z)
- Vector Quantile Regression on Manifolds [8.328891187733841]
Quantile regression (QR) is a statistical tool for distribution-free estimation of conditional quantiles of a target variable given explanatory features.
By leveraging optimal transport theory and c-concave functions, we meaningfully define conditional vector quantile functions of high-dimensional variables.
We demonstrate the approach's efficacy and provide insights regarding the meaning of non-Euclidean quantiles through synthetic and real data experiments.
arXiv Detail & Related papers (2023-07-03T14:17:12Z)
- Deep Non-Crossing Quantiles through the Partial Derivative [0.6299766708197883]
Quantile Regression provides a way to approximate a single conditional quantile.
Minimisation of the QR-loss function does not guarantee non-crossing quantiles.
We propose a generic deep learning algorithm for predicting an arbitrary number of quantiles.
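The crossing problem mentioned here is the same one that contribution (ii) of the main paper targets. In the scalar case, a standard remedy is monotone rearrangement: sort each sample's predicted quantiles across the quantile levels. A toy NumPy sketch of this scalar version (the paper's vector monotone rearrangement is a more involved, multivariate construction):

```python
import numpy as np

def monotone_rearrange(q_pred):
    """Given predicted quantiles of shape (n_samples, n_levels), where
    columns correspond to increasing quantile levels, sort along the
    level axis so the estimated quantile curves never cross."""
    return np.sort(q_pred, axis=1)

# A crossing prediction: the 0.9-quantile dips below the 0.5-quantile.
q = np.array([[0.1, 0.8, 0.5]])   # levels 0.1, 0.5, 0.9
q_fixed = monotone_rearrange(q)
assert (np.diff(q_fixed, axis=1) >= 0).all()
```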
arXiv Detail & Related papers (2022-01-30T15:35:21Z)
- Understanding the Under-Coverage Bias in Uncertainty Estimation [58.03725169462616]
Quantile regression tends to under-cover the desired coverage level in reality.
We prove that quantile regression suffers from an inherent under-coverage bias.
Our theory reveals that this under-coverage bias stems from a certain high-dimensional parameter estimation error.
arXiv Detail & Related papers (2021-06-10T06:11:55Z)
- Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z)
- Regularization Strategies for Quantile Regression [8.232258589877942]
We show that minimizing an expected pinball loss over a continuous distribution of quantiles is a good regularizer even when only predicting a specific quantile.
We show that lattice models enable regularizing the predicted distribution to a location-scale family.
arXiv Detail & Related papers (2021-02-09T21:10:35Z)
- Multi-target regression via output space quantization [0.3007949058551534]
The proposed method, called MRQ, is based on the idea of quantizing the output space in order to transform the multiple continuous targets into one or more discrete ones.
Experiments on a large collection of benchmark datasets show that MRQ is both highly scalable and also competitive with the state-of-the-art in terms of accuracy.
arXiv Detail & Related papers (2020-03-22T13:57:40Z)
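The output-space quantization idea behind MRQ can be illustrated with a toy uniform-binning scheme: bin each target dimension and combine the bin indices into a single discrete class label. This is a hypothetical sketch (function name and binning rule are illustrative; MRQ's actual quantizer may differ):

```python
import numpy as np

def quantize_outputs(Y, n_bins=4):
    """Toy output-space quantization: uniformly bin each target
    dimension of Y (shape (n_samples, n_targets)), then combine the
    per-dimension bin indices into one mixed-radix discrete code."""
    lo, hi = Y.min(axis=0), Y.max(axis=0)
    scaled = (Y - lo) / (hi - lo + 1e-12) * n_bins
    bins = np.clip(scaled.astype(int), 0, n_bins - 1)
    code = np.zeros(len(Y), dtype=int)
    for d in range(Y.shape[1]):
        # shift previous digits and append this dimension's bin index
        code = code * n_bins + bins[:, d]
    return code

# Two continuous targets become a single discrete label per sample,
# which a standard classifier can then predict.
Y = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
codes = quantize_outputs(Y, n_bins=2)
```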
This list is automatically generated from the titles and abstracts of the papers in this site.