Regression with reject option and application to kNN
- URL: http://arxiv.org/abs/2006.16597v2
- Date: Fri, 5 Mar 2021 10:06:36 GMT
- Title: Regression with reject option and application to kNN
- Authors: Christophe Denis (LAMA), Mohamed Hebiri (LAMA), Ahmed Zaoui (LAMA)
- Abstract summary: We refer to this framework as regression with reject option as an extension of classification with reject option.
We provide a semi-supervised estimation procedure of the optimal rule involving two datasets.
The resulting predictor with reject option is shown to be almost as good as the optimal predictor with reject option both in terms of risk and rejection rate.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We investigate the problem of regression where one is allowed to abstain from
predicting. We refer to this framework as regression with reject option as an
extension of classification with reject option. In this context, we focus on
the case where the rejection rate is fixed and derive the optimal rule which
relies on thresholding the conditional variance function. We provide a
semi-supervised estimation procedure of the optimal rule involving two
datasets: a first labeled dataset is used to estimate both regression function
and conditional variance function while a second unlabeled dataset is exploited
to calibrate the desired rejection rate. The resulting predictor with reject
option is shown to be almost as good as the optimal predictor with reject
option both in terms of risk and rejection rate. We additionally apply our
methodology with the kNN algorithm and establish rates of convergence for the
resulting kNN predictor under mild conditions. Finally, a numerical study is
performed to illustrate the benefit of using the proposed procedure.
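The semi-supervised procedure described in the abstract can be sketched in a few lines: estimate the regression and conditional variance functions by kNN on a labeled set, then calibrate the variance threshold on an unlabeled set so that the desired fraction of points is rejected. This is an illustrative toy version only, not the authors' exact estimator; the 1-D data, k = 25, and the 20% rejection rate are assumptions for the example.

```python
import numpy as np

def knn_mean_var(x_query, X, y, k):
    """kNN estimates of the regression function and conditional variance at x_query."""
    idx = np.argsort(np.abs(X - x_query))[:k]   # k nearest neighbours (1-D features)
    return y[idx].mean(), y[idx].var()

def calibrate_threshold(X_unlab, X, y, k, reject_rate):
    """Pick the variance threshold as an empirical quantile on the unlabeled set,
    so that a fraction `reject_rate` of unlabeled points gets rejected."""
    vars_ = np.array([knn_mean_var(x, X, y, k)[1] for x in X_unlab])
    return np.quantile(vars_, 1.0 - reject_rate)

def predict_with_reject(x, X, y, k, tau):
    m, v = knn_mean_var(x, X, y, k)
    return m if v <= tau else None              # None = abstain

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 500)
y = np.sin(3 * X) + rng.normal(0, 0.1 + 0.4 * (X > 0), 500)  # heteroscedastic noise
X_unlab = rng.uniform(-1, 1, 500)

tau = calibrate_threshold(X_unlab, X, y, k=25, reject_rate=0.2)
preds = [predict_with_reject(x, X, y, 25, tau) for x in X_unlab]
print(sum(p is None for p in preds) / len(preds))   # rejection rate ≈ 0.20
```

By construction the rejected points concentrate where the conditional variance is large (here, x > 0), which is exactly the behaviour the optimal thresholding rule prescribes.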
Related papers
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining prediction intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile-regression-based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
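As background for this entry, classical quantile-based intervals come from minimizing the check (pinball) loss at fixed levels; RQR relaxes that fixing. A minimal sketch of the classical construction (synthetic standard-normal data is an assumption for the example):

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Check (pinball) loss: over constants q, it is minimized by the tau-quantile of y."""
    diff = y - q
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

rng = np.random.default_rng(1)
y = rng.normal(0.0, 1.0, 10_000)

# A classical 90% prediction interval from the 5% and 95% empirical quantiles.
lo, hi = np.quantile(y, [0.05, 0.95])
print(lo, hi)   # roughly -1.64 and 1.64 for standard normal data
```

The empirical tau-quantile achieves a lower pinball loss than any shifted constant, which is why quantile regression recovers these interval endpoints.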
arXiv Detail & Related papers (2024-06-05T13:36:38Z) - Rejection via Learning Density Ratios [50.91522897152437]
Classification with rejection emerges as a learning paradigm which allows models to abstain from making predictions.
We propose a different distributional perspective, where we seek to find an idealized data distribution which maximizes a pretrained model's performance.
Our framework is tested empirically over clean and noisy datasets.
arXiv Detail & Related papers (2024-05-29T01:32:17Z) - Regression with Cost-based Rejection [30.43900105405108]
We investigate a novel regression problem where the model can reject to make predictions on some examples given certain rejection costs.
We derive the Bayes optimal solution, which shows that the optimal model should reject to make predictions on the examples whose variance is larger than the rejection cost.
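The Bayes-optimal rule stated in this summary (abstain exactly when the conditional variance exceeds the rejection cost) is simple enough to write down directly; the mean/variance inputs here are assumed to come from some upstream estimator:

```python
def reject_if_costly(mean, var, cost):
    """Bayes-optimal rule under squared loss with a fixed rejection cost:
    abstain exactly when the conditional variance exceeds the cost."""
    return None if var > cost else mean

print(reject_if_costly(1.2, 0.5, cost=0.3))  # None: variance exceeds the cost, abstain
print(reject_if_costly(1.2, 0.1, cost=0.3))  # 1.2: confident enough to predict
```

Note the contrast with the fixed-rejection-rate setting above: here the threshold is the cost itself, so no calibration set is needed.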
arXiv Detail & Related papers (2023-11-08T09:33:21Z) - Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines allowing reliable black-box posterior inference.
arXiv Detail & Related papers (2023-10-20T10:20:45Z) - Learning When to Say "I Don't Know" [0.5505634045241288]
We propose a new Reject Option Classification technique to identify and remove regions of uncertainty in the decision space.
We consider an alternative formulation by instead analyzing the complementary reject region and employing a validation set to learn per-class softmax thresholds.
We provide results showing the benefits of the proposed method over naïvely thresholding/uncalibrated softmax scores with 2-D points, imagery, and text classification datasets.
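The per-class softmax thresholds mentioned here can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation; the helper names, the toy probabilities, and the 90% per-class accept rate are assumptions for the example.

```python
import numpy as np

def fit_class_thresholds(probs, labels, target_accept=0.9):
    """Learn one softmax threshold per class on a validation set: for each class,
    accept the top `target_accept` fraction of its correctly predicted scores."""
    n_classes = probs.shape[1]
    thresholds = np.ones(n_classes)          # default: reject everything for unseen classes
    preds = probs.argmax(axis=1)
    for c in range(n_classes):
        scores = probs[(preds == c) & (labels == c)].max(axis=1)
        if len(scores):
            thresholds[c] = np.quantile(scores, 1 - target_accept)
    return thresholds

def predict_or_reject(probs, thresholds):
    preds = probs.argmax(axis=1)
    conf = probs.max(axis=1)
    return np.where(conf >= thresholds[preds], preds, -1)   # -1 = reject

probs = np.array([[0.90, 0.10],
                  [0.55, 0.45],
                  [0.20, 0.80],
                  [0.51, 0.49]])
labels = np.array([0, 0, 1, 1])
th = fit_class_thresholds(probs, labels, target_accept=0.9)
print(predict_or_reject(probs, th))   # -1 marks rejected (low-confidence) inputs
```

Per-class thresholds matter when classes have different confidence profiles: a single global cutoff would over-reject the inherently harder classes.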
arXiv Detail & Related papers (2022-09-11T21:50:03Z) - Near-optimal inference in adaptive linear regression [60.08422051718195]
Even simple methods like least squares can exhibit non-normal behavior when data is collected in an adaptive manner.
We propose a family of online debiasing estimators to correct these distributional anomalies in least squares estimation.
We demonstrate the usefulness of our theory via applications to multi-armed bandit, autoregressive time series estimation, and active learning with exploration.
arXiv Detail & Related papers (2021-07-05T21:05:11Z) - Universal Off-Policy Evaluation [64.02853483874334]
We take the first steps towards a universal off-policy estimator (UnO).
We use UnO for estimating and simultaneously bounding the mean, variance, quantiles/median, inter-quantile range, CVaR, and the entire cumulative distribution of returns.
arXiv Detail & Related papers (2021-04-26T18:54:31Z) - Learning Probabilistic Ordinal Embeddings for Uncertainty-Aware Regression [91.3373131262391]
Uncertainty is the only certainty there is.
Traditionally, the direct regression formulation is considered and the uncertainty is modeled by modifying the output space to a certain family of probabilistic distributions.
How to model the uncertainty within the present-day technologies for regression remains an open issue.
arXiv Detail & Related papers (2021-03-25T06:56:09Z) - Estimation and Applications of Quantiles in Deep Binary Classification [0.0]
Quantile regression, based on check loss, is a widely used inferential paradigm in Statistics.
We consider the analogue of check loss in the binary classification setting.
We develop individualized confidence scores that can be used to decide whether a prediction is reliable.
arXiv Detail & Related papers (2021-02-09T07:07:42Z) - Ridge Regression Revisited: Debiasing, Thresholding and Bootstrap [4.142720557665472]
Ridge regression may be worth another look since, after debiasing and thresholding, it may offer some advantages over the Lasso.
In this paper, we define a debiased and thresholded ridge regression method, and prove a consistency result and a Gaussian approximation theorem.
In addition to estimation, we consider the problem of prediction, and present a novel, hybrid bootstrap algorithm tailored for prediction intervals.
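The debias-then-threshold idea in this entry can be sketched on synthetic data. This is a first-order sketch of the idea only, not the paper's exact estimator; the correction term, the data-generating process, and the tuning values are assumptions for the example.

```python
import numpy as np

def debiased_thresholded_ridge(X, y, lam, thresh):
    """Ridge fit, a first-order debiasing correction for ridge shrinkage,
    then hard-thresholding of small coefficients."""
    n, p = X.shape
    A = X.T @ X / n + lam * np.eye(p)
    beta_ridge = np.linalg.solve(A, X.T @ y / n)
    beta_deb = beta_ridge + lam * np.linalg.solve(A, beta_ridge)  # approximately undo shrinkage
    return np.where(np.abs(beta_deb) > thresh, beta_deb, 0.0)

rng = np.random.default_rng(2)
n, p = 200, 10
beta = np.zeros(p)
beta[0], beta[5] = 2.0, -1.5                      # sparse ground truth
X = rng.normal(size=(n, p))
y = X @ beta + rng.normal(0, 0.1, n)

beta_hat = debiased_thresholded_ridge(X, y, lam=0.1, thresh=0.5)
print(np.nonzero(beta_hat)[0])   # recovers the support {0, 5}
```

Thresholding after debiasing is what yields sparsity here; plain ridge never sets coefficients exactly to zero, which is the usual argument for the Lasso that this line of work revisits.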
arXiv Detail & Related papers (2020-09-17T05:04:10Z) - On Low-rank Trace Regression under General Sampling Distribution [9.699586426043885]
We show that cross-validated estimators satisfy near-optimal error bounds under general assumptions.
We also show that the cross-validated estimator outperforms the theory-inspired approach of selecting the parameter.
arXiv Detail & Related papers (2019-04-18T02:56:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.