AutoCP: Automated Pipelines for Accurate Prediction Intervals
- URL: http://arxiv.org/abs/2006.14099v2
- Date: Sun, 13 Sep 2020 15:29:34 GMT
- Title: AutoCP: Automated Pipelines for Accurate Prediction Intervals
- Authors: Yao Zhang and William Zame and Mihaela van der Schaar
- Abstract summary: This paper proposes an AutoML framework called Automatic Machine Learning for Conformal Prediction (AutoCP).
Unlike the familiar AutoML frameworks that attempt to select the best prediction model, AutoCP constructs prediction intervals that achieve the user-specified target coverage rate.
We tested AutoCP on a variety of datasets and found that it significantly outperforms benchmark algorithms.
- Score: 84.16181066107984
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Successful application of machine learning models to real-world prediction
problems, e.g. financial forecasting and personalized medicine, has proved to
be challenging, because such settings require limiting and quantifying the
uncertainty in the model predictions, i.e. providing valid and accurate
prediction intervals. Conformal Prediction is a distribution-free approach to
construct valid prediction intervals in finite samples. However, the prediction
intervals constructed by Conformal Prediction are often (because of
over-fitting, inappropriate measures of nonconformity, or other issues) overly
conservative and hence inadequate for the application(s) at hand. This paper
proposes an AutoML framework called Automatic Machine Learning for Conformal
Prediction (AutoCP). Unlike the familiar AutoML frameworks that attempt to
select the best prediction model, AutoCP constructs prediction intervals that
achieve the user-specified target coverage rate while optimizing the interval
length to be accurate and less conservative. We tested AutoCP on a variety of
datasets and found that it significantly outperforms benchmark algorithms.
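AutoCP's automated search over models and nonconformity measures is not reproduced here, but the interval construction it builds on can be illustrated with a minimal split conformal regression sketch. The regressor, the 50/50 split, and the miscoverage level alpha below are illustrative assumptions, not choices from the paper.

```python
# Minimal split conformal regression sketch: fit on one split, calibrate
# absolute residuals on the other, and return intervals targeting 1 - alpha coverage.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

def split_conformal_intervals(X, y, X_test, alpha=0.1, seed=0):
    X_fit, X_cal, y_fit, y_cal = train_test_split(X, y, test_size=0.5, random_state=seed)
    model = RandomForestRegressor(random_state=seed).fit(X_fit, y_fit)

    # Nonconformity scores: absolute residuals on the calibration split.
    scores = np.abs(y_cal - model.predict(X_cal))

    # Finite-sample corrected quantile that targets >= 1 - alpha marginal coverage.
    n = len(scores)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, q_level, method="higher")

    preds = model.predict(X_test)
    return preds - q, preds + q  # symmetric prediction interval per test point
```

In this baseline the interval width is a single residual quantile; per the abstract, AutoCP instead searches over the underlying model and nonconformity measure so that the target coverage is met with shorter, less conservative intervals.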
Related papers
- Multi-model Ensemble Conformal Prediction in Dynamic Environments [14.188004615463742]
We introduce a novel adaptive conformal prediction framework, where the model used for creating prediction sets is selected on the fly from multiple candidate models.
The proposed algorithm is proven to achieve strongly adaptive regret over all intervals while maintaining valid coverage.
arXiv Detail & Related papers (2024-11-06T05:57:28Z)
- Normalizing Flows for Conformal Regression [0.0]
Conformal Prediction (CP) algorithms estimate the uncertainty of a prediction model by calibrating its outputs on labeled data.
We present a general scheme to localize the intervals by training the calibration process.
Unlike the Error Reweighting CP algorithm of Papadopoulos et al. (2008), the framework allows estimating the gap between nominal and empirical conditional validity.
arXiv Detail & Related papers (2024-06-05T15:04:28Z)
- Conformal online model aggregation [29.43493007296859]
This paper proposes a new approach towards conformal model aggregation in online settings.
It is based on combining the prediction sets from several algorithms by voting, where weights on the models are adapted over time based on past performance; a rough voting sketch follows this entry.
arXiv Detail & Related papers (2024-03-22T15:40:06Z)
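As referenced above, the aggregation-by-voting idea can be sketched roughly as follows: each candidate model contributes a prediction set, a label is kept when the weighted vote for it clears a threshold, and weights are updated multiplicatively from observed performance. The `prediction_set` interface, the loss, and the constants are assumptions for illustration, not the paper's exact scheme.

```python
# Rough sketch of combining conformal prediction sets by weighted voting.
import numpy as np

def aggregate_sets(models, weights, x, threshold=0.5):
    sets = [m.prediction_set(x) for m in models]  # one set of candidate labels per model
    labels = set().union(*sets)
    total = float(weights.sum())
    # Keep a label when the weighted share of models voting for it is large enough.
    return {y for y in labels
            if sum(w for s, w in zip(sets, weights) if y in s) / total >= threshold}

def update_weights(models, weights, x, y_true, eta=0.1, size_penalty=0.01):
    # Multiplicative update: penalize miscoverage and overly large sets.
    sets = [m.prediction_set(x) for m in models]
    losses = np.array([(0.0 if y_true in s else 1.0) + size_penalty * len(s) for s in sets])
    new_w = weights * np.exp(-eta * losses)
    return new_w / new_w.sum()
```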
- Conformal Prediction for Deep Classifier via Label Ranking [29.784336674173616]
Conformal prediction is a statistical framework that generates prediction sets with a desired coverage guarantee.
We propose a novel algorithm named Sorted Adaptive Prediction Sets (SAPS).
SAPS discards all the probability values except for the maximum softmax probability; a simplified scoring sketch follows this entry.
arXiv Detail & Related papers (2023-10-10T08:54:14Z)
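To make the SAPS summary above concrete, here is a simplified label-ranking score in the same spirit: only the maximum softmax probability is retained, and lower-ranked labels pay a rank-based penalty. The penalty weight `lam`, the randomization, and the exact functional form are illustrative and may differ from the paper's definition.

```python
# Simplified label-ranking nonconformity score: depends only on the maximum
# softmax probability and the candidate label's rank (illustrative, not exact SAPS).
import numpy as np

_rng = np.random.default_rng(0)

def label_ranking_score(probs, label, lam=0.1):
    p_max = probs.max()
    rank = np.argsort(-probs).tolist().index(label) + 1  # 1 = top-ranked label
    u = _rng.uniform()  # randomization term for tighter coverage
    return u * p_max if rank == 1 else p_max + lam * (rank - 2 + u)

def prediction_set(probs, threshold, lam=0.1):
    # Include every label whose score does not exceed the threshold calibrated
    # on held-out data at the desired coverage level.
    return [y for y in range(len(probs)) if label_ranking_score(probs, y, lam) <= threshold]
```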
- Uncertainty Quantification over Graph with Conformalized Graph Neural Networks [52.20904874696597]
Graph Neural Networks (GNNs) are powerful machine learning prediction models on graph-structured data.
GNNs lack rigorous uncertainty estimates, limiting their reliable deployment in settings where the cost of errors is significant.
We propose conformalized GNN (CF-GNN), extending conformal prediction (CP) to graph-based models for guaranteed uncertainty estimates.
arXiv Detail & Related papers (2023-05-23T21:38:23Z)
- Conformal Prediction Regions for Time Series using Linear Complementarity Programming [25.094249285804224]
We propose an optimization-based method for reducing conservatism to enable long horizon planning and verification.
We show that this problem can be cast as a mixed integer linear complementarity program (MILCP), which we then relax into a linear complementarity program (LCP).
arXiv Detail & Related papers (2023-04-03T15:32:38Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores; a rough normalization sketch follows this entry.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
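As referenced above, one way an auxiliary error signal can enter conformal prediction is through a normalized nonconformity score: the main model's residual is divided by a learned difficulty estimate that uses the self-supervised error as an extra feature. The `aux_error` callable and the difficulty regressor below are hypothetical stand-ins, not the paper's exact construction.

```python
# Rough sketch of a normalized nonconformity score that uses an auxiliary
# (e.g. self-supervised) error signal as an additional difficulty feature.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def fit_difficulty(model, aux_error, X_train, y_train):
    # Learn to predict the size of the main model's residual from the inputs
    # augmented with the auxiliary error signal.
    residuals = np.abs(y_train - model.predict(X_train))
    feats = np.column_stack([X_train, aux_error(X_train)])
    return GradientBoostingRegressor().fit(feats, residuals)

def normalized_score(model, difficulty, aux_error, x, y, eps=1e-6):
    x = np.atleast_2d(x)
    sigma = max(float(difficulty.predict(np.column_stack([x, aux_error(x)]))[0]), eps)
    return abs(float(y) - float(model.predict(x)[0])) / sigma  # larger sigma => wider interval
```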
- Uncertainty estimation of pedestrian future trajectory using Bayesian approximation [137.00426219455116]
Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to quantify uncertainty during forecasting using Bayesian approximation, which deterministic approaches fail to capture.
The effect of dropout weights and long-term prediction on future state uncertainty has been studied; a generic Monte Carlo dropout sketch follows this entry.
arXiv Detail & Related papers (2022-05-04T04:23:38Z)
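The Bayesian approximation referenced above is commonly implemented with Monte Carlo dropout; the generic sketch below (not the authors' trajectory model) keeps dropout active at inference time and summarizes repeated stochastic forward passes into a predictive mean and spread.

```python
# Generic Monte Carlo dropout sketch: run the network several times with dropout
# enabled and use the spread of the outputs as an uncertainty estimate.
import torch

def mc_dropout_predict(model, x, n_samples=50):
    model.train()  # keep dropout layers stochastic at inference time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    model.eval()
    return samples.mean(dim=0), samples.std(dim=0)  # predictive mean and uncertainty
```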
- Efficient and Differentiable Conformal Prediction with General Function Classes [96.74055810115456]
We propose a generalization of conformal prediction to multiple learnable parameters.
We show that it achieves approximately valid population coverage and near-optimal efficiency within the function class.
Experiments show that our algorithm is able to learn valid prediction sets and improve the efficiency significantly.
arXiv Detail & Related papers (2022-02-22T18:37:23Z)
- Private Prediction Sets [72.75711776601973]
Machine learning systems need reliable uncertainty quantification and protection of individuals' privacy.
We present a framework that treats these two desiderata jointly.
We evaluate the method on large-scale computer vision datasets.
arXiv Detail & Related papers (2021-02-11T18:59:11Z)