Loss-Controlling Calibration for Predictive Models
- URL: http://arxiv.org/abs/2301.04378v3
- Date: Tue, 23 Jan 2024 01:56:09 GMT
- Title: Loss-Controlling Calibration for Predictive Models
- Authors: Di Wang, Junzhi Shi, Pingping Wang, Shuo Zhuang, Hongyue Li
- Abstract summary: We propose a learning framework for calibrating predictive models to make loss-controlling predictions for exchangeable data.
Compared with conformal loss-controlling prediction, the predictors built by the proposed approach are not limited to set predictors.
Our proposed method is applied to selective regression and high-impact weather forecasting problems.
- Score: 5.51361762392299
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a learning framework for calibrating predictive models to make
loss-controlling predictions for exchangeable data, which extends our recently
proposed conformal loss-controlling prediction to more general cases. By
comparison, the predictors built by the proposed loss-controlling approach are
not limited to set predictors, and the loss function can be any measurable
function without the monotone assumption. To control the loss values in an
efficient way, we introduce transformations preserving exchangeability to prove
a finite-sample controlling guarantee when the test label is obtained, and then
develop an approximation approach to construct predictors. The transformations
can be built on any predefined function, which includes using optimization
algorithms for parameter searching. This approach is a natural extension of
conformal loss-controlling prediction, since it reduces to the latter when the
set predictors have the nesting property and the loss functions are monotone.
Our proposed method is applied to selective regression and high-impact weather
forecasting problems, which demonstrates its effectiveness for general
loss-controlling prediction.
Related papers
- Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering [55.15192437680943]
Generative models lack rigorous statistical guarantees for their outputs.
We propose a sequential conformal prediction method producing prediction sets that satisfy a rigorous statistical guarantee.
This guarantee states that with high probability, the prediction sets contain at least one admissible (or valid) example.
arXiv Detail & Related papers (2024-10-02T15:26:52Z)
- Calibrated Probabilistic Forecasts for Arbitrary Sequences [58.54729945445505]
Real-world data streams can change unpredictably due to distribution shifts, feedback loops and adversarial actors.
We present a forecasting framework ensuring valid uncertainty estimates regardless of how data evolves.
arXiv Detail & Related papers (2024-09-27T21:46:42Z)
- Conformal Prediction with Missing Values [19.18178194789968]
We first show that the marginal coverage guarantee of conformal prediction holds on imputed data for any missingness distribution.
We then show that a universally consistent quantile regression algorithm trained on the imputed data is Bayes optimal for the pinball risk.
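For reference, the pinball risk mentioned above is the expected pinball (quantile) loss from quantile regression, whose minimizer over constant predictions is the q-quantile of the target. A minimal definition (standard material, not code from that paper):

```python
def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss at level q in (0, 1): penalizes
    under-prediction with weight q and over-prediction with weight 1 - q."""
    diff = y_true - y_pred
    return q * max(diff, 0.0) + (1.0 - q) * max(-diff, 0.0)
```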
arXiv Detail & Related papers (2023-06-05T09:28:03Z)
- Variational Inference with Coverage Guarantees in Simulation-Based Inference [18.818573945984873]
We propose Conformalized Amortized Neural Variational Inference (CANVI).
CANVI constructs conformalized predictors based on each candidate, compares the predictors using a metric known as predictive efficiency, and returns the most efficient predictor.
We prove lower bounds on the predictive efficiency of the regions produced by CANVI and explore how the quality of a posterior approximation relates to the predictive efficiency of prediction regions based on that approximation.
arXiv Detail & Related papers (2023-05-23T17:24:04Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
- Conformal Loss-Controlling Prediction [23.218535051437588]
Conformal prediction is a learning framework that controls the coverage of prediction sets.
This work proposes a learning framework named conformal loss-controlling prediction, which extends conformal prediction to the situation where the value of a loss function needs to be controlled.
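The coverage control this entry refers to is the classical split-conformal construction, which the loss-controlling framework generalizes. A minimal sketch for regression with absolute-residual scores (a textbook construction under exchangeability, not code from either paper):

```python
import numpy as np

def split_conformal_interval(cal_residuals, y_hat, alpha):
    """Split-conformal interval around a point prediction y_hat.

    cal_residuals: |y - model(x)| on a held-out calibration set.
    Returns an interval with >= 1 - alpha marginal coverage under
    exchangeability of calibration and test examples.
    """
    n = len(cal_residuals)
    k = int(np.ceil((n + 1) * (1 - alpha)))
    if k > n:
        # too few calibration points for this alpha: only the trivial
        # interval carries the guarantee
        return (-np.inf, np.inf)
    qhat = np.sort(cal_residuals)[k - 1]
    return (y_hat - qhat, y_hat + qhat)
```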
arXiv Detail & Related papers (2023-01-06T08:58:49Z)
- Efficient and Differentiable Conformal Prediction with General Function Classes [96.74055810115456]
We propose a generalization of conformal prediction to multiple learnable parameters.
We show that it achieves approximate valid population coverage and near-optimal efficiency within class.
Experiments show that our algorithm is able to learn valid prediction sets and improve the efficiency significantly.
arXiv Detail & Related papers (2022-02-22T18:37:23Z)
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
- Private Prediction Sets [72.75711776601973]
Machine learning systems need reliable uncertainty quantification and protection of individuals' privacy.
We present a framework that treats these two desiderata jointly.
We evaluate the method on large-scale computer vision datasets.
arXiv Detail & Related papers (2021-02-11T18:59:11Z)
- Short-term prediction of Time Series based on bounding techniques [0.0]
This paper reconsiders the prediction problem in the time series framework using a new non-parametric approach.
The innovation is to consider both deterministic and deterministic-stochastic assumptions in order to obtain the upper bound of the prediction error.
A benchmark is included to illustrate that the proposed predictor can obtain suitable results in a prediction scheme, and can be an interesting alternative method to the classical non-parametric methods.
arXiv Detail & Related papers (2021-01-26T11:27:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.