Training-Conditional Coverage Bounds under Covariate Shift
- URL: http://arxiv.org/abs/2405.16594v1
- Date: Sun, 26 May 2024 15:07:16 GMT
- Title: Training-Conditional Coverage Bounds under Covariate Shift
- Authors: Mehrdad Pournaderi, Yu Xiang
- Abstract summary: We study the training-conditional coverage properties of a range of conformal prediction methods.
Results for the split conformal method are almost assumption-free, while the results for the full conformal and jackknife+ methods rely on strong assumptions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Training-conditional coverage guarantees in conformal prediction concern the concentration of the error distribution, conditional on the training data, below some nominal level. The conformal prediction methodology has recently been generalized to the covariate shift setting, namely, the covariate distribution changes between the training and test data. In this paper, we study the training-conditional coverage properties of a range of conformal prediction methods under covariate shift via a weighted version of the Dvoretzky-Kiefer-Wolfowitz (DKW) inequality tailored for distribution change. The result for the split conformal method is almost assumption-free, while the results for the full conformal and jackknife+ methods rely on strong assumptions including the uniform stability of the training algorithm.
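The weighted split-conformal construction that the abstract builds on can be sketched as follows. This is a minimal illustration, not the paper's implementation: the likelihood-ratio weights w(x) = dP_test/dP_train(x) are assumed known, and the function name and interface are hypothetical.

```python
import numpy as np

def weighted_split_conformal(scores, weights, test_weight, alpha=0.1):
    """Weighted split-conformal threshold under covariate shift (sketch).

    scores:      nonconformity scores on a held-out calibration set
    weights:     likelihood ratios w(x_i) = dP_test/dP_train(x_i), assumed known
    test_weight: likelihood ratio w(x) at the test covariate
    Returns the threshold q; the prediction set is {y : s(x, y) <= q}.
    """
    scores = np.asarray(scores, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # Normalize weights jointly with the test point, which acts as a
    # point mass at +infinity in the weighted score distribution.
    p = np.append(weights, test_weight)
    p = p / p.sum()
    # Weighted (1 - alpha)-quantile of the calibration scores.
    order = np.argsort(scores)
    sorted_scores = scores[order]
    cum = np.cumsum(p[:-1][order])
    idx = np.searchsorted(cum, 1 - alpha)
    if idx >= len(sorted_scores):
        return np.inf  # the prediction set is the whole label space
    return sorted_scores[idx]
```

With uniform weights (no covariate shift) this reduces to the usual split-conformal quantile; when the test point carries large weight, the threshold grows and the set widens, reflecting the reduced effective calibration sample size that drives the training-conditional bounds.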
Related papers
- Generalization and Informativeness of Weighted Conformal Risk Control Under Covariate Shift [40.43703709267958]
Weighted conformal risk control (W-CRC) uses data collected during the training phase to convert point predictions into prediction sets with valid risk guarantees at test time.
While W-CRC provides statistical reliability, its efficiency -- measured by the size of the prediction sets -- can only be assessed at test time.
arXiv Detail & Related papers (2025-01-20T11:26:36Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Distribution-free Conformal Prediction for Ordinal Classification [0.0]
Ordinal classification is common in real applications where the target variable has natural ordering among the class labels.
New conformal prediction methods are developed for constructing contiguous and non-contiguous prediction sets.
arXiv Detail & Related papers (2024-04-25T13:49:59Z)
- Training-Conditional Coverage Bounds for Uniformly Stable Learning Algorithms [2.3072402651280517]
We study the training-conditional coverage bounds of full-conformal, jackknife+, and CV+ prediction regions.
We derive coverage bounds for finite-dimensional models by a concentration argument for the (estimated) predictor function.
arXiv Detail & Related papers (2024-04-21T18:18:34Z)
- Conditional validity of heteroskedastic conformal regression [12.905195278168506]
Conformal prediction and split conformal prediction offer a distribution-free approach to estimating prediction intervals with statistical guarantees.
Recent work has shown that split conformal prediction can produce state-of-the-art prediction intervals when focusing on marginal coverage.
This paper sheds new light on how prediction intervals can be constructed, using methods such as normalized and Mondrian conformal prediction.
arXiv Detail & Related papers (2023-09-15T11:10:46Z)
- Conformal Prediction for Federated Uncertainty Quantification Under Label Shift [57.54977668978613]
Federated Learning (FL) is a machine learning framework where many clients collaboratively train models.
We develop a new conformal prediction method based on quantile regression and take into account privacy constraints.
arXiv Detail & Related papers (2023-06-08T11:54:58Z)
- Adapting to Continuous Covariate Shift via Online Density Ratio Estimation [64.8027122329609]
Dealing with distribution shifts is one of the central challenges for modern machine learning.
We propose an online method that can appropriately reuse historical information.
Our density ratio estimation method is shown to perform well, enjoying a dynamic regret bound.
arXiv Detail & Related papers (2023-02-06T04:03:33Z)
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
- A One-step Approach to Covariate Shift Adaptation [82.01909503235385]
A default assumption in many machine learning scenarios is that the training and test samples are drawn from the same probability distribution.
We propose a novel one-step approach that jointly learns the predictive model and the associated weights in one optimization.
arXiv Detail & Related papers (2020-07-08T11:35:47Z)
- Balance-Subsampled Stable Prediction [55.13512328954456]
We propose a novel balance-subsampled stable prediction (BSSP) algorithm based on the theory of fractional factorial design.
A design-theoretic analysis shows that the proposed method can reduce the confounding effects among predictors induced by the distribution shift.
Numerical experiments on both synthetic and real-world data sets demonstrate that our BSSP algorithm significantly outperforms the baseline methods for stable prediction across unknown test data.
arXiv Detail & Related papers (2020-06-08T07:01:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.