Training-Conditional Coverage Bounds for Uniformly Stable Learning Algorithms
- URL: http://arxiv.org/abs/2404.13731v1
- Date: Sun, 21 Apr 2024 18:18:34 GMT
- Title: Training-Conditional Coverage Bounds for Uniformly Stable Learning Algorithms
- Authors: Mehrdad Pournaderi, Yu Xiang
- Abstract summary: We study the training-conditional coverage bounds of full-conformal, jackknife+, and CV+ prediction regions.
We derive coverage bounds for finite-dimensional models by a concentration argument for the (estimated) predictor function.
- Score: 2.3072402651280517
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The training-conditional coverage performance of conformal prediction is known to be empirically sound. Recently, there have been efforts to support this observation with theoretical guarantees. The training-conditional coverage bounds for jackknife+ and full-conformal prediction regions have been established via the notion of $(m,n)$-stability by Liang and Barber [2023]. Although this notion is weaker than uniform stability, it is not clear how to evaluate it for practical models. In this paper, we study the training-conditional coverage bounds of full-conformal, jackknife+, and CV+ prediction regions from the perspective of uniform stability, which is known to hold for empirical risk minimization over reproducing kernel Hilbert spaces with convex regularization. We derive coverage bounds for finite-dimensional models by a concentration argument for the (estimated) predictor function, and compare the bounds with existing ones under ridge regression.
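As a concrete, hypothetical illustration of the setting the abstract compares against (not the paper's derivation), the sketch below builds jackknife+ prediction intervals on top of ridge regression and empirically checks training-conditional coverage, i.e., the fraction of fresh test points covered given one fixed training set. The synthetic data, the ridge penalty, and the miscoverage level are illustrative assumptions, and scikit-learn's `Ridge` stands in for regularized empirical risk minimization.

```python
# Minimal sketch (assumed setup, not the paper's analysis): jackknife+ intervals
# with a ridge regression base learner, plus an empirical estimate of the
# training-conditional coverage on a held-out test set.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n, d, n_test = 200, 5, 1000            # training size, dimension, test size (assumed)
ridge_penalty, miscoverage = 1.0, 0.1  # regularization strength and nominal alpha (assumed)

beta = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ beta + rng.normal(scale=0.5, size=n)
X_test = rng.normal(size=(n_test, d))
y_test = X_test @ beta + rng.normal(scale=0.5, size=n_test)

# Leave-one-out residuals R_i and leave-one-out predictions at the test points.
loo_resid = np.empty(n)
loo_pred = np.empty((n, n_test))
for i in range(n):
    keep = np.arange(n) != i
    model_i = Ridge(alpha=ridge_penalty).fit(X[keep], y[keep])
    loo_resid[i] = abs(y[i] - model_i.predict(X[i:i + 1])[0])
    loo_pred[i] = model_i.predict(X_test)

# Jackknife+ interval: order statistics of {mu_{-i}(x) - R_i} and {mu_{-i}(x) + R_i}.
k = int(np.ceil((1 - miscoverage) * (n + 1)))
lower = np.sort(loo_pred - loo_resid[:, None], axis=0)[n - k]
upper = np.sort(loo_pred + loo_resid[:, None], axis=0)[k - 1]

# Training-conditional coverage: coverage over fresh test points, conditional
# on this particular training sample (the quantity the paper's bounds control).
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical training-conditional coverage: {coverage:.3f}")
```

The bounds studied in the paper control, with high probability over the draw of the training set, how far this conditional coverage can fall below the nominal $1-\alpha$ level.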
Related papers
- Epistemic Uncertainty in Conformal Scores: A Unified Approach [2.449909275410288]
Conformal prediction methods create prediction bands with distribution-free guarantees but do not explicitly capture epistemic uncertainty.
We introduce $\texttt{EPICSCORE}$, a model-agnostic approach that enhances any conformal score by explicitly integrating uncertainty.
$\texttt{EPICSCORE}$ adaptively expands predictive intervals in regions with limited data while maintaining compact intervals where data is abundant.
arXiv Detail & Related papers (2025-02-10T19:42:54Z)
- Conformal Prediction Sets with Improved Conditional Coverage using Trust Scores [52.92618442300405]
It is impossible to achieve exact, distribution-free conditional coverage in finite samples.
We propose an alternative conformal prediction algorithm that targets coverage where it matters most.
arXiv Detail & Related papers (2025-01-17T12:01:56Z)
- Statistical Inference for Temporal Difference Learning with Linear Function Approximation [62.69448336714418]
We study the consistency properties of TD learning with Polyak-Ruppert averaging and linear function approximation.
First, we derive a novel high-dimensional probability convergence guarantee that depends explicitly on the variance and holds under weak conditions.
We further establish refined high-dimensional Berry-Esseen bounds over the class of convex sets that guarantee faster rates than those in the literature.
arXiv Detail & Related papers (2024-10-21T15:34:44Z)
- Adjusting Regression Models for Conditional Uncertainty Calibration [46.69079637538012]
We propose a novel algorithm to train a regression function to improve the conditional coverage after applying the split conformal prediction procedure.
We establish an upper bound for the miscoverage gap between the conditional coverage and the nominal coverage rate and propose an end-to-end algorithm to control this upper bound.
arXiv Detail & Related papers (2024-09-26T01:55:45Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Training-Conditional Coverage Bounds under Covariate Shift [2.3072402651280517]
We study the training-conditional coverage properties of a range of conformal prediction methods.
Results for the split conformal method are almost assumption-free, while the results for the full conformal and jackknife+ methods rely on strong assumptions.
arXiv Detail & Related papers (2024-05-26T15:07:16Z)
- Towards Continual Learning Desiderata via HSIC-Bottleneck Orthogonalization and Equiangular Embedding [55.107555305760954]
We propose a conceptually simple yet effective method that attributes forgetting to layer-wise parameter overwriting and the resulting decision boundary distortion.
Our method achieves competitive accuracy while using no exemplar buffer and only 1.02x the size of the base model.
arXiv Detail & Related papers (2024-01-17T09:01:29Z)
- Calibrating Neural Simulation-Based Inference with Differentiable Coverage Probability [50.44439018155837]
We propose to include a calibration term directly into the training objective of the neural model.
By introducing a relaxation of the classical formulation of calibration error we enable end-to-end backpropagation.
It is directly applicable to existing computational pipelines, allowing reliable black-box posterior inference.
arXiv Detail & Related papers (2023-10-20T10:20:45Z)
- Exploring the Training Robustness of Distributional Reinforcement Learning against Noisy State Observations [7.776010676090131]
State observations may contain measurement errors or adversarial noise, misleading the agent into taking suboptimal actions or even collapsing during training.
In this paper, we study the training robustness of distributional reinforcement learning (RL), a class of state-of-the-art methods that estimate the whole distribution, rather than only the expectation, of the total return.
arXiv Detail & Related papers (2021-09-17T22:37:39Z)
- Conformalized Survival Analysis [6.92027612631023]
Existing survival analysis techniques heavily rely on strong modelling assumptions.
We develop an inferential method based on ideas from conformal prediction.
The validity and efficiency of our procedure are demonstrated on synthetic data and real COVID-19 data from the UK Biobank.
arXiv Detail & Related papers (2021-03-17T16:32:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.