Universality of conformal prediction under the assumption of randomness
- URL: http://arxiv.org/abs/2502.19254v2
- Date: Sun, 08 Jun 2025 20:05:17 GMT
- Title: Universality of conformal prediction under the assumption of randomness
- Authors: Vladimir Vovk
- Abstract summary: Conformal predictors provide set or functional predictions that are valid under the assumption of randomness. The question is whether there are predictors that are valid in the same sense under the assumption of randomness and that are more efficient than conformal predictors. The answer is that the class of conformal predictors is universal in that only limited gains in predictive efficiency are possible.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conformal predictors provide set or functional predictions that are valid under the assumption of randomness, i.e., under the assumption of independent and identically distributed data. The question asked in this paper is whether there are predictors that are valid in the same sense under the assumption of randomness and that are more efficient than conformal predictors. The answer is that the class of conformal predictors is universal in that only limited gains in predictive efficiency are possible. The previous work in this area has relied on the algorithmic theory of randomness and so involved unspecified constants, whereas this paper's results are much more practical. They are also shown to be optimal in some respects.
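As a concrete illustration of the validity guarantee discussed in the abstract, the sketch below implements split conformal prediction for regression in Python. It is a minimal example of the general class of conformal predictors, not the constructions analysed in the paper; the fitted point predictor, the absolute-residual nonconformity score, and all names are assumptions.
```python
import numpy as np

def split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1):
    """Split conformal regression intervals (illustrative sketch).

    Under the randomness (i.i.d.) assumption, each returned interval covers
    the true label with probability at least 1 - alpha, marginally.
    """
    # Absolute-residual nonconformity scores on a held-out calibration set.
    scores = np.abs(np.asarray(y_cal) - model.predict(X_cal))
    n = len(scores)
    # Finite-sample corrected quantile level, ceil((n+1)(1-alpha))/n, capped at 1.
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, method="higher")  # assumes NumPy >= 1.22
    preds = np.asarray(model.predict(X_test))
    return preds - q, preds + q
```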
Related papers
- Optimal Conformal Prediction under Epistemic Uncertainty [61.46247583794497]
Conformal prediction (CP) is a popular framework for representing uncertainty. We introduce Bernoulli prediction sets (BPS), which produce the smallest prediction sets that ensure conditional coverage. When given first-order predictions, BPS reduces to the well-known adaptive prediction sets (APS).
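For reference, a minimal sketch of the APS-style construction mentioned above, assuming softmax class probabilities are available; the threshold `tau` and the function name are hypothetical, and the conformal calibration of `tau` and the randomisation step used in full APS are omitted.
```python
import numpy as np

def aps_style_set(probs, tau):
    """Greedy APS-style prediction set from class probabilities (sketch)."""
    order = np.argsort(probs)[::-1]          # classes in decreasing probability
    cum = np.cumsum(probs[order])
    k = int(np.searchsorted(cum, tau)) + 1   # smallest prefix with mass >= tau
    return set(order[:k].tolist())
```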
arXiv Detail & Related papers (2025-05-25T08:32:44Z) - Inductive randomness predictors: beyond conformal [0.0]
This paper introduces inductive randomness predictors, which form a proper superset of inductive conformal predictors. It turns out that every non-trivial inductive conformal predictor is strictly dominated by an inductive randomness predictor.
arXiv Detail & Related papers (2025-03-04T17:26:25Z) - Conformal Prediction Sets with Improved Conditional Coverage using Trust Scores [52.92618442300405]
It is impossible to achieve exact, distribution-free conditional coverage in finite samples.
We propose an alternative conformal prediction algorithm that targets coverage where it matters most.
arXiv Detail & Related papers (2025-01-17T12:01:56Z) - Efficient pooling of predictions via kernel embeddings [0.24578723416255752]
Probabilistic predictions are probability distributions over the set of possible outcomes.
They are typically combined by linearly pooling the individual predictive distributions.
Weights assigned to each prediction can be estimated based on their past performance.
This can be achieved by finding the weights that optimise a proper scoring rule over some training data.
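To make the recipe above concrete, here is a hedged sketch that fits linear-pool weights by minimising the log score (a proper scoring rule) over Gaussian predictive distributions; the Gaussian parameterisation and all names are assumptions, and this is not the kernel-embedding approach proposed in the paper.
```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_pool_weights(means, stds, y):
    """Fit linear-pool weights by minimising the mean negative log score.

    means, stds: arrays of shape (n_obs, n_forecasters) with Gaussian
    predictive parameters; y: observed outcomes of shape (n_obs,).
    """
    n_forecasters = means.shape[1]

    def neg_log_score(theta):
        w = np.exp(theta) / np.exp(theta).sum()             # softmax keeps weights on the simplex
        dens = norm.pdf(y[:, None], loc=means, scale=stds)  # component predictive densities
        pooled = dens @ w                                   # linear pool
        return -np.mean(np.log(pooled + 1e-12))

    res = minimize(neg_log_score, np.zeros(n_forecasters), method="Nelder-Mead")
    return np.exp(res.x) / np.exp(res.x).sum()
```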
arXiv Detail & Related papers (2024-11-25T10:04:37Z) - Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering [55.15192437680943]
Generative models lack rigorous statistical guarantees for their outputs. We propose a sequential conformal prediction method producing prediction sets that satisfy a rigorous statistical guarantee. This guarantee states that with high probability, the prediction sets contain at least one admissible (or valid) example.
arXiv Detail & Related papers (2024-10-02T15:26:52Z) - Calibrated Probabilistic Forecasts for Arbitrary Sequences [58.54729945445505]
Real-world data streams can change unpredictably due to distribution shifts, feedback loops and adversarial actors.
We present a forecasting framework ensuring valid uncertainty estimates regardless of how data evolves.
arXiv Detail & Related papers (2024-09-27T21:46:42Z) - Beyond Conformal Predictors: Adaptive Conformal Inference with Confidence Predictors [0.0]
Conformal prediction requires exchangeable data to ensure valid prediction sets at a user-specified significance level.
Adaptive conformal inference (ACI) was introduced to address this limitation.
We show that ACI does not require the use of conformal predictors; instead, it can be implemented with the more general confidence predictors.
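A minimal sketch of the standard ACI update loop, written so that any confidence predictor (conformal or not) can be plugged in; the `confidence_predictor(x, level)` interface is hypothetical, and clipping the working level to [0, 1] is a simplification of the original update.
```python
def adaptive_conformal_inference(confidence_predictor, stream, alpha=0.1, gamma=0.01):
    """Online ACI loop with a generic confidence predictor (sketch)."""
    alpha_t = alpha
    errors = []
    for x_t, y_t in stream:
        prediction_set = confidence_predictor(x_t, alpha_t)  # set at level alpha_t
        err_t = 0.0 if y_t in prediction_set else 1.0
        errors.append(err_t)
        # Raise the working level after coverage (smaller sets), lower it after
        # a miss (larger sets), so that long-run miscoverage tracks alpha.
        alpha_t = min(max(alpha_t + gamma * (alpha - err_t), 0.0), 1.0)
    return errors
```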
arXiv Detail & Related papers (2024-09-23T21:02:33Z) - Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z) - Invariant Probabilistic Prediction [45.90606906307022]
We show that arbitrary distribution shifts do not, in general, admit invariant and robust probabilistic predictions.
We propose a method to yield invariant probabilistic predictions, called IPP, and study the consistency of the underlying parameters.
arXiv Detail & Related papers (2023-09-18T18:50:24Z) - On the Expected Size of Conformal Prediction Sets [24.161372736642157]
We theoretically quantify the expected size of the prediction sets under the split conformal prediction framework.
As this precise formulation cannot usually be calculated directly, we derive point estimates and high-probability interval bounds.
We corroborate the efficacy of our results with experiments on real-world datasets for both regression and classification problems.
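For illustration only, a tiny helper computing the empirical counterpart of the quantity bounded in the paper: the average size of split conformal prediction sets on held-out data. The `set_predictor(x, alpha)` interface is hypothetical.
```python
import numpy as np

def empirical_set_size(set_predictor, X_test, alpha=0.1):
    """Monte Carlo estimate of the expected prediction set size (sketch)."""
    sizes = [len(set_predictor(x, alpha)) for x in X_test]
    return float(np.mean(sizes))
```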
arXiv Detail & Related papers (2023-06-12T17:22:57Z) - Distribution-Free Finite-Sample Guarantees and Split Conformal Prediction [0.0]
Split conformal prediction represents a promising avenue to obtain finite-sample guarantees under minimal distribution-free assumptions.
We highlight the connection between split conformal prediction and classical tolerance predictors developed in the 1940s.
arXiv Detail & Related papers (2022-10-26T14:12:24Z) - Predictive Inference with Feature Conformal Prediction [80.77443423828315]
We propose feature conformal prediction, which extends the scope of conformal prediction to semantic feature spaces.
From a theoretical perspective, we demonstrate that feature conformal prediction provably outperforms regular conformal prediction under mild assumptions.
Our approach could be combined with not only vanilla conformal prediction, but also other adaptive conformal prediction methods.
arXiv Detail & Related papers (2022-10-01T02:57:37Z) - Probabilistic Conformal Prediction Using Conditional Random Samples [73.26753677005331]
Probabilistic conformal prediction (PCP) is a predictive inference algorithm that estimates a target variable by a discontinuous predictive set.
It is efficient and compatible with either explicit or implicit conditional generative models.
arXiv Detail & Related papers (2022-06-14T03:58:03Z) - Selective Regression Under Fairness Criteria [30.672082160544996]
In some cases, the performance on the minority group can decrease while coverage is reduced.
We show that such an unwanted behavior can be avoided if we can construct features satisfying the sufficiency criterion.
arXiv Detail & Related papers (2021-10-28T19:05:12Z) - CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z) - Optimized conformal classification using gradient descent approximation [0.2538209532048866]
Conformal predictors allow predictions to be made with a user-defined confidence level.
We consider an approach to train the conformal predictor directly with maximum predictive efficiency.
We test the method on several real world data sets and find that the method is promising.
arXiv Detail & Related papers (2021-05-24T13:14:41Z) - Learning Probabilistic Ordinal Embeddings for Uncertainty-Aware Regression [91.3373131262391]
Uncertainty is the only certainty there is.
Traditionally, the direct regression formulation is considered, and the uncertainty is modeled by modifying the output space to a certain family of probability distributions.
How to model the uncertainty within the present-day technologies for regression remains an open issue.
arXiv Detail & Related papers (2021-03-25T06:56:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.