Non-Exchangeable Conformal Risk Control
- URL: http://arxiv.org/abs/2310.01262v2
- Date: Fri, 26 Jan 2024 10:47:18 GMT
- Title: Non-Exchangeable Conformal Risk Control
- Authors: António Farinhas, Chrysoula Zerva, Dennis Ulmer, André F. T. Martins
- Abstract summary: Split conformal prediction has recently sparked great interest due to its ability to provide formally guaranteed uncertainty sets or intervals.
We propose non-exchangeable conformal risk control, which allows controlling the expected value of any monotone loss function when the data is not exchangeable.
Our framework is flexible, makes very few assumptions, and allows weighting the data based on its relevance for a given test example.
- Score: 12.381447108228635
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Split conformal prediction has recently sparked great interest due to its
ability to provide formally guaranteed uncertainty sets or intervals for
predictions made by black-box neural models, ensuring a predefined probability
of containing the actual ground truth. While the original formulation assumes
data exchangeability, some extensions handle non-exchangeable data, which is
often the case in many real-world scenarios. In parallel, some progress has
been made in conformal methods that provide statistical guarantees for a
broader range of objectives, such as bounding the best $F_1$-score or
minimizing the false negative rate in expectation. In this paper, we leverage
and extend these two lines of work by proposing non-exchangeable conformal risk
control, which allows controlling the expected value of any monotone loss
function when the data is not exchangeable. Our framework is flexible, makes
very few assumptions, and allows weighting the data based on its relevance for
a given test example; a careful choice of weights may result in tighter bounds,
making our framework useful in the presence of change points, time series, or
other forms of distribution drift. Experiments with both synthetic and
real-world data show the usefulness of our method.
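The procedure described in the abstract admits a compact sketch. The code below is a minimal illustration, not the authors' implementation: given calibration losses that are non-increasing in a threshold λ, non-negative relevance weights for each calibration point, and a known loss upper bound B (charged to the unseen test point), it scans a grid for the smallest threshold whose weighted empirical risk stays below the target level α. All function and parameter names are illustrative assumptions.

```python
import numpy as np

def non_exchangeable_crc(loss_fns, weights, alpha, lam_grid, B=1.0):
    """Sketch of non-exchangeable conformal risk control.

    loss_fns : callables; loss_fns[i](lam) is the loss of calibration
               example i at threshold lam, assumed non-increasing in lam.
    weights  : non-negative relevance weights for the calibration points.
    B        : upper bound on the loss, charged to the unseen test point
               (which is given weight 1 here by convention).
    Returns the smallest lam on the grid whose weighted risk is <= alpha.
    """
    w = np.asarray(weights, dtype=float)
    total = w.sum() + 1.0
    for lam in lam_grid:  # grid assumed sorted in increasing order
        risk = (sum(wi * f(lam) for wi, f in zip(w, loss_fns)) + B) / total
        if risk <= alpha:
            return lam
    return lam_grid[-1]
```

With indicator losses (1 when the true label falls outside the prediction set at threshold λ) and uniform weights, this recovers a standard conformal-risk-control behavior; down-weighting stale or less relevant calibration points is what makes the non-exchangeable variant useful under change points or drift.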
Related papers
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Data-Adaptive Tradeoffs among Multiple Risks in Distribution-Free Prediction [55.77015419028725]
We develop methods that permit valid control of risk when threshold and tradeoff parameters are chosen adaptively.
Our methodology supports monotone and nearly-monotone risks, but otherwise makes no distributional assumptions.
arXiv Detail & Related papers (2024-03-28T17:28:06Z)
- Efficient Conformal Prediction under Data Heterogeneity [79.35418041861327]
Conformal Prediction (CP) stands out as a robust framework for uncertainty quantification.
Existing approaches for tackling non-exchangeability lead to methods that are not computable beyond the simplest examples.
This work introduces a new efficient approach to CP that produces provably valid confidence sets for fairly general non-exchangeable data distributions.
arXiv Detail & Related papers (2023-12-25T20:02:51Z)
- User-defined Event Sampling and Uncertainty Quantification in Diffusion Models for Physical Dynamical Systems [49.75149094527068]
We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time; the resulting samples match data statistics even when drawn from the tails of the distribution.
arXiv Detail & Related papers (2023-06-13T03:42:03Z)
- The Decaying Missing-at-Random Framework: Doubly Robust Causal Inference with Partially Labeled Data [10.021381302215062]
In real-world scenarios, data collection limitations often result in partially labeled datasets, leading to difficulties in drawing reliable causal inferences.
Traditional approaches in the semi-supervised (SS) and missing-data literature may not adequately handle these complexities, leading to biased estimates.
This framework tackles missing outcomes in high-dimensional settings and accounts for selection bias.
arXiv Detail & Related papers (2023-05-22T07:37:12Z)
- Distribution-Free Finite-Sample Guarantees and Split Conformal Prediction [0.0]
Split conformal prediction represents a promising avenue for obtaining finite-sample guarantees under minimal distribution-free assumptions.
We highlight the connection between split conformal prediction and classical tolerance predictors developed in the 1940s.
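The split conformal recipe this entry builds on fits in a few lines. The sketch below is a generic illustration with assumed names, using absolute residuals on a held-out calibration set and the finite-sample-corrected rank that yields marginal coverage of at least 1-α under exchangeability:

```python
import numpy as np

def split_conformal_interval(cal_residuals, y_pred, alpha=0.1):
    """Split conformal interval from held-out absolute residuals.

    Takes the residual of rank ceil((n + 1) * (1 - alpha)) so that a new
    point is covered with probability >= 1 - alpha under exchangeability.
    """
    r = np.sort(np.asarray(cal_residuals, dtype=float))
    n = len(r)
    k = int(np.ceil((n + 1) * (1 - alpha)))  # 1-based rank, may exceed n
    q = r[min(k, n) - 1]
    return y_pred - q, y_pred + q
```

It is precisely this quantile-of-residuals construction that the non-exchangeable methods above generalize by reweighting the calibration points.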
arXiv Detail & Related papers (2022-10-26T14:12:24Z)
- A general framework for multi-step ahead adaptive conformal heteroscedastic time series forecasting [0.0]
This paper introduces a novel model-agnostic algorithm called adaptive ensemble batch multi-input multi-output conformalized quantile regression (AEnbMIMOCQR).
It enables forecasters to generate multi-step ahead prediction intervals for a fixed pre-specified miscoverage rate in a distribution-free manner.
Our method is grounded in conformal prediction principles; however, it does not require data splitting and provides close-to-exact coverage even when the data is not exchangeable.
arXiv Detail & Related papers (2022-07-28T16:40:26Z)
- Conformal Off-Policy Prediction in Contextual Bandits [54.67508891852636]
Conformal off-policy prediction can output reliable predictive intervals for the outcome under a new target policy.
We provide theoretical finite-sample guarantees without making any additional assumptions beyond the standard contextual bandit setup.
arXiv Detail & Related papers (2022-06-09T10:39:33Z)
- Selective Regression Under Fairness Criteria [30.672082160544996]
In some cases, the performance of a minority group can decrease as we reduce the coverage.
We show that such an unwanted behavior can be avoided if we can construct features satisfying the sufficiency criterion.
arXiv Detail & Related papers (2021-10-28T19:05:12Z)
- Trust but Verify: Assigning Prediction Credibility by Counterfactual Constrained Learning [123.3472310767721]
Prediction credibility measures are fundamental in statistics and machine learning.
These measures should account for the wide variety of models used in practice.
The framework developed in this work expresses the credibility as a risk-fit trade-off.
arXiv Detail & Related papers (2020-11-24T19:52:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.