Online Multivalid Learning: Means, Moments, and Prediction Intervals
- URL: http://arxiv.org/abs/2101.01739v1
- Date: Tue, 5 Jan 2021 19:08:11 GMT
- Title: Online Multivalid Learning: Means, Moments, and Prediction Intervals
- Authors: Varun Gupta, Christopher Jung, Georgy Noarov, Mallesh M. Pai, Aaron
Roth
- Abstract summary: We present a technique for providing contextual predictions that are "multivalid" in various senses.
The resulting estimates correctly predict various statistics of the labels $y$ not just marginally.
Because our algorithms handle adversarially chosen examples, they can equally well be used to predict statistics of the residuals of arbitrary point prediction methods.
- Score: 16.75129633574157
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a general, efficient technique for providing contextual
predictions that are "multivalid" in various senses, against an online sequence
of adversarially chosen examples $(x,y)$. This means that the resulting
estimates correctly predict various statistics of the labels $y$ not just
marginally -- as averaged over the sequence of examples -- but also
conditionally on $x \in G$ for any $G$ belonging to an arbitrary intersecting
collection of groups $\mathcal{G}$.
We provide three instantiations of this framework. The first is mean
prediction, which corresponds to an online algorithm satisfying the notion of
multicalibration from Hebert-Johnson et al. The second is variance and higher
moment prediction, which corresponds to an online algorithm satisfying the
notion of mean-conditioned moment multicalibration from Jung et al. Finally, we
define a new notion of prediction interval multivalidity, and give an algorithm
for finding prediction intervals which satisfy it. Because our algorithms
handle adversarially chosen examples, they can equally well be used to predict
statistics of the residuals of arbitrary point prediction methods, giving rise
to very general techniques for quantifying the uncertainty of predictions of
black box algorithms, even in an online adversarial setting. When instantiated
for prediction intervals, this solves a similar problem as conformal
prediction, but in an adversarial environment and with multivalidity guarantees
stronger than simple marginal coverage guarantees.
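To make the multicalibration notion concrete, here is a minimal, batch-mode sketch of the idea behind the first instantiation: iteratively "patch" a base predictor until, within every (group, prediction-bucket) cell, the average prediction matches the average label. The data, group structure, function names, and patching loop are our illustration under simplifying assumptions, not the paper's online algorithm, which additionally handles adversarially chosen examples.

```python
import random

def multicalibrate(xs, ys, groups, predict, n_buckets=10, alpha=0.02, rounds=200):
    """Patch a base predictor until, within every (group, bucket) cell,
    the average prediction tracks the average label to within alpha.
    Offline, batch illustration only; the paper gives online algorithms."""
    patch = {}  # additive correction per (group index, bucket)

    def bucket(x):
        p = predict(x)
        return min(int(p * n_buckets), n_buckets - 1)

    def f(x):
        p = predict(x)
        b = bucket(x)
        for g_id, g in enumerate(groups):
            if g(x):
                p += patch.get((g_id, b), 0.0)
        return min(max(p, 0.0), 1.0)

    for _ in range(rounds):
        updated = False
        for g_id, g in enumerate(groups):
            for b in range(n_buckets):
                cell = [i for i, x in enumerate(xs) if g(x) and bucket(x) == b]
                if len(cell) < 20:  # too few samples to estimate the cell mean
                    continue
                gap = (sum(ys[i] for i in cell)
                       - sum(f(xs[i]) for i in cell)) / len(cell)
                if abs(gap) > alpha:
                    patch[(g_id, b)] = patch.get((g_id, b), 0.0) + gap
                    updated = True
        if not updated:  # every cell calibrated to within alpha
            break
    return f
```

Because the groups may intersect, patching one group can disturb another; the loop revisits all cells until no cell's gap exceeds `alpha`, which is what distinguishes multicalibration from calibrating each group independently.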
Related papers
- Conformalized Interval Arithmetic with Symmetric Calibration [9.559062601251464]
We extend conformal prediction intervals for a single target to prediction intervals for the sum of multiple targets.
We show that our method outperforms existing conformalized approaches as well as non-conformal approaches.
arXiv Detail & Related papers (2024-08-20T15:27:18Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Mind the Gap: A Causal Perspective on Bias Amplification in Prediction & Decision-Making [58.06306331390586]
We introduce the notion of a margin complement, which measures how much a prediction score $S$ changes due to a thresholding operation.
We show that under suitable causal assumptions, the influences of $X$ on the prediction score $S$ are equal to the influences of $X$ on the true outcome $Y$.
arXiv Detail & Related papers (2024-05-24T11:22:19Z)
- PAC Prediction Sets Under Label Shift [52.30074177997787]
Prediction sets capture uncertainty by predicting sets of labels rather than individual labels.
We propose a novel algorithm for constructing prediction sets with PAC guarantees in the label shift setting.
We evaluate our approach on five datasets.
arXiv Detail & Related papers (2023-10-19T17:57:57Z)
- Sorting and Hypergraph Orientation under Uncertainty with Predictions [0.45880283710344055]
We study learning-augmented algorithms for sorting and hypergraph orientation under uncertainty.
Our algorithms provide improved performance guarantees for accurate predictions while maintaining worst-case guarantees that are best possible without predictions.
arXiv Detail & Related papers (2023-05-16T07:52:08Z)
- Post-selection Inference for Conformal Prediction: Trading off Coverage for Precision [0.0]
Traditionally, conformal prediction inference requires a data-independent specification of miscoverage level.
We develop simultaneous conformal inference to account for data-dependent miscoverage levels.
arXiv Detail & Related papers (2023-04-12T20:56:43Z)
- Mixing predictions for online metric algorithms [34.849039387367455]
We design algorithms that combine predictions and are competitive against such dynamic combinations.
Our algorithms can be adapted to access predictors in a bandit-like fashion, querying only one predictor at a time.
An unexpected implication of one of our lower bounds is a new structural insight about covering formulations for the $k$-server problem.
arXiv Detail & Related papers (2023-04-04T13:18:00Z)
- Efficient and Differentiable Conformal Prediction with General Function Classes [96.74055810115456]
We propose a generalization of conformal prediction to multiple learnable parameters.
We show that it achieves approximately valid population coverage and near-optimal efficiency within the class.
Experiments show that our algorithm is able to learn valid prediction sets and improve the efficiency significantly.
arXiv Detail & Related papers (2022-02-22T18:37:23Z)
- Almost Tight L0-norm Certified Robustness of Top-k Predictions against Adversarial Perturbations [78.23408201652984]
Top-k predictions are used in many real-world applications such as machine learning as a service, recommender systems, and web searches.
Our work is based on randomized smoothing, which builds a provably robust classifier via randomizing an input.
For instance, our method can build a classifier that achieves a certified top-3 accuracy of 69.2% on ImageNet when an attacker can arbitrarily perturb 5 pixels of a testing image.
arXiv Detail & Related papers (2020-11-15T21:34:44Z)
- Consistent Structured Prediction with Max-Min Margin Markov Networks [84.60515484036239]
Max-margin methods for binary classification have been extended to the structured prediction setting under the name of max-margin Markov networks ($M^3N$).
We overcome limitations of $M^3N$ by defining the learning problem in terms of a "max-min" margin formulation, naming the resulting method max-min margin Markov networks ($M^4N$).
Experiments on multi-class classification, ordinal regression, sequence prediction and ranking demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2020-07-02T10:48:42Z)
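Several of the entries above build on split conformal prediction, the marginal-coverage baseline that the main paper strengthens to multivalid coverage. A minimal sketch of the standard construction (function name and defaults are ours, not from any of the papers listed):

```python
import math

def split_conformal_halfwidth(residuals, alpha=0.1):
    """Given absolute residuals |y_i - f(x_i)| from a held-out
    calibration set, return q such that [f(x) - q, f(x) + q] covers a
    fresh exchangeable point with probability >= 1 - alpha."""
    n = len(residuals)
    k = math.ceil((n + 1) * (1 - alpha))  # conformal quantile index
    return sorted(residuals)[min(k, n) - 1]
```

This guarantees only marginal coverage, averaged over all test points; the main paper's multivalid intervals additionally hold conditionally on membership in each group $G \in \mathcal{G}$, even against an adversarial sequence.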
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.