Universal Prediction Band via Semi-Definite Programming
- URL: http://arxiv.org/abs/2103.17203v1
- Date: Wed, 31 Mar 2021 16:30:58 GMT
- Title: Universal Prediction Band via Semi-Definite Programming
- Authors: Tengyuan Liang
- Abstract summary: We propose a method to construct nonparametric, heteroskedastic prediction bands for uncertainty quantification.
The data-adaptive prediction band is universally applicable with minimal distributional assumptions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a computationally efficient method to construct nonparametric,
heteroskedastic prediction bands for uncertainty quantification, with or
without any user-specified predictive model. The data-adaptive prediction band
is universally applicable with minimal distributional assumptions, with strong
non-asymptotic coverage properties, and easy to implement using standard convex
programs. Our approach can be viewed as a novel variance interpolation with
confidence and further leverages techniques from semi-definite programming and
sum-of-squares optimization. Theoretical and numerical performance of the
proposed approach to uncertainty quantification is analyzed.
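The paper's SDP/sum-of-squares construction is more involved, but the basic goal, a data-adaptive band with finite-sample coverage under minimal assumptions, can be illustrated with a much simpler split-conformal baseline. This is not the paper's method; the polynomial model, synthetic data, and 90% level below are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic heteroskedastic data: noise scale grows with |x|.
n = 2000
x = rng.uniform(-2, 2, size=n)
y = np.sin(x) + (0.1 + 0.3 * np.abs(x)) * rng.normal(size=n)

# Split the sample: fit a stand-in predictive model on one half,
# calibrate the band width on the other half.
fit, cal = np.arange(n) < n // 2, np.arange(n) >= n // 2
coef = np.polyfit(x[fit], y[fit], deg=5)
resid = np.abs(y[cal] - np.polyval(coef, x[cal]))

# Conformal quantile for 90% coverage, with the finite-sample
# (m + 1) correction on the calibration set of size m.
alpha = 0.1
m = cal.sum()
q = np.quantile(resid, np.ceil((m + 1) * (1 - alpha)) / m)

# The band is prediction +/- q; check empirical coverage on fresh data.
x_new = rng.uniform(-2, 2, size=1000)
y_new = np.sin(x_new) + (0.1 + 0.3 * np.abs(x_new)) * rng.normal(size=1000)
covered = np.abs(y_new - np.polyval(coef, x_new)) <= q
print(round(covered.mean(), 3))
```

Unlike the SDP approach summarized above, this baseline produces a band of constant width, so it cannot adapt to the heteroskedastic noise; it only shares the distribution-free, finite-sample coverage goal.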
Related papers
- Conformalized Interval Arithmetic with Symmetric Calibration [9.559062601251464]
We extend conformal prediction intervals for a single target to prediction intervals for the sum of multiple targets.
We show that our method outperforms existing conformalized approaches as well as non-conformal approaches.
arXiv Detail & Related papers (2024-08-20T15:27:18Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Non-Convex Robust Hypothesis Testing using Sinkhorn Uncertainty Sets [18.46110328123008]
We present a new framework to address the non-convex robust hypothesis testing problem.
The goal is to seek the optimal detector that minimizes the maximum numerical risk.
arXiv Detail & Related papers (2024-03-21T20:29:43Z)
- Conformalized Adaptive Forecasting of Heterogeneous Trajectories [8.022222226139032]
We present a new conformal method for generating simultaneous forecasting bands guaranteed to cover the entire path of a new random trajectory with sufficiently high probability.
This solution is both principled, providing precise finite-sample guarantees, and effective, often leading to more informative predictions than prior methods.
arXiv Detail & Related papers (2024-02-14T23:57:19Z)
- Likelihood Ratio Confidence Sets for Sequential Decision Making [51.66638486226482]
We revisit the likelihood-based inference principle and propose to use likelihood ratios to construct valid confidence sequences.
Our method is especially suitable for problems with well-specified likelihoods.
We show how to provably choose the best sequence of estimators and shed light on connections to online convex optimization.
arXiv Detail & Related papers (2023-11-08T00:10:21Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Optimal Learning via Moderate Deviations Theory [4.6930976245638245]
We develop a systematic construction of highly accurate confidence intervals by using a moderate deviation principle-based approach.
It is shown that the proposed confidence intervals are statistically optimal in the sense that they satisfy criteria regarding exponential accuracy, minimality, consistency, mischaracterization probability, and eventual uniformly most accurate (UMA) property.
arXiv Detail & Related papers (2023-05-23T19:57:57Z)
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning [72.2614468437919]
We train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores.
We empirically demonstrate the benefit of the additional information using both synthetic and real data on the efficiency (width), deficit, and excess of conformal prediction intervals.
arXiv Detail & Related papers (2023-02-23T18:57:14Z)
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z)
- Efficient Ensemble Model Generation for Uncertainty Estimation with Bayesian Approximation in Segmentation [74.06904875527556]
We propose a generic and efficient segmentation framework to construct ensemble segmentation models.
In the proposed method, ensemble models can be efficiently generated by using the layer selection method.
We also devise a new pixel-wise uncertainty loss, which improves the predictive performance.
arXiv Detail & Related papers (2020-05-21T16:08:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.