Model uncertainty quantification using feature confidence sets for outcome excursions
- URL: http://arxiv.org/abs/2504.19464v1
- Date: Mon, 28 Apr 2025 04:08:07 GMT
- Title: Model uncertainty quantification using feature confidence sets for outcome excursions
- Authors: Junting Ren, Armin Schwartzman
- Abstract summary: This paper introduces a novel, model-agnostic framework for quantifying uncertainty in continuous and binary outcomes. It is validated through simulations and applied to real-world datasets in contexts such as housing price prediction and time to sepsis diagnosis in healthcare.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: When implementing prediction models for high-stakes real-world applications such as medicine, finance, and autonomous systems, quantifying prediction uncertainty is critical for effective risk management. Traditional approaches to uncertainty quantification, such as confidence and prediction intervals, provide probability coverage guarantees for the expected outcomes $f(\boldsymbol{x})$ or the realized outcomes $f(\boldsymbol{x})+\epsilon$. Instead, this paper introduces a novel, model-agnostic framework for quantifying uncertainty in continuous and binary outcomes using confidence sets for outcome excursions, where the goal is to identify a subset of the feature space where the expected or realized outcome exceeds a specific value. The proposed method constructs data-dependent inner and outer confidence sets that aim to contain the true feature subset whose expected or realized outcomes exceed a specified threshold. We establish theoretical guarantees for the probability that these confidence sets contain the true feature subset, both asymptotically and for finite sample sizes. The framework is validated through simulations and applied to real-world datasets, demonstrating its utility in contexts such as housing price prediction and time to sepsis diagnosis in healthcare. This approach provides a unified method for uncertainty quantification that is broadly applicable across various continuous and binary prediction models.
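In outline, the inner and outer sets can be pictured with a simple plug-in construction: threshold a lower and an upper confidence bound on $f(\boldsymbol{x})$ at the level $u$. The sketch below is a minimal illustration, not the authors' implementation; the function names are mine, and the pointwise normal critical value stands in for the calibrated (e.g., bootstrap-based) value that the paper's simultaneous coverage guarantees would require.

```python
# Minimal sketch (not the authors' code) of inner/outer confidence sets for
# the excursion set A_u = {x : f(x) >= u}. `f_hat` and `se` are a fitted mean
# and its standard error over a grid of feature points; the normal quantile
# below is a pointwise simplification of the calibrated critical value needed
# for simultaneous coverage.
import numpy as np
from scipy import stats

def excursion_confidence_sets(f_hat, se, u, alpha=0.05):
    c = stats.norm.ppf(1 - alpha / 2)   # illustrative critical value
    inner = f_hat - c * se >= u         # aims to lie inside the true A_u
    plug_in = f_hat >= u                # point estimate of A_u
    outer = f_hat + c * se >= u         # aims to contain the true A_u
    return inner, plug_in, outer

# Toy usage on a 1-D grid with stand-in estimates
x = np.linspace(0.0, 1.0, 200)
f_hat = np.sin(2 * np.pi * x)                    # stand-in fitted model
se = np.full_like(x, 0.15)                       # stand-in standard errors
inner, plug, outer = excursion_confidence_sets(f_hat, se, u=0.5)
assert inner.sum() <= plug.sum() <= outer.sum()  # nested by construction
```

Because the upper bound always sits above the lower bound, the three sets are nested by construction; the theoretical work in the paper concerns choosing the critical value so that the true excursion set is sandwiched between the inner and outer sets with the desired probability.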
Related papers
- Epistemic Uncertainty in Conformal Scores: A Unified Approach [2.449909275410288]
Conformal prediction methods create prediction bands with distribution-free guarantees but do not explicitly capture epistemic uncertainty.
We introduce $\texttt{EPICSCORE}$, a model-agnostic approach that enhances any conformal score by explicitly integrating epistemic uncertainty.
$\texttt{EPICSCORE}$ adaptively expands predictive intervals in regions with limited data while maintaining compact intervals where data is abundant; the sketch below illustrates the adaptive-width idea.
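As a hedged illustration (not the $\texttt{EPICSCORE}$ algorithm itself), the following uses locally weighted split conformal regression, rescaling residual scores by an epistemic-uncertainty estimate $u(x)$ such as an ensemble standard deviation; all names here are assumptions.

```python
# Hedged illustration (not the EPICSCORE algorithm): locally weighted split
# conformal regression, where residual scores are rescaled by an epistemic-
# uncertainty estimate u(x) > 0 (e.g., an ensemble standard deviation), so
# intervals widen where data are scarce and u(x) is large.
import numpy as np

def adaptive_split_conformal(cal_resid, cal_u, mu_test, u_test, alpha=0.1):
    """cal_resid: |y - mu(x)| on a held-out calibration set; cal_u, u_test: u(x)."""
    scores = cal_resid / cal_u                       # uncertainty-scaled scores
    n = len(scores)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, level, method="higher")  # conformal quantile
    half = q * u_test                                # wider where u(x) is large
    return mu_test - half, mu_test + half
```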
arXiv Detail & Related papers (2025-02-10T19:42:54Z)
- Beyond Conformal Predictors: Adaptive Conformal Inference with Confidence Predictors [0.0]
Conformal prediction requires exchangeable data to ensure valid prediction sets at a user-specified significance level.
Adaptive conformal inference (ACI) was introduced to address this limitation.
We show that ACI does not require the use of conformal predictors; instead, it can be implemented with the more general confidence predictors. A minimal sketch of the underlying ACI update rule follows.
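For context, the ACI update of Gibbs and Candès adjusts a working miscoverage level online based on whether each new outcome was covered. The sketch below assumes a hypothetical `prediction_set` helper that maps a feature vector and a working level to a set supporting membership tests.

```python
# Minimal sketch of the adaptive conformal inference (ACI) update that the
# paper above generalizes beyond conformal predictors. `prediction_set` is a
# hypothetical helper mapping (x, working level) to a set supporting `in`.
def aci(stream, prediction_set, alpha=0.1, gamma=0.005):
    alpha_t = alpha
    for x_t, y_t in stream:
        S_t = prediction_set(x_t, alpha_t)
        err = 0.0 if y_t in S_t else 1.0   # miscoverage indicator
        alpha_t += gamma * (alpha - err)   # a miss lowers alpha_t (wider sets),
                                           # coverage raises it (tighter sets)
        yield S_t, alpha_t
```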
arXiv Detail & Related papers (2024-09-23T21:02:33Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Tractable Function-Space Variational Inference in Bayesian Neural Networks [72.97620734290139]
A popular approach for estimating the predictive uncertainty of neural networks is to define a prior distribution over the network parameters.
We propose a scalable function-space variational inference method that allows incorporating prior information.
We show that the proposed method leads to state-of-the-art uncertainty estimation and predictive performance on a range of prediction tasks.
arXiv Detail & Related papers (2023-12-28T18:33:26Z)
- Score Matching-based Pseudolikelihood Estimation of Neural Marked Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching estimator for learning marked spatio-temporal point processes (STPPs) with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked STPPs through score matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z)
- Ensemble Neural Networks for Remaining Useful Life (RUL) Prediction [0.39287497907611874]
A core part of maintenance planning is a monitoring system that provides a good prognosis on health and degradation.
Here, we propose ensemble neural networks for probabilistic RUL prediction that consider both aleatoric and epistemic uncertainty and decouple the two; a minimal sketch of this decomposition appears below.
This method is tested on NASA's turbofan jet engine CMAPSS dataset.
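The standard ensemble decomposition this entry alludes to can be sketched as follows (assumed setup, not the paper's code): each member outputs a predictive mean and variance, and the law of total variance separates data noise from member disagreement.

```python
# Minimal sketch (assumed setup, not the paper's code): each ensemble member
# outputs a predictive mean and variance for the RUL; the law of total
# variance splits uncertainty into aleatoric and epistemic parts.
import numpy as np

def decompose_uncertainty(means, variances):
    """means, variances: arrays of shape (n_members, n_samples)."""
    mu = means.mean(axis=0)             # ensemble predictive mean
    aleatoric = variances.mean(axis=0)  # average data noise across members
    epistemic = means.var(axis=0)       # disagreement between members
    return mu, aleatoric, epistemic, aleatoric + epistemic
```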
arXiv Detail & Related papers (2023-09-21T19:38:44Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Model-free generalized fiducial inference [0.0]
I propose and develop ideas for a model-free statistical framework for imprecise probabilistic prediction inference.
This framework facilitates uncertainty quantification in the form of prediction sets that offer finite sample control of type 1 errors.
I consider the theoretical and empirical properties of a precise probabilistic approximation to the model-free imprecise framework.
arXiv Detail & Related papers (2023-07-24T01:58:48Z)
- Dense Uncertainty Estimation via an Ensemble-based Conditional Latent Variable Model [68.34559610536614]
We argue that the aleatoric uncertainty is an inherent attribute of the data and can only be correctly estimated with an unbiased oracle model.
We propose a new sampling and selection strategy at train time to approximate the oracle model for aleatoric uncertainty estimation.
Our results show that our solution achieves both accurate deterministic results and reliable uncertainty estimation.
arXiv Detail & Related papers (2021-11-22T08:54:10Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We study two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.