Uncertainty quantification for probabilistic machine learning in earth
observation using conformal prediction
- URL: http://arxiv.org/abs/2401.06421v1
- Date: Fri, 12 Jan 2024 07:31:21 GMT
- Title: Uncertainty quantification for probabilistic machine learning in earth
observation using conformal prediction
- Authors: Geethen Singh, Glenn Moncrieff, Zander Venter, Kerry Cawse-Nicholson,
Jasper Slingsby and Tamara B Robinson
- Abstract summary: Unreliable predictions can occur when using artificial intelligence (AI) systems, with negative consequences for downstream applications.
Conformal prediction provides a model-agnostic framework for uncertainty quantification that can be applied to any dataset.
In response to the increased need to report uncertainty alongside point predictions, we bring attention to the promise of conformal prediction in Earth Observation applications.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Unreliable predictions can occur when using artificial intelligence (AI)
systems, with negative consequences for downstream applications, particularly
when employed for decision-making. Conformal prediction provides a
model-agnostic framework for uncertainty quantification that can be applied to
any dataset, irrespective of its distribution, post hoc. In contrast to other
pixel-level uncertainty quantification methods, conformal prediction operates
without requiring access to the underlying model and training dataset,
concurrently offering statistically valid and informative prediction regions,
all while maintaining computational efficiency. In response to the increased
need to report uncertainty alongside point predictions, we bring attention to
the promise of conformal prediction within the domain of Earth Observation (EO)
applications. To accomplish this, we assess the current state of uncertainty
quantification in the EO domain and find that only 20% of the reviewed Google
Earth Engine (GEE) datasets incorporated a degree of uncertainty information,
with unreliable methods prevalent. Next, we introduce modules that seamlessly
integrate into existing GEE predictive modelling workflows and demonstrate the
application of these tools for datasets spanning local to global scales,
including the Dynamic World and Global Ecosystem Dynamics Investigation (GEDI)
datasets. These case studies encompass regression and classification tasks,
featuring both traditional and deep learning-based workflows. Subsequently, we
discuss the opportunities arising from the use of conformal prediction in EO.
We anticipate that the increased availability of easy-to-use implementations of
conformal predictors, such as those provided here, will drive wider adoption of
rigorous uncertainty quantification in EO, thereby enhancing the reliability of
uses such as operational monitoring and decision making.
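To make the mechanics concrete, the sketch below walks through split (inductive) conformal classification, the post-hoc, model-agnostic procedure the abstract describes: score a held-out calibration set, take a finite-sample-corrected quantile, and emit prediction sets with marginal coverage of at least 1 - alpha. This is an illustrative numpy sketch with synthetic three-class probabilities, not the paper's GEE modules; the function names, the toy data, and alpha = 0.1 are assumptions. A regression analogue appears after the related-papers list.

```python
import numpy as np

# Split conformal classification: given class probabilities from any
# pretrained classifier plus a held-out calibration set, build prediction
# sets with marginal coverage >= 1 - alpha. No access to model internals
# or training data is needed, matching the post-hoc setting in the abstract.

def conformal_quantile(cal_probs, cal_labels, alpha):
    """q_hat from nonconformity scores s_i = 1 - p(true class)."""
    n = len(cal_labels)
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected level ceil((n + 1) * (1 - alpha)) / n.
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, level, method="higher")

def prediction_sets(test_probs, q_hat):
    """Keep every class y whose score 1 - p_y does not exceed q_hat."""
    return [np.where(1.0 - p <= q_hat)[0] for p in test_probs]

# Toy stand-in for classifier outputs (e.g., per-pixel land-cover probabilities).
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(3), size=500)
cal_labels = np.array([rng.choice(3, p=p) for p in cal_probs])
q_hat = conformal_quantile(cal_probs, cal_labels, alpha=0.1)

test_probs = rng.dirichlet(np.ones(3), size=5)
for s in prediction_sets(test_probs, q_hat):
    print(s)  # larger sets flag more uncertain pixels
```

The ceil((n + 1)(1 - alpha)) / n correction is what yields the distribution-free coverage guarantee; the only requirement is that the calibration and test data are exchangeable.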
Related papers
- Model uncertainty quantification using feature confidence sets for outcome excursions (2025-04-28). This paper introduces a novel, model-agnostic framework for quantifying uncertainty in continuous and binary outcomes. It is validated through simulations and applied to real-world datasets in contexts such as housing-price prediction and time-to-sepsis diagnosis in healthcare.
- Calibrated Probabilistic Forecasts for Arbitrary Sequences (2024-09-27). Real-world data streams can change unpredictably due to distribution shifts, feedback loops, and adversarial actors. The authors present a forecasting framework that ensures valid uncertainty estimates regardless of how the data evolve.
- Conditional Shift-Robust Conformal Prediction for Graph Neural Network (2024-05-20). Graph Neural Networks (GNNs) have emerged as potent tools for predicting outcomes on graph-structured data, but they have limited ability to provide robust uncertainty estimates. The authors propose Conditional Shift-Robust (CondSR) conformal prediction for GNNs.
- Introducing an Improved Information-Theoretic Measure of Predictive Uncertainty (2023-11-14). Predictive uncertainty is commonly measured by the entropy of the Bayesian model average (BMA) predictive distribution. The authors introduce a theoretically grounded measure that overcomes the limitations of this entropy-based measure and behaves more reasonably in controlled synthetic tasks.
- Quantification of Predictive Uncertainty via Inference-Time Sampling (2023-08-03). The authors propose a post-hoc sampling strategy for estimating predictive uncertainty that accounts for data ambiguity. The method can generate different plausible outputs for a given input and does not assume parametric forms for the predictive distribution.
- Lightweight, Uncertainty-Aware Conformalized Visual Odometry (2023-03-03). Data-driven visual odometry (VO) is a critical subroutine for autonomous edge robotics, yet emerging edge devices such as insect-scale drones and surgical robots lack a computationally efficient framework for estimating VO's predictive uncertainty. This paper presents a novel, lightweight, and statistically robust framework that leverages conformal inference (CI) to extract VO's uncertainty bands.
- Improving Adaptive Conformal Prediction Using Self-Supervised Learning (2023-02-23). The authors train an auxiliary model with a self-supervised pretext task on top of an existing predictive model and use the self-supervised error as an additional feature to estimate nonconformity scores, demonstrating on both synthetic and real data the benefit of this extra information for the efficiency (width), deficit, and excess of conformal prediction intervals.
- Interpretable Self-Aware Neural Networks for Robust Trajectory Prediction (2022-11-16). The authors introduce an interpretable paradigm for trajectory prediction that distributes uncertainty among semantic concepts, validated on real-world autonomous-driving data with superior performance over state-of-the-art baselines.
- Uncertainty estimation of pedestrian future trajectory using Bayesian approximation (2022-05-04). Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy. The authors quantify forecasting uncertainty using Bayesian approximation, capturing uncertainty that deterministic approaches miss, and study the effect of dropout weights and long-term prediction on future-state uncertainty.
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks (2022-02-07). The authors show a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution, demonstrating strong performance on uncertainty-estimation tasks across a variety of real-world image datasets.
- Probabilistic Deep Learning for Instance Segmentation (2020-08-24). The authors propose a generic method to obtain model-inherent uncertainty estimates within proposal-free instance segmentation models, evaluated on the BBBC010 C. elegans dataset, where it yields competitive performance.
- Estimation with Uncertainty via Conditional Generative Adversarial Networks (2020-07-01). The authors propose a predictive probabilistic neural network model that uses the generator of a conditional Generative Adversarial Network (cGAN) in a different manner: by reversing the input and output of an ordinary cGAN, the model can be used as a predictive model. To measure prediction uncertainty, they introduce entropy and relative entropy measures for regression and classification problems.
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift (2020-06-26). The authors develop an approximate Bayesian inference scheme based on posterior regularisation and demonstrate its utility for transferring prognostic models of prostate cancer across globally diverse populations.
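The abstract's regression case studies (e.g., estimating a continuous target such as canopy height from GEDI) use the same mechanic as the classification sketch above, with intervals in place of sets, and several of the conformal papers listed here build on the same idea. Below is a minimal split conformal regression sketch using absolute-residual nonconformity scores; the stand-in predictor, synthetic data, and names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

# Split conformal regression: calibrate on held-out residuals, then pad the
# point prediction by the corrected residual quantile to get an interval
# with marginal coverage >= 1 - alpha.

def conformal_interval(predict, X_cal, y_cal, X_test, alpha=0.1):
    scores = np.abs(y_cal - predict(X_cal))  # |y - y_hat| on the calibration split
    n = len(y_cal)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, level, method="higher")
    y_hat = predict(X_test)
    return y_hat - q_hat, y_hat + q_hat

# Synthetic stand-in for a fitted regressor and its calibration data.
rng = np.random.default_rng(1)
X_cal = rng.uniform(0.0, 10.0, size=1000)
y_cal = 2.0 * X_cal + rng.normal(0.0, 1.0, size=1000)
predict = lambda X: 2.0 * X

lo, hi = conformal_interval(predict, X_cal, y_cal, np.array([3.0, 7.0]))
print(lo, hi)  # intervals around the point predictions 6.0 and 14.0
```

Absolute-residual scores give constant-width intervals; normalizing the scores by a per-sample difficulty estimate is a common variant when adaptive interval widths are wanted.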