Gradient-Based Quantification of Epistemic Uncertainty for Deep Object
Detectors
- URL: http://arxiv.org/abs/2107.04517v1
- Date: Fri, 9 Jul 2021 16:04:11 GMT
- Title: Gradient-Based Quantification of Epistemic Uncertainty for Deep Object
Detectors
- Authors: Tobias Riedlinger, Matthias Rottmann, Marius Schubert, Hanno
Gottschalk
- Abstract summary: We introduce novel gradient-based uncertainty metrics and investigate them for different object detection architectures.
Experiments show significant improvements in true positive / false positive discrimination and prediction of intersection over union.
We also find improvement over Monte-Carlo dropout uncertainty metrics and further significant boosts by aggregating different sources of uncertainty metrics.
- Score: 8.029049649310213
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reliable epistemic uncertainty estimation is an essential component for
backend applications of deep object detectors in safety-critical environments.
Modern network architectures tend to give poorly calibrated confidences with
limited predictive power. Here, we introduce novel gradient-based uncertainty
metrics and investigate them for different object detection architectures.
Experiments on the MS COCO, PASCAL VOC and the KITTI dataset show significant
improvements in true positive / false positive discrimination and prediction of
intersection over union as compared to network confidence. We also find
improvement over Monte-Carlo dropout uncertainty metrics and further
significant boosts by aggregating different sources of uncertainty metrics. The
resulting uncertainty models generate well-calibrated confidences in all
instances. Furthermore, we implement our uncertainty quantification models into
object detection pipelines as a means to discern true from false
predictions, replacing the ordinary score-threshold-based decision rule. In our
experiments, we achieve a significant boost in detection performance in terms
of mean average precision. With respect to computational complexity, we find
that computing gradient uncertainty metrics results in floating point operation
counts similar to those of Monte-Carlo dropout.
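As a rough illustration of the core idea, the sketch below scores a prediction by the gradient norm that the network's own output induces in its final layer; the model, layer choice, and loss here are illustrative assumptions, not the authors' exact metric.
```python
import torch
import torch.nn as nn

def gradient_uncertainty(model: nn.Module, last_layer: nn.Linear,
                         x: torch.Tensor) -> float:
    """Gradient-norm uncertainty from a self-assigned pseudo-label."""
    logits = model(x)                    # (1, num_classes)
    pseudo_label = logits.argmax(dim=1)  # the network's own prediction
    loss = nn.functional.cross_entropy(logits, pseudo_label)
    # Backpropagate only to the last layer to keep the cost low.
    (grad,) = torch.autograd.grad(loss, last_layer.weight)
    return grad.norm().item()

# Toy stand-in for a detector's classification head:
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
x = torch.randn(1, 3, 32, 32)
print(f"gradient uncertainty: {gradient_uncertainty(model, model[1], x):.4f}")
```
Intuitively, a confident prediction changes the weights little when the network is "trained" on its own output, while an uncertain one induces a large gradient.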
Related papers
- Revisiting Confidence Estimation: Towards Reliable Failure Prediction [53.79160907725975]
We identify a general, widespread but largely neglected phenomenon: most confidence estimation methods are harmful for detecting misclassification errors.
We propose to enlarge the confidence gap by finding flat minima, which yields state-of-the-art failure prediction performance.
arXiv Detail & Related papers (2024-03-05T11:44:14Z)
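Flat minima are commonly sought with sharpness-aware updates; the sketch below shows one such step under assumed PyTorch conventions and is not necessarily this paper's exact procedure.
```python
import torch

def sam_step(model, loss_fn, x, y, optimizer, rho=0.05):
    """One sharpness-aware update: ascend to a nearby worst-case point,
    then descend using the gradient computed there."""
    loss_fn(model(x), y).backward()
    params = [p for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([p.grad.norm() for p in params]))
    with torch.no_grad():
        eps = [rho * p.grad / (grad_norm + 1e-12) for p in params]
        for p, e in zip(params, eps):
            p.add_(e)                   # climb toward the sharpest direction
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()     # gradient at the perturbed weights
    with torch.no_grad():
        for p, e in zip(params, eps):
            p.sub_(e)                   # restore the original weights
    optimizer.step()
    optimizer.zero_grad()
```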
- Mutual Information-calibrated Conformal Feature Fusion for Uncertainty-Aware Multimodal 3D Object Detection at the Edge [1.7898305876314982]
Three-dimensional (3D) object detection, a critical robotics operation, has seen significant advancements.
Our study integrates the principles of conformal inference with information theoretic measures to perform lightweight, Monte Carlo-free uncertainty estimation.
The framework demonstrates performance comparable to or better than similar methods that are not uncertainty-aware on KITTI 3D object detection benchmarks.
arXiv Detail & Related papers (2023-09-18T09:02:44Z)
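The split-conformal recipe behind such frameworks is short; below is a generic sketch (plain residual scores, without the paper's mutual-information calibration) under assumed regression-style outputs.
```python
import numpy as np

def conformal_interval(preds_cal, y_cal, preds_new, alpha=0.1):
    """Distribution-free (1 - alpha) prediction intervals from residuals."""
    n = len(y_cal)
    scores = np.abs(y_cal - preds_cal)             # calibration scores
    q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)
    return preds_new - q, preds_new + q

# Toy usage, e.g. for a predicted pose or box coordinate:
rng = np.random.default_rng(0)
y = rng.normal(size=200)
preds = y + rng.normal(scale=0.3, size=200)
lo, hi = conformal_interval(preds[:100], y[:100], preds[100:])
print(f"empirical coverage: {np.mean((y[100:] >= lo) & (y[100:] <= hi)):.2f}")
```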
- Lightweight, Uncertainty-Aware Conformalized Visual Odometry [2.429910016019183]
Data-driven visual odometry (VO) is a critical subroutine for autonomous edge robotics.
Emerging edge robotics devices like insect-scale drones and surgical robots lack a computationally efficient framework to estimate VO's predictive uncertainties.
This paper presents a novel, lightweight, and statistically robust framework that leverages conformal inference (CI) to extract VO's uncertainty bands.
arXiv Detail & Related papers (2023-03-03T20:37:55Z)
- Uncertainty Estimation by Fisher Information-based Evidential Deep Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce Fisher Information Matrix (FIM) to measure the informativeness of evidence carried by each sample, according to which we can dynamically reweight the objective loss terms to make the network more focused on the representation learning of uncertain classes.
arXiv Detail & Related papers (2023-03-03T16:12:59Z)
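A hedged sketch of the reweighting idea: form Dirichlet evidence from the logits and weight the evidential MSE terms by the diagonal Fisher information; the exact loss in the paper differs.
```python
import torch

def fim_weighted_evidential_loss(logits, targets_onehot):
    """Evidential MSE terms reweighted by the Dirichlet Fisher information."""
    evidence = torch.nn.functional.softplus(logits)  # non-negative evidence
    alpha = evidence + 1.0                           # Dirichlet parameters
    s = alpha.sum(dim=1, keepdim=True)               # Dirichlet strength
    p = alpha / s                                    # expected class probs
    # Diagonal Fisher information of the Dirichlet w.r.t. alpha; larger
    # values flag classes whose evidence is more informative.
    fim_diag = torch.polygamma(1, alpha) - torch.polygamma(1, s)
    err = (targets_onehot - p) ** 2
    var = p * (1.0 - p) / (s + 1.0)
    return (fim_diag * (err + var)).sum(dim=1).mean()

loss = fim_weighted_evidential_loss(
    torch.randn(8, 10), torch.eye(10)[torch.randint(0, 10, (8,))])
print(loss)
```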
- A Review of Uncertainty Calibration in Pretrained Object Detectors [5.440028715314566]
We investigate the uncertainty calibration properties of different pretrained object detection architectures in a multi-class setting.
We propose a framework to ensure a fair, unbiased, and repeatable evaluation.
We deliver novel insights into why poor detector calibration emerges.
arXiv Detail & Related papers (2022-10-06T14:06:36Z)
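Calibration studies of this kind typically report the expected calibration error (ECE); a standard binned implementation looks like the following (binning scheme assumed).
```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned |accuracy - confidence| gap, weighted by bin occupancy."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean()
                                     - confidences[mask].mean())
    return ece

conf = np.array([0.9, 0.8, 0.75, 0.6, 0.95])
hit = np.array([1, 1, 0, 1, 1], dtype=float)
print(f"ECE: {expected_calibration_error(conf, hit):.3f}")
```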
- On Calibrated Model Uncertainty in Deep Learning [0.0]
We extend the approximate inference for the loss-calibrated Bayesian framework to dropweight-based Bayesian neural networks.
We show that decisions informed by loss-calibrated uncertainty can improve diagnostic performance to a greater extent than straightforward alternatives.
arXiv Detail & Related papers (2022-06-15T20:16:32Z)
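One way to act on loss-calibrated uncertainty is to average Monte-Carlo forward passes and pick the action with the lowest expected task loss; the sketch below uses test-time dropout as a stand-in for dropweights and is only an assumed setup.
```python
import torch

def loss_calibrated_decision(model, x, loss_matrix, n_samples=20):
    """Pick the action with the lowest expected loss under MC samples."""
    model.train()                       # keep dropout active at test time
    with torch.no_grad():
        probs = torch.stack([
            torch.softmax(model(x), dim=1) for _ in range(n_samples)
        ]).mean(dim=0)                  # MC-averaged predictive distribution
    # loss_matrix[a, c]: cost of taking action a when the true class is c.
    expected_loss = probs @ loss_matrix.T
    return expected_loss.argmin(dim=1)
```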
- CertainNet: Sampling-free Uncertainty Estimation for Object Detection [65.28989536741658]
Estimating the uncertainty of a neural network plays a fundamental role in safety-critical settings.
In this work, we propose a novel sampling-free uncertainty estimation method for object detection.
We call it CertainNet, and it is the first to provide separate uncertainties for each output signal: objectness, class, location and size.
arXiv Detail & Related papers (2021-10-04T17:59:31Z)
- Uncertainty-Aware Deep Calibrated Salient Object Detection [74.58153220370527]
Existing deep neural network based salient object detection (SOD) methods mainly focus on pursuing high network accuracy.
These methods overlook the gap between network accuracy and prediction confidence, known as the confidence uncalibration problem.
We introduce an uncertainty-aware deep SOD network and propose two strategies to prevent deep SOD networks from being overconfident.
arXiv Detail & Related papers (2020-12-10T23:28:36Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out of distribution data points at test time with a single forward pass.
We scale training with a novel loss function and centroid updating scheme, and match the accuracy of softmax models.
arXiv Detail & Related papers (2020-03-04T12:27:36Z)
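The single-forward-pass idea can be illustrated with class centroids and an RBF kernel: inputs far from every centroid receive low kernel values and are rejected. The sketch below is a simplified stand-in for the method's actual training scheme.
```python
import torch

def rbf_class_scores(features, centroids, sigma=0.5):
    """Kernel similarity of each feature vector to each class centroid."""
    d2 = torch.cdist(features, centroids) ** 2    # squared distances
    return torch.exp(-d2 / (2.0 * sigma ** 2))

features = torch.randn(4, 16)    # embeddings from a single forward pass
centroids = torch.randn(10, 16)  # one learned centroid per class
scores = rbf_class_scores(features, centroids)
uncertainty = 1.0 - scores.max(dim=1).values  # far from all centroids -> high
print(uncertainty)
```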