Uncertainty-Aware Deep Calibrated Salient Object Detection
- URL: http://arxiv.org/abs/2012.06020v1
- Date: Thu, 10 Dec 2020 23:28:36 GMT
- Title: Uncertainty-Aware Deep Calibrated Salient Object Detection
- Authors: Jing Zhang, Yuchao Dai, Xin Yu, Mehrtash Harandi, Nick Barnes, Richard
Hartley
- Abstract summary: Existing deep neural network based salient object detection (SOD) methods mainly focus on pursuing high network accuracy.
These methods overlook the gap between network accuracy and prediction confidence, known as the confidence uncalibration problem.
We introduce an uncertainty-aware deep SOD network and propose two strategies to prevent deep SOD networks from being overconfident.
- Score: 74.58153220370527
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing deep neural network based salient object detection (SOD) methods
mainly focus on pursuing high network accuracy. However, those methods overlook
the gap between network accuracy and prediction confidence, known as the
confidence uncalibration problem. Thus, state-of-the-art SOD networks are prone
to be overconfident. In other words, the predicted confidence of the networks
does not reflect the real probability of correctness of salient object
detection, which significantly hinders their real-world applicability. In this
paper, we introduce an uncertainty-aware deep SOD network and propose two
strategies from different perspectives to prevent deep SOD networks from being
overconfident. The first strategy, namely Boundary Distribution Smoothing
(BDS), generates continuous labels by smoothing the original binary
ground-truth with respect to pixel-wise uncertainty. The second strategy,
namely Uncertainty-Aware Temperature Scaling (UATS), exploits a relaxed Sigmoid
function during both training and testing with spatially-variant temperature
scaling to produce softened output. Both strategies can be incorporated into
existing deep SOD networks with minimal effort. Moreover, we propose a new
saliency evaluation metric, the dense calibration measure C, which quantifies
how well a model is calibrated on a given dataset. Extensive experimental results on
seven benchmark datasets demonstrate that our solutions can not only better
calibrate SOD models, but also improve the network accuracy.
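The two strategies and the calibration metric lend themselves to a compact illustration. The sketch below is a minimal numpy rendition, not the authors' implementation: a plain box blur stands in for BDS's pixel-wise uncertainty weighting, the temperature map passed to the relaxed sigmoid would in practice be predicted per pixel, and the ECE-style `dense_calibration_error` is only a stand-in for the paper's measure C; all function names are hypothetical.

```python
import numpy as np

def boundary_distribution_smoothing(gt, k=3):
    """BDS-style label softening (sketch): average each pixel over a
    k-by-k neighborhood. Interior pixels keep labels near 0/1, while
    pixels near object boundaries receive soft labels in between,
    discouraging overconfident predictions along edges."""
    pad = k // 2
    padded = np.pad(gt.astype(float), pad, mode="edge")
    h, w = gt.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def uats_sigmoid(logits, temperature):
    """Relaxed sigmoid with a spatially-variant temperature map.
    temperature > 1 flattens the output toward 0.5 (less confident);
    temperature == 1 recovers the ordinary sigmoid."""
    return 1.0 / (1.0 + np.exp(-logits / temperature))

def dense_calibration_error(probs, gt, n_bins=10):
    """ECE-style per-pixel calibration error: bin pixels by predicted
    confidence and average the gap between confidence and accuracy."""
    conf = np.maximum(probs, 1.0 - probs).ravel()
    correct = ((probs > 0.5) == (gt > 0.5)).ravel()
    ece, n = 0.0, conf.size
    for b in range(n_bins):
        lo, hi = b / n_bins, (b + 1) / n_bins
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            gap = abs(correct[in_bin].mean() - conf[in_bin].mean())
            ece += in_bin.sum() / n * gap
    return ece
```

On a binary 5x5 mask, this smoothing keeps interior pixels at 1.0 while boundary pixels fall strictly between 0 and 1; raising the temperature from 1 to 4 pulls a logit of 4.0 from roughly 0.98 down toward 0.73.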
Related papers
- Trustworthy Intrusion Detection: Confidence Estimation Using Latent Space [7.115540429006041]
This work introduces a novel method for enhancing confidence in anomaly detection in Intrusion Detection Systems (IDS).
By developing a confidence metric derived from latent space representations, we aim to improve the reliability of IDS predictions against cyberattacks.
Applied to the NSL-KDD dataset, our approach focuses on binary classification tasks to effectively distinguish between normal and malicious network activities.
arXiv Detail & Related papers (2024-09-19T08:09:44Z)
- A Simple Approach to Improve Single-Model Deep Uncertainty via Distance-Awareness [33.09831377640498]
We study approaches to improve uncertainty property of a single network, based on a single, deterministic representation.
We propose Spectral-normalized Neural Gaussian Process (SNGP), a simple method that improves the distance-awareness ability of modern DNNs.
On a suite of vision and language understanding benchmarks, SNGP outperforms other single-model approaches in prediction, calibration and out-of-domain detection.
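SNGP combines two ingredients: spectral normalization of the hidden weights, so that distances in feature space track distances in input space, and a Gaussian-process output layer approximated with random features. Below is a minimal numpy sketch of the first ingredient only, assuming a plain weight matrix; the function name and the bound c are illustrative, and the exact SVD replaces the power iteration deep-learning libraries use for efficiency.

```python
import numpy as np

def spectral_normalize(w, c=0.95):
    """Rescale a weight matrix so its largest singular value is at
    most c, bounding the layer's Lipschitz constant. Keeping every
    hidden layer roughly distance-preserving in this way is the
    spectral-normalization half of the SNGP recipe."""
    sigma = np.linalg.svd(w, compute_uv=False)[0]
    return w * (c / sigma) if sigma > c else w
```

A matrix whose spectral norm already sits below the bound is returned unchanged; anything larger is scaled down so its top singular value equals c.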
arXiv Detail & Related papers (2022-05-01T05:46:13Z)
- CertainNet: Sampling-free Uncertainty Estimation for Object Detection [65.28989536741658]
Estimating the uncertainty of a neural network plays a fundamental role in safety-critical settings.
In this work, we propose a novel sampling-free uncertainty estimation method for object detection.
We call it CertainNet, and it is the first to provide separate uncertainties for each output signal: objectness, class, location and size.
arXiv Detail & Related papers (2021-10-04T17:59:31Z) - PDC-Net+: Enhanced Probabilistic Dense Correspondence Network [161.76275845530964]
We present PDC-Net+, an Enhanced Probabilistic Dense Correspondence Network capable of estimating accurate dense correspondences.
We develop an architecture and an enhanced training strategy tailored for robust and generalizable uncertainty prediction.
Our approach obtains state-of-the-art results on multiple challenging geometric matching and optical flow datasets.
arXiv Detail & Related papers (2021-09-28T17:56:41Z) - Gradient-Based Quantification of Epistemic Uncertainty for Deep Object
Detectors [8.029049649310213]
We introduce novel gradient-based uncertainty metrics and investigate them for different object detection architectures.
Experiments show significant improvements in true positive / false positive discrimination and prediction of intersection over union.
We also find improvement over Monte-Carlo dropout uncertainty metrics and further significant boosts by aggregating different sources of uncertainty metrics.
arXiv Detail & Related papers (2021-07-09T16:04:11Z) - Improving Uncertainty Calibration of Deep Neural Networks via Truth
Discovery and Geometric Optimization [22.57474734944132]
We propose a truth discovery framework to integrate ensemble-based and post-hoc calibration methods.
On large-scale datasets including CIFAR and ImageNet, our method shows consistent improvement against state-of-the-art calibration approaches.
arXiv Detail & Related papers (2021-06-25T06:44:16Z) - Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
They are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
arXiv Detail & Related papers (2021-02-22T07:02:37Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z) - Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out of distribution data points at test time with a single forward pass.
We scale training of these models with a novel loss function and centroid updating scheme, matching the accuracy of softmax models.
arXiv Detail & Related papers (2020-03-04T12:27:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.