Uncertainty-Aware CNNs for Depth Completion: Uncertainty from Beginning to End
- URL: http://arxiv.org/abs/2006.03349v1
- Date: Fri, 5 Jun 2020 10:18:35 GMT
- Title: Uncertainty-Aware CNNs for Depth Completion: Uncertainty from Beginning to End
- Authors: Abdelrahman Eldesokey, Michael Felsberg, Karl Holmquist, and Mikael Persson
- Abstract summary: We focus on modeling the uncertainty of depth data in depth completion starting from the sparse noisy input all the way to the final prediction.
We propose a novel approach to identify disturbed measurements in the input by learning an input confidence estimator in a self-supervised manner based on normalized convolutional neural networks (NCNNs).
When we evaluate our approach on the KITTI dataset for depth completion, we outperform all existing Bayesian Deep Learning approaches in terms of prediction accuracy, quality of the uncertainty measure, and computational efficiency.
- Score: 18.49954482336334
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The focus in deep learning research has been mostly to push the limits of
prediction accuracy. However, this was often achieved at the cost of increased
complexity, raising concerns about the interpretability and the reliability of
deep networks. Recently, increasing attention has been given to untangling
the complexity of deep networks and quantifying their uncertainty for different
computer vision tasks. In contrast, the task of depth completion has not
received enough attention despite the inherently noisy nature of depth sensors.
In this work, we thus focus on modeling the uncertainty of depth data in depth
completion starting from the sparse noisy input all the way to the final
prediction.
We propose a novel approach to identify disturbed measurements in the input
by learning an input confidence estimator in a self-supervised manner based on
normalized convolutional neural networks (NCNNs). Further, we propose a
probabilistic version of NCNNs that produces a statistically meaningful
uncertainty measure for the final prediction. When we evaluate our approach on
the KITTI dataset for depth completion, we outperform all existing Bayesian
Deep Learning approaches in terms of prediction accuracy, quality of the
uncertainty measure, and computational efficiency. Moreover, our small
network with 670k parameters performs on par with conventional approaches with
millions of parameters. These results give strong evidence that separating the
network into parallel uncertainty and prediction streams leads to
state-of-the-art performance with accurate uncertainty estimates.
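As a rough illustration of the normalized-convolution building block that NCNNs rest on, the following PyTorch sketch propagates a confidence map alongside the sparse depth. The layer and variable names are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NormalizedConv2d(nn.Module):
    """Minimal normalized convolution: data is weighted by a confidence
    map before filtering, and a propagated confidence is returned, so
    missing or noisy depth pixels contribute less to the output."""

    def __init__(self, in_ch, out_ch, kernel_size=3, padding=1, eps=1e-8):
        super().__init__()
        self.weight = nn.Parameter(
            0.1 * torch.randn(out_ch, in_ch, kernel_size, kernel_size))
        self.padding = padding
        self.eps = eps

    def forward(self, x, conf):
        w = F.softplus(self.weight)                    # non-negative filter
        num = F.conv2d(x * conf, w, padding=self.padding)
        den = F.conv2d(conf, w, padding=self.padding)
        out = num / (den + self.eps)                   # confidence-normalized output
        # Propagated confidence: reliable support per output pixel,
        # normalized by the total filter weight.
        conf_out = den / (w.sum(dim=(1, 2, 3)).view(1, -1, 1, 1) + self.eps)
        return out, conf_out

# Usage on sparse LiDAR-like depth: missing pixels get zero confidence.
depth = torch.zeros(1, 1, 64, 64)
depth[:, :, ::8, ::8] = 10.0                           # sparse measurements
conf = (depth > 0).float()
dense, conf_out = NormalizedConv2d(1, 8)(depth, conf)
```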
Related papers
- Confidence Intervals and Simultaneous Confidence Bands Based on Deep Learning [0.36832029288386137]
We provide a valid non-parametric bootstrap method that correctly disentangles data uncertainty from the noise inherent in the adopted optimization algorithm.
The proposed ad-hoc method can be easily integrated into any deep neural network without interfering with the training process.
arXiv Detail & Related papers (2024-06-20T05:51:37Z)
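As a loose illustration of the bootstrap idea above (the paper's method additionally disentangles optimization noise from data uncertainty, which this sketch does not attempt), `train_fn` is a hypothetical callable that fits a model and returns a prediction function:

```python
import numpy as np

def bootstrap_intervals(train_fn, X, y, X_test, B=100, alpha=0.05, seed=0):
    """Naive non-parametric bootstrap: retrain on resampled data and take
    percentile intervals of the test predictions."""
    rng = np.random.default_rng(seed)
    n = len(X)
    preds = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)        # resample with replacement
        predict = train_fn(X[idx], y[idx])      # fresh model per resample
        preds.append(predict(X_test))
    preds = np.stack(preds)                     # shape (B, n_test)
    lo = np.quantile(preds, alpha / 2, axis=0)
    hi = np.quantile(preds, 1 - alpha / 2, axis=0)
    return lo, hi                               # pointwise (1 - alpha) band
```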
- DUDES: Deep Uncertainty Distillation using Ensembles for Semantic Segmentation [11.099838952805325]
Quantifying the predictive uncertainty is a promising endeavour to open up the use of deep neural networks for safety-critical applications.
We present a novel approach for efficient and reliable uncertainty estimation, which we call Deep Uncertainty Distillation using Ensembles (DUDES).
DUDES applies student-teacher distillation with a Deep Ensemble to accurately approximate predictive uncertainties with a single forward pass.
arXiv Detail & Related papers (2023-03-17T08:56:27Z)
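A minimal sketch of the student-teacher recipe described above, assuming PyTorch. DUDES distills a specific ensemble uncertainty target; this stand-in uses predictive entropy, and `student` is assumed to return logits plus an uncertainty map:

```python
import torch
import torch.nn.functional as F

def distillation_step(student, ensemble, x, y, w_unc=1.0):
    """One training step: segmentation loss on the labels, plus an MSE
    loss that regresses the ensemble's predictive entropy through the
    student's extra uncertainty head (one forward pass at test time)."""
    with torch.no_grad():
        # Teacher: average softmax over all ensemble members.
        probs = torch.stack([F.softmax(m(x), dim=1) for m in ensemble]).mean(0)
        teacher_unc = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)
    logits, unc = student(x)                     # one pass, two heads
    loss_seg = F.cross_entropy(logits, y)
    loss_unc = F.mse_loss(unc, teacher_unc)      # uncertainty distillation
    return loss_seg + w_unc * loss_unc
```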
- Uncertainty Estimation by Fisher Information-based Evidential Deep Learning [61.94125052118442]
Uncertainty estimation is a key factor that makes deep learning reliable in practical applications.
We propose a novel method, Fisher Information-based Evidential Deep Learning ($\mathcal{I}$-EDL).
In particular, we introduce the Fisher Information Matrix (FIM) to measure the informativeness of the evidence carried by each sample, and use it to dynamically reweight the objective loss terms so that the network focuses more on the representation learning of uncertain classes.
arXiv Detail & Related papers (2023-03-03T16:12:59Z)
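To make the reweighting idea concrete, here is a generic evidential-classification loss in PyTorch with a pluggable per-sample weight. The actual $\mathcal{I}$-EDL weights are derived from the Fisher Information Matrix; this sketch leaves them as an input:

```python
import torch
import torch.nn.functional as F

def evidential_loss(logits, y_onehot, sample_weights=None):
    """Dirichlet-based evidential classification loss (MSE form).
    `sample_weights` stands in for the FIM-derived informativeness
    weights used by I-EDL."""
    alpha = F.softplus(logits) + 1.0           # Dirichlet concentrations
    S = alpha.sum(dim=1, keepdim=True)         # total evidence per sample
    p = alpha / S                              # expected class probabilities
    err = (y_onehot - p) ** 2                  # squared-error term
    var = p * (1.0 - p) / (S + 1.0)            # Dirichlet variance term
    loss = (err + var).sum(dim=1)              # per-sample loss
    if sample_weights is not None:             # dynamic reweighting hook
        loss = loss * sample_weights
    return loss.mean()
```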
- Fast Uncertainty Estimates in Deep Learning Interatomic Potentials [0.0]
We propose a method to estimate the predictive uncertainty based on a single neural network without the need for an ensemble.
We demonstrate that the quality of the uncertainty estimates matches those obtained from deep ensembles.
arXiv Detail & Related papers (2022-11-17T20:13:39Z)
- The Unreasonable Effectiveness of Deep Evidential Regression [72.30888739450343]
A new approach with uncertainty-aware regression-based neural networks (NNs) shows promise over traditional deterministic methods and typical Bayesian NNs.
We detail the theoretical shortcomings and analyze the performance on synthetic and real-world data sets, showing that Deep Evidential Regression is a heuristic rather than an exact uncertainty quantification.
arXiv Detail & Related papers (2022-05-20T10:10:32Z)
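For context, the evidential-regression setup being critiqued places a Normal-Inverse-Gamma prior over the Gaussian output. A minimal PyTorch head (a sketch under that formulation, not the paper's code) looks like this:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvidentialRegressionHead(nn.Module):
    """Normal-Inverse-Gamma head: a single pass yields a point prediction
    plus separate aleatoric and epistemic uncertainty estimates."""

    def __init__(self, in_features):
        super().__init__()
        self.fc = nn.Linear(in_features, 4)    # -> gamma, nu, alpha, beta

    def forward(self, h):
        gamma, nu_r, alpha_r, beta_r = self.fc(h).chunk(4, dim=-1)
        nu = F.softplus(nu_r)                  # nu > 0
        alpha = F.softplus(alpha_r) + 1.0      # alpha > 1
        beta = F.softplus(beta_r)              # beta > 0
        aleatoric = beta / (alpha - 1.0 + 1e-8)          # E[sigma^2]
        epistemic = beta / (nu * (alpha - 1.0) + 1e-8)   # Var[mu]
        return gamma, aleatoric, epistemic
```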
- Robust Depth Completion with Uncertainty-Driven Loss Functions [60.9237639890582]
We introduce uncertainty-driven loss functions to improve the robustness of depth completion and to handle its inherent uncertainty.
Our method has been tested on KITTI Depth Completion Benchmark and achieved the state-of-the-art robustness performance in terms of MAE, IMAE, and IRMSE metrics.
arXiv Detail & Related papers (2021-12-15T05:22:34Z)
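The general shape of an uncertainty-driven regression loss is the heteroscedastic form below; this is a common sketch of the idea, not necessarily the paper's exact loss functions:

```python
import torch

def uncertainty_driven_l1(pred, log_b, target, valid):
    """Laplace negative log-likelihood, up to a constant: the predicted
    per-pixel scale b down-weights the L1 error of uncertain pixels, and
    the log-b penalty stops the network from inflating b everywhere.
    `valid` masks out pixels without ground-truth depth."""
    b = log_b.exp()                            # per-pixel scale, b > 0
    nll = (pred - target).abs() / b + log_b
    return nll[valid].mean()
```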
- Uncertainty-Aware Deep Calibrated Salient Object Detection [74.58153220370527]
Existing deep neural network based salient object detection (SOD) methods mainly focus on pursuing high network accuracy.
These methods overlook the gap between network accuracy and prediction confidence, known as the confidence uncalibration problem.
We introduce an uncertainty-aware deep SOD network and propose two strategies to prevent deep SOD networks from being overconfident.
arXiv Detail & Related papers (2020-12-10T23:28:36Z)
- Uncertainty Quantification in Deep Residual Neural Networks [0.0]
Uncertainty quantification is an important and challenging problem in deep learning.
Previous methods rely on dropout layers which are not present in modern deep architectures or batch normalization which is sensitive to batch sizes.
We show that training residual networks using stochastic depth can be interpreted as a variational approximation to the posterior over the weights in neural networks.
arXiv Detail & Related papers (2020-07-09T16:05:37Z)
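Under that interpretation, keeping the random layer-dropping active at test time yields uncertainty estimates, analogous to MC dropout. A hedged sketch, assuming `model` applies stochastic depth whenever it is in training mode:

```python
import torch

@torch.no_grad()
def mc_stochastic_depth(model, x, n_samples=20):
    """Average several stochastic forward passes; the spread across
    samples approximates the predictive (model) uncertainty."""
    model.train()                               # keep layer dropping active
    for m in model.modules():                   # ...but freeze BatchNorm
        if isinstance(m, torch.nn.modules.batchnorm._BatchNorm):
            m.eval()                            # no running-stat updates
    preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)
```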
- Depth Uncertainty in Neural Networks [2.6763498831034043]
Existing methods for estimating uncertainty in deep learning tend to require multiple forward passes.
By exploiting the sequential structure of feed-forward networks, we are able to both evaluate our training objective and make predictions with a single forward pass.
We validate our approach on real-world regression and image classification tasks.
arXiv Detail & Related papers (2020-06-15T14:33:40Z)
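A rough sketch of that single-pass scheme, assuming every block shares an output shape so one head can decode each intermediate activation; the class and attribute names are illustrative:

```python
import torch
import torch.nn as nn

class DepthUncertaintyNet(nn.Module):
    """Decode every intermediate block output with a shared head and
    marginalize under a learnable categorical over network depths,
    all within one forward pass."""

    def __init__(self, blocks, head):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)     # sequential feature blocks
        self.head = head                        # shared prediction head
        self.depth_logits = nn.Parameter(torch.zeros(len(blocks)))

    def forward(self, x):
        preds, h = [], x
        for block in self.blocks:
            h = block(h)
            preds.append(self.head(h))          # prediction at this depth
        preds = torch.stack(preds)              # (n_depths, N, ...)
        q = torch.softmax(self.depth_logits, dim=0)
        q = q.view(-1, *([1] * (preds.dim() - 1)))
        mean = (q * preds).sum(dim=0)           # marginal prediction
        var = (q * (preds - mean) ** 2).sum(dim=0)  # depth uncertainty
        return mean, var
```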
- On the uncertainty of self-supervised monocular depth estimation [52.13311094743952]
Self-supervised paradigms for monocular depth estimation are very appealing since they do not require ground truth annotations at all.
We explore for the first time how to estimate the uncertainty for this task and how this affects depth accuracy.
We propose a novel peculiar technique specifically designed for self-supervised approaches.
arXiv Detail & Related papers (2020-05-13T09:00:55Z)
- Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out of distribution data points at test time with a single forward pass.
We scale training of these RBF-based networks with a novel loss function and centroid updating scheme, and match the accuracy of softmax models.
arXiv Detail & Related papers (2020-03-04T12:27:36Z)
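The core of that single-pass scheme is an RBF output layer: distance to the nearest learned class centroid doubles as an uncertainty score. A simplified PyTorch sketch (the actual method adds per-class transforms, an exponential-moving-average centroid update, and a gradient penalty):

```python
import torch
import torch.nn as nn

class RBFHead(nn.Module):
    """Class scores are RBF kernels between the feature embedding and
    learnable class centroids; inputs far from every centroid receive
    a high uncertainty in a single forward pass."""

    def __init__(self, feat_dim, n_classes, sigma=0.1):
        super().__init__()
        self.centroids = nn.Parameter(torch.randn(n_classes, feat_dim))
        self.sigma = sigma

    def forward(self, z):                             # z: (N, feat_dim)
        d2 = torch.cdist(z, self.centroids) ** 2      # (N, n_classes)
        scores = torch.exp(-d2 / (2 * self.sigma ** 2))
        uncertainty = 1.0 - scores.max(dim=1).values  # far from all classes
        return scores, uncertainty
```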