Estimating and Exploiting the Aleatoric Uncertainty in Surface Normal
Estimation
- URL: http://arxiv.org/abs/2109.09881v1
- Date: Mon, 20 Sep 2021 23:30:04 GMT
- Title: Estimating and Exploiting the Aleatoric Uncertainty in Surface Normal
Estimation
- Authors: Gwangbin Bae, Ignas Budvytis, Roberto Cipolla
- Abstract summary: Surface normal estimation from a single image is an important task in 3D scene understanding.
In this paper, we address two limitations shared by the existing methods: the inability to estimate the aleatoric uncertainty and the lack of detail in the prediction.
We present a novel decoder framework where pixel-wise perceptrons are trained on a subset of pixels sampled based on the estimated uncertainty.
- Score: 25.003116148843525
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Surface normal estimation from a single image is an important task in 3D
scene understanding. In this paper, we address two limitations shared by the
existing methods: the inability to estimate the aleatoric uncertainty and the
lack of detail in the prediction. The proposed network estimates the per-pixel
surface normal probability distribution. We introduce a new parameterization
for the distribution, such that its negative log-likelihood is the angular loss
with learned attenuation. The expected value of the angular error is then used
as a measure of the aleatoric uncertainty. We also present a novel decoder
framework where pixel-wise multi-layer perceptrons are trained on a subset of
pixels sampled based on the estimated uncertainty. The proposed
uncertainty-guided sampling prevents the bias in training towards large planar
surfaces and improves the quality of prediction, especially near object
boundaries and on small structures. Experimental results show that the proposed
method outperforms the state-of-the-art on ScanNet and NYUv2, and that the
estimated uncertainty correlates well with the prediction error. Code is
available at https://github.com/baegwangbin/surface_normal_uncertainty.
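The two ideas in the abstract, an angular loss with learned attenuation and uncertainty-guided pixel sampling, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's exact parameterization: the `kappa * theta - log(kappa)` loss form and the use of `1 / kappa` as an uncertainty proxy are stand-ins for the paper's learned distribution and its expected angular error.

```python
# Hedged sketch, not the paper's exact formulation: learned loss
# attenuation for angular error, in the spirit of heteroscedastic
# (Kendall & Gal style) uncertainty, plus uncertainty-guided sampling.
import numpy as np

def angular_error(pred, gt):
    """Per-pixel angle (radians) between predicted and ground-truth
    unit normals. pred, gt: (N, 3) arrays of unit vectors."""
    cos = np.clip(np.sum(pred * gt, axis=-1), -1.0, 1.0)
    return np.arccos(cos)

def attenuated_nll(pred, gt, kappa):
    """Illustrative attenuated loss: kappa * theta - log(kappa).
    At the per-pixel optimum kappa = 1 / theta, so ambiguous pixels
    (large theta) are down-weighted instead of dominating the loss."""
    theta = angular_error(pred, gt)
    return kappa * theta - np.log(kappa)

def uncertainty_guided_sample(uncertainty, n_samples, rng):
    """Sample pixel indices with probability proportional to the
    estimated uncertainty, so that large planar (easy) regions do not
    dominate the training batch for the pixel-wise MLPs."""
    p = uncertainty / uncertainty.sum()
    return rng.choice(len(uncertainty), size=n_samples, replace=False, p=p)
```

Minimizing the attenuated loss over `kappa` at a pixel gives `kappa = 1 / theta`, so confidently wrong pixels stay expensive while genuinely ambiguous ones are attenuated; sampling pixels proportionally to the resulting uncertainty then biases training toward object boundaries and small structures, as the abstract describes.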
Related papers
- One step closer to unbiased aleatoric uncertainty estimation [71.55174353766289]
We propose a new estimation method by actively de-noising the observed data.
By conducting a broad range of experiments, we demonstrate that our proposed approach provides a much closer approximation to the actual data uncertainty than the standard method.
arXiv Detail & Related papers (2023-12-16T14:59:11Z)
- Discretization-Induced Dirichlet Posterior for Robust Uncertainty Quantification on Regression [17.49026509916207]
Uncertainty quantification is critical for deploying deep neural networks (DNNs) in real-world applications.
For vision regression tasks, current AuxUE designs are mainly adopted for aleatoric uncertainty estimation.
We propose a generalized AuxUE scheme for more robust uncertainty quantification on regression tasks.
arXiv Detail & Related papers (2023-08-17T15:54:11Z)
- NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks [151.03112356092575]
We show the principled way to measure the uncertainty of predictions for a classifier based on Nadaraya-Watson's nonparametric estimate of the conditional label distribution.
We demonstrate the strong performance of the method in uncertainty estimation tasks on a variety of real-world image datasets.
arXiv Detail & Related papers (2022-02-07T12:30:45Z)
- Dense Uncertainty Estimation via an Ensemble-based Conditional Latent Variable Model [68.34559610536614]
We argue that the aleatoric uncertainty is an inherent attribute of the data and can only be correctly estimated with an unbiased oracle model.
We propose a new sampling and selection strategy at train time to approximate the oracle model for aleatoric uncertainty estimation.
Our results show that our solution achieves both accurate deterministic results and reliable uncertainty estimation.
arXiv Detail & Related papers (2021-11-22T08:54:10Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when using them in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- CertainNet: Sampling-free Uncertainty Estimation for Object Detection [65.28989536741658]
Estimating the uncertainty of a neural network plays a fundamental role in safety-critical settings.
In this work, we propose a novel sampling-free uncertainty estimation method for object detection.
We call it CertainNet, and it is the first to provide separate uncertainties for each output signal: objectness, class, location and size.
arXiv Detail & Related papers (2021-10-04T17:59:31Z)
- Uncertainty Intervals for Graph-based Spatio-Temporal Traffic Prediction [0.0]
We propose a Spatio-Temporal neural network that is trained to estimate a density given the measurements of previous timesteps, conditioned on a quantile.
Our method of density estimation is fully parameterised by our neural network and does not use a likelihood approximation internally.
This approach produces uncertainty estimates without the need to sample during inference, such as in Monte Carlo Dropout.
arXiv Detail & Related papers (2020-12-09T18:02:26Z)
- Learnable Uncertainty under Laplace Approximations [65.24701908364383]
We develop a formalism to explicitly "train" the uncertainty in a decoupled way to the prediction itself.
We show that such units can be trained via an uncertainty-aware objective, improving standard Laplace approximations' performance.
arXiv Detail & Related papers (2020-10-06T13:43:33Z)
- Towards Better Performance and More Explainable Uncertainty for 3D Object Detection of Autonomous Vehicles [33.0319422469465]
We propose a novel form of the loss function to increase the performance of LiDAR-based 3D object detection.
With the new loss function, the performance of our method on the val split of KITTI dataset shows up to a 15% increase in terms of Average Precision.
arXiv Detail & Related papers (2020-06-22T05:49:58Z)
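Several entries above position themselves against sampling-based baselines such as deep ensembles and Monte Carlo Dropout. As background, here is a hedged sketch of both baselines; these are generic NumPy stand-ins, not any listed paper's implementation.

```python
# Generic sketches of two sampling-based uncertainty baselines.
import numpy as np

def ensemble_predict(models, x):
    """Deep-ensemble baseline: average M independently trained models;
    the variance across members is a common uncertainty proxy."""
    preds = np.stack([m(x) for m in models])   # (M, ...) predictions
    return preds.mean(axis=0), preds.var(axis=0)

def mc_dropout_predict(forward, x, T, rng):
    """Monte Carlo Dropout baseline: keep dropout active at test time
    and run T stochastic passes; forward(x, rng) is one such pass.
    The spread of the samples approximates predictive uncertainty."""
    samples = np.stack([forward(x, rng) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

def dropout_linear(w, p, x, rng):
    """Toy stochastic layer: zero weights with probability p and
    rescale the survivors by 1/(1-p) (inverted dropout)."""
    mask = rng.random(w.shape) >= p
    return (w * mask / (1.0 - p)) @ x
```

Both baselines need multiple forward passes (or multiple trained models) at inference time; the sampling-free methods listed above, such as CertainNet or the quantile-conditioned density network, aim to avoid exactly that cost.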
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.