Prediction Surface Uncertainty Quantification in Object Detection Models for Autonomous Driving
- URL: http://arxiv.org/abs/2107.04991v1
- Date: Sun, 11 Jul 2021 08:31:15 GMT
- Title: Prediction Surface Uncertainty Quantification in Object Detection Models for Autonomous Driving
- Authors: Ferhat Ozgur Catak, Tao Yue, Shaukat Ali
- Abstract summary: Object detection in autonomous cars is commonly based on camera images and Lidar inputs, which are often used to train prediction models.
We propose a novel method called PURE (Prediction sURface uncErtainty) for measuring prediction uncertainty of such regression models.
- Score: 5.784950275336468
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Object detection in autonomous cars is commonly based on camera images and Lidar inputs, which are often used to train prediction models such as deep artificial neural networks for decision making, e.g., object recognition and speed adjustment. A mistake in such decision making can be damaging; thus, it is vital to measure the reliability of decisions made by such prediction models via uncertainty measurement. Uncertainty in deep learning models is often measured for classification problems. However, deep learning models in autonomous driving are often multi-output regression models. Hence, we propose a novel method called PURE (Prediction sURface uncErtainty) for measuring the prediction uncertainty of such regression models. We formulate the object recognition problem as a regression model with more than one output for finding object locations in a 2-dimensional camera view. For evaluation, we modified three widely applied object recognition models (i.e., YOLO, SSD300, and SSD512) and used the KITTI, Stanford Cars, Berkeley DeepDrive, and NEXET datasets. Results showed a statistically significant negative correlation between prediction surface uncertainty and prediction accuracy, suggesting that uncertainty significantly impacts the decisions made by autonomous driving systems.
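The abstract does not spell out how PURE computes the prediction surface, so the sketch below only illustrates the general recipe it builds on: treat the detector as a multi-output regressor, draw several stochastic box predictions per image (here via Monte Carlo dropout), summarise their spread as an uncertainty score, and correlate that score with box accuracy (IoU). The `detector.predict_box` interface, the variance-based score, and the Spearman correlation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import spearmanr

def mc_box_samples(detector, image, n_samples=30):
    # Hypothetical interface: each call runs a stochastic forward pass
    # (e.g. dropout kept active) and returns one (x1, y1, x2, y2) box.
    return np.stack([detector.predict_box(image) for _ in range(n_samples)])

def prediction_uncertainty(box_samples):
    # Proxy for prediction-surface uncertainty: total variance of the
    # regressed box coordinates across the stochastic forward passes.
    return float(box_samples.var(axis=0).sum())

def iou(a, b):
    # Intersection-over-union of two (x1, y1, x2, y2) boxes; used here
    # as the accuracy measure to correlate against uncertainty.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1]) +
             (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / (union + 1e-9)

def uncertainty_accuracy_correlation(detector, images, gt_boxes):
    # Rank correlation between per-image uncertainty and IoU; the abstract
    # reports such a correlation to be significantly negative.
    scores, accs = [], []
    for img, gt in zip(images, gt_boxes):
        samples = mc_box_samples(detector, img)
        scores.append(prediction_uncertainty(samples))
        accs.append(iou(samples.mean(axis=0), gt))
    return spearmanr(scores, accs)
```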
Related papers
- Uncertainty Estimation for 3D Object Detection via Evidential Learning [63.61283174146648]
We introduce a framework for quantifying uncertainty in 3D object detection by leveraging an evidential learning loss on Bird's Eye View representations in the 3D detector.
We demonstrate both the efficacy and importance of these uncertainty estimates on identifying out-of-distribution scenes, poorly localized objects, and missing (false negative) detections.
arXiv Detail & Related papers (2024-10-31T13:13:32Z)
- Estimating Uncertainty with Implicit Quantile Network [0.0]
Uncertainty quantification is an important part of many performance critical applications.
This paper provides a simple alternative to existing approaches such as ensemble learning and Bayesian neural networks.
arXiv Detail & Related papers (2024-08-26T13:33:14Z)
- Uncertainty-Aware AB3DMOT by Variational 3D Object Detection [74.8441634948334]
Uncertainty estimation is an effective tool to provide statistically accurate predictions.
In this paper, we propose a Variational Neural Network-based TANet 3D object detector to generate 3D object detections with uncertainty.
arXiv Detail & Related papers (2023-02-12T14:30:03Z)
- Interpretable Self-Aware Neural Networks for Robust Trajectory Prediction [50.79827516897913]
We introduce an interpretable paradigm for trajectory prediction that distributes the uncertainty among semantic concepts.
We validate our approach on real-world autonomous driving data, demonstrating superior performance over state-of-the-art baselines.
arXiv Detail & Related papers (2022-11-16T06:28:20Z)
- Autoregressive Uncertainty Modeling for 3D Bounding Box Prediction [63.3021778885906]
3D bounding boxes are a widespread intermediate representation in many computer vision applications.
We propose methods for leveraging our autoregressive model to make high confidence predictions and meaningful uncertainty measures.
We release a simulated dataset, COB-3D, which highlights new types of ambiguity that arise in real-world robotics applications.
arXiv Detail & Related papers (2022-10-13T23:57:40Z)
- Probabilistic model-error assessment of deep learning proxies: an application to real-time inversion of borehole electromagnetic measurements [0.0]
We study the effects of the approximate nature of the deep learned models and associated model errors during the inversion of extra-deep borehole electromagnetic (EM) measurements.
Using a deep neural network (DNN) as a forward model allows us to perform thousands of model evaluations within seconds.
We present numerical results highlighting the challenges associated with the inversion of EM measurements while neglecting model error.
arXiv Detail & Related papers (2022-05-25T11:44:48Z)
- Uncertainty estimation of pedestrian future trajectory using Bayesian approximation [137.00426219455116]
Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to quantify uncertainty during forecasting using Bayesian approximation, which deterministic approaches fail to capture.
The effect of dropout weights and long-term prediction on future state uncertainty has been studied.
arXiv Detail & Related papers (2022-05-04T04:23:38Z)
- MDN-VO: Estimating Visual Odometry with Confidence [34.8860186009308]
Visual Odometry (VO) is used in many applications including robotics and autonomous systems.
We propose a deep learning-based VO model to estimate 6-DoF poses, as well as a confidence model for these estimates.
Our experiments show that the proposed model exceeds state-of-the-art performance in addition to detecting failure cases.
arXiv Detail & Related papers (2021-12-23T19:26:04Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative model-based methods, and explain their pros and cons when using them in fully/semi/weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Monte Carlo DropBlock for Modelling Uncertainty in Object Detection [4.406418914680961]
In this work, we propose an efficient and effective approach to model uncertainty in object detection and segmentation tasks.
The proposed approach applies DropBlock at both training and test time to the convolutional layers of deep learning models such as YOLO.
arXiv Detail & Related papers (2021-08-08T11:34:37Z)
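The last entry above describes keeping DropBlock stochastic at inference so that repeated forward passes yield Monte Carlo uncertainty estimates for detectors such as YOLO. Below is a minimal PyTorch sketch of that idea; the `DropBlock2D` layer, the `mc_dropblock_predict` helper, and all hyper-parameters are illustrative assumptions rather than the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropBlock2D(nn.Module):
    # Minimal DropBlock: zero out contiguous block_size x block_size regions
    # of a feature map; deliberately left stochastic at test time.
    def __init__(self, drop_prob=0.1, block_size=5):
        super().__init__()
        self.drop_prob = drop_prob
        self.block_size = block_size

    def forward(self, x):
        if self.drop_prob == 0.0:
            return x
        # Seed probability chosen so the expected dropped area roughly
        # matches drop_prob (simplified from the DropBlock paper).
        gamma = self.drop_prob / (self.block_size ** 2)
        seeds = (torch.rand_like(x[:, :1]) < gamma).float()
        # Grow each seed into a block_size x block_size square.
        blocks = F.max_pool2d(seeds, kernel_size=self.block_size,
                              stride=1, padding=self.block_size // 2)
        keep = 1.0 - blocks
        # Rescale so the expected activation magnitude is preserved.
        return x * keep * keep.numel() / (keep.sum() + 1e-9)

def mc_dropblock_predict(model, image, n_samples=20):
    # Monte Carlo DropBlock: repeat the stochastic forward pass and report
    # the mean prediction plus its variance as an uncertainty estimate.
    preds = torch.stack([model(image) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.var(dim=0)
```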