Object Detection as Probabilistic Set Prediction
- URL: http://arxiv.org/abs/2203.07980v2
- Date: Wed, 16 Mar 2022 15:54:13 GMT
- Title: Object Detection as Probabilistic Set Prediction
- Authors: Georg Hess, Christoffer Petersson, Lennart Svensson
- Abstract summary: We present a proper scoring rule for evaluating and training probabilistic object detectors.
Our results indicate that the training of existing detectors is optimized toward non-probabilistic metrics.
- Score: 3.7599363231894176
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate uncertainty estimates are essential for deploying deep object
detectors in safety-critical systems. The development and evaluation of
probabilistic object detectors have been hindered by shortcomings in existing
performance measures, which tend to involve arbitrary thresholds or limit the
detector's choice of distributions. In this work, we propose to view object
detection as a set prediction task where detectors predict the distribution
over the set of objects. Using the negative log-likelihood for random finite
sets, we present a proper scoring rule for evaluating and training
probabilistic object detectors. The proposed method can be applied to existing
probabilistic detectors, is free from thresholds, and enables fair comparison
between architectures. Three different types of detectors are evaluated on the
COCO dataset. Our results indicate that the training of existing detectors is
optimized toward non-probabilistic metrics. We hope to encourage the
development of new object detectors that can accurately estimate their own
uncertainty. Code will be released.
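To make the set-prediction view concrete, the following is a minimal sketch of a set negative log-likelihood under a simplified multi-Bernoulli model: each prediction carries an existence probability and an independent Gaussian density per box coordinate. This is an illustrative toy under stated assumptions, not the paper's exact random-finite-set likelihood; all function and variable names are invented here.

```python
import numpy as np

def gaussian_logpdf(y, mu, var):
    # log-density of independent Gaussians, one per box coordinate
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (y - mu) ** 2 / var)

def set_nll(preds, gts, matches):
    """Negative log-likelihood of a matched set prediction.
    preds:   list of (p_exist, mu, var), mu and var of shape (4,)
    gts:     ground-truth boxes, shape (n_gt, 4)
    matches: dict mapping prediction index -> ground-truth index;
             unmatched predictions are treated as false positives
    """
    nll = 0.0
    for i, (p, mu, var) in enumerate(preds):
        if i in matches:
            # matched: the object exists and its box follows the predicted density
            nll -= np.log(p) + gaussian_logpdf(gts[matches[i]], mu, var)
        else:
            # unmatched: penalize the predicted existence probability
            nll -= np.log1p(-p)
    return nll
```

Note how the score is threshold-free: every prediction contributes through either its matched density or its existence probability, so both localization quality and confidence calibration affect the value.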
Related papers
- Uncertainty-Aware AB3DMOT by Variational 3D Object Detection [74.8441634948334]
Uncertainty estimation is an effective tool to provide statistically accurate predictions.
In this paper, we propose a Variational Neural Network-based TANet 3D object detector to generate 3D object detections with uncertainty.
arXiv Detail & Related papers (2023-02-12T14:30:03Z)
- A Review of Uncertainty Calibration in Pretrained Object Detectors [5.440028715314566]
We investigate the uncertainty calibration properties of different pretrained object detection architectures in a multi-class setting.
We propose a framework to ensure a fair, unbiased, and repeatable evaluation.
We deliver novel insights into why poor detector calibration emerges.
arXiv Detail & Related papers (2022-10-06T14:06:36Z)
- GLENet: Boosting 3D Object Detectors with Generative Label Uncertainty Estimation [70.75100533512021]
In this paper, we formulate the label uncertainty problem as the diversity of potentially plausible bounding boxes of objects.
We propose GLENet, a generative framework adapted from conditional variational autoencoders, to model the one-to-many relationship between a typical 3D object and its potential ground-truth bounding boxes with latent variables.
The label uncertainty generated by GLENet is a plug-and-play module and can be conveniently integrated into existing deep 3D detectors.
arXiv Detail & Related papers (2022-07-06T06:26:17Z)
- Trajectory Forecasting from Detection with Uncertainty-Aware Motion Encoding [121.66374635092097]
Trajectories obtained from object detection and tracking are inevitably noisy.
We propose a trajectory predictor directly based on detection results without relying on explicitly formed trajectories.
arXiv Detail & Related papers (2022-02-03T09:09:56Z)
- CertainNet: Sampling-free Uncertainty Estimation for Object Detection [65.28989536741658]
Estimating the uncertainty of a neural network plays a fundamental role in safety-critical settings.
In this work, we propose a novel sampling-free uncertainty estimation method for object detection.
We call it CertainNet, and it is the first to provide separate uncertainties for each output signal: objectness, class, location and size.
arXiv Detail & Related papers (2021-10-04T17:59:31Z)
- Learning Uncertainty For Safety-Oriented Semantic Segmentation In Autonomous Driving [77.39239190539871]
We show how uncertainty estimation can be leveraged to enable safety critical image segmentation in autonomous driving.
We introduce a new uncertainty measure based on disagreeing predictions as measured by a dissimilarity function.
We show experimentally that our proposed approach is much less computationally intensive at inference time than competing methods.
arXiv Detail & Related papers (2021-05-28T09:23:05Z)
- Estimating and Evaluating Regression Predictive Uncertainty in Deep Object Detectors [9.273998041238224]
We show that training variance networks with negative log likelihood (NLL) can lead to high entropy predictive distributions.
We propose to use the energy score as a non-local proper scoring rule and find that when used for training, the energy score leads to better calibrated and lower entropy predictive distributions.
arXiv Detail & Related papers (2021-01-13T12:53:54Z)
- Labels Are Not Perfect: Inferring Spatial Uncertainty in Object Detection [26.008419879970365]
In this work, we infer the uncertainty in bounding box labels from LiDAR point clouds based on a generative model.
Comprehensive experiments show that the proposed model reflects complex environmental noises in LiDAR perception and the label quality.
We propose Jaccard IoU as a new evaluation metric that extends IoU by incorporating label uncertainty.
arXiv Detail & Related papers (2020-12-18T09:11:44Z)
- Learning to Separate Clusters of Adversarial Representations for Robust Adversarial Detection [50.03939695025513]
We propose a new probabilistic adversarial detector motivated by a recently introduced non-robust feature.
In this paper, we consider non-robust features a common property of adversarial examples and deduce that a cluster corresponding to this property can be found in representation space.
This idea leads us to estimate the probability distribution of adversarial representations in a separate cluster and to leverage that distribution for a likelihood-based adversarial detector.
arXiv Detail & Related papers (2020-12-07T07:21:18Z)
- A Review and Comparative Study on Probabilistic Object Detection in Autonomous Driving [14.034548457000884]
Capturing uncertainty in object detection is indispensable for safe autonomous driving.
However, there is no comprehensive summary of uncertainty estimation in deep object detection.
This paper provides a review and comparative study on existing probabilistic object detection methods.
arXiv Detail & Related papers (2020-11-20T22:30:36Z)
- Inferring Spatial Uncertainty in Object Detection [35.28872968233385]
We propose a generative model to estimate bounding box label uncertainties from LiDAR point clouds.
Comprehensive experiments show that the proposed model represents uncertainties commonly seen in driving scenarios.
We propose an extension of IoU, called the Jaccard IoU (JIoU), as a new evaluation metric that incorporates label uncertainty.
arXiv Detail & Related papers (2020-03-07T19:29:43Z)
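Among the alternatives to NLL surveyed above, the energy score (used as a non-local proper scoring rule in the "Estimating and Evaluating Regression Predictive Uncertainty in Deep Object Detectors" entry) has a standard definition, ES(P, y) = E||X − y|| − ½ E||X − X′|| with X, X′ ~ P i.i.d., and a simple Monte Carlo estimator. The sketch below is illustrative only; the function name and sampling setup are assumptions, not taken from that paper.

```python
import numpy as np

def energy_score(samples, y):
    """Monte Carlo estimate of the energy score
    ES(P, y) = E||X - y|| - 0.5 * E||X - X'||, with X, X' ~ P i.i.d.
    samples: draws from the predictive distribution P, shape (n, d)
    y:       observed regression target, shape (d,)
    """
    # accuracy term: expected distance from samples to the observation
    term_acc = np.mean(np.linalg.norm(samples - y, axis=1))
    # spread term: pair each sample with its neighbour as a cheap
    # stand-in for independent pairs (neighbouring draws are i.i.d.)
    term_spread = np.mean(np.linalg.norm(samples - np.roll(samples, 1, axis=0), axis=1))
    return term_acc - 0.5 * term_spread
```

A well-calibrated predictive distribution centered on the target scores lower (better) than an overconfident distribution centered far from it, which is the behavior that entry reports exploiting during training.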
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.