Rethinking Transformer-based Set Prediction for Object Detection
- URL: http://arxiv.org/abs/2011.10881v2
- Date: Tue, 12 Oct 2021 06:09:03 GMT
- Title: Rethinking Transformer-based Set Prediction for Object Detection
- Authors: Zhiqing Sun, Shengcao Cao, Yiming Yang, Kris Kitani
- Abstract summary: Experimental results show that the proposed methods not only converge much faster than the original DETR, but also significantly outperform DETR and other baselines in terms of detection accuracy.
- Score: 57.7208561353529
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: DETR is a recently proposed Transformer-based method that views object detection as a set prediction problem and achieves state-of-the-art performance, but demands extra-long training time to converge. In this paper, we investigate the causes of the optimization difficulty in the training of DETR. Our examination reveals several factors contributing to the slow convergence of DETR, primarily the issues with the Hungarian loss and the Transformer cross-attention mechanism. To overcome these issues, we propose two solutions, namely TSP-FCOS (Transformer-based Set Prediction with FCOS) and TSP-RCNN (Transformer-based Set Prediction with RCNN). Experimental results show that the proposed methods not only converge much faster than the original DETR, but also significantly outperform DETR and other baselines in terms of detection accuracy.
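
The Hungarian loss mentioned in the abstract starts from a bipartite matching between predicted and ground-truth objects. The sketch below illustrates that matching step using SciPy's `linear_sum_assignment`; the simplified cost (class probability plus a weighted L1 box term), the `5.0` weight, and the `hungarian_match` helper are assumptions made for this example, not the exact formulation of DETR, TSP-FCOS, or TSP-RCNN (the full DETR cost also includes a generalized IoU term).

```python
# Minimal sketch of the bipartite (Hungarian) matching step underlying
# DETR-style set-prediction losses. Cost terms and weights are illustrative
# assumptions, not the paper's exact formulation.
import numpy as np
from scipy.optimize import linear_sum_assignment

def hungarian_match(pred_probs, pred_boxes, gt_labels, gt_boxes):
    """Match each ground-truth object to one prediction.

    pred_probs: (num_queries, num_classes) class probabilities
    pred_boxes: (num_queries, 4) predicted boxes (cx, cy, w, h)
    gt_labels:  (num_gt,) ground-truth class indices
    gt_boxes:   (num_gt, 4) ground-truth boxes
    Returns (pred_indices, gt_indices) of the optimal assignment.
    """
    # Classification cost: negative probability of the correct class.
    cost_class = -pred_probs[:, gt_labels]             # (num_queries, num_gt)
    # Box cost: L1 distance between boxes (GIoU term omitted for brevity).
    cost_bbox = np.abs(pred_boxes[:, None, :] - gt_boxes[None, :, :]).sum(-1)
    cost = cost_class + 5.0 * cost_bbox                # weight is an assumption
    return linear_sum_assignment(cost)                 # Hungarian algorithm

# Toy usage: 4 queries, 3 classes, 2 ground-truth objects.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(3), size=4)
boxes = rng.random((4, 4))
rows, cols = hungarian_match(probs, boxes, np.array([0, 2]), rng.random((2, 4)))
print(list(zip(rows, cols)))  # each ground-truth object paired with one query
```

Because the matching is one-to-one, only the matched predictions receive a box regression loss, while unmatched queries are pushed toward a "no object" class; this is the mechanism the paper identifies as one source of DETR's slow convergence.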