SIoU Loss: More Powerful Learning for Bounding Box Regression
- URL: http://arxiv.org/abs/2205.12740v1
- Date: Wed, 25 May 2022 12:46:21 GMT
- Title: SIoU Loss: More Powerful Learning for Bounding Box Regression
- Authors: Zhora Gevorgyan
- Abstract summary: A new loss function, SIoU, is proposed, in which the penalty metrics are redefined to account for the angle of the vector between the desired and predicted box regressions.
Applied to conventional neural networks and datasets, SIoU is shown to improve both the speed of training and the accuracy of inference.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The effectiveness of Object Detection, one of the central problems in
computer vision tasks, highly depends on the definition of the loss function -
a measure of how accurately your ML model can predict the expected outcome.
Conventional object detection loss functions depend on aggregation of metrics
of bounding box regression such as the distance, overlap area and aspect ratio
of the predicted and ground truth boxes (e.g. GIoU, CIoU, ICIoU, etc.). However,
none of the methods proposed and used to date considers the direction of the
mismatch between the desired ground truth box and the predicted, "experimental" box.
This shortcoming results in slower and less effective convergence, as the
predicted box can "wander around" during training and eventually yield a worse
model. In this paper a new loss function, SIoU, is suggested, in which the
penalty metrics are redefined to take into account the angle of the vector
between the desired and predicted box regressions. Applied to conventional
neural networks and datasets, SIoU is shown to improve both the speed of
training and the accuracy of inference. The effectiveness of the proposed loss
function is demonstrated in a number of simulations and tests.
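To make the idea concrete, the following is a minimal sketch of an SIoU-style loss for a single box pair, combining an IoU term with angle, distance, and shape penalty terms as described in the paper. The exact constants (e.g. the shape exponent `theta = 4` and the epsilon guard) and the `(x1, y1, x2, y2)` box convention are assumptions of this sketch, not taken from the source text.

```python
import math

def siou_loss(pred, gt, theta=4.0, eps=1e-7):
    """Sketch of an SIoU-style loss for boxes in (x1, y1, x2, y2) format.

    Combines 1 - IoU with angle-modulated distance and shape penalties.
    Constants and conventions here are illustrative assumptions.
    """
    px1, py1, px2, py2 = pred
    gx1, gy1, gx2, gy2 = gt

    # IoU term
    iw = max(0.0, min(px2, gx2) - max(px1, gx1))
    ih = max(0.0, min(py2, gy2) - max(py1, gy1))
    inter = iw * ih
    area_p = (px2 - px1) * (py2 - py1)
    area_g = (gx2 - gx1) * (gy2 - gy1)
    iou = inter / (area_p + area_g - inter + eps)

    # box centers and smallest enclosing box
    pcx, pcy = (px1 + px2) / 2, (py1 + py2) / 2
    gcx, gcy = (gx1 + gx2) / 2, (gy1 + gy2) / 2
    cw = max(px2, gx2) - min(px1, gx1)
    ch = max(py2, gy2) - min(py1, gy1)

    # angle cost: accounts for the direction of the center offset vector
    sigma = math.hypot(gcx - pcx, gcy - pcy) + eps
    sin_alpha = min(abs(gcy - pcy) / sigma, 1.0)
    angle = 1 - 2 * math.sin(math.asin(sin_alpha) - math.pi / 4) ** 2

    # distance cost, modulated by the angle cost (gamma = 2 - angle)
    gamma = 2 - angle
    rho_x = ((gcx - pcx) / (cw + eps)) ** 2
    rho_y = ((gcy - pcy) / (ch + eps)) ** 2
    dist = (1 - math.exp(-gamma * rho_x)) + (1 - math.exp(-gamma * rho_y))

    # shape cost: penalizes width/height mismatch
    pw, ph = px2 - px1, py2 - py1
    gw, gh = gx2 - gx1, gy2 - gy1
    omega_w = abs(pw - gw) / (max(pw, gw) + eps)
    omega_h = abs(ph - gh) / (max(ph, gh) + eps)
    shape = (1 - math.exp(-omega_w)) ** theta + (1 - math.exp(-omega_h)) ** theta

    return 1 - iou + (dist + shape) / 2
```

With this formulation, a perfectly matched box pair yields a loss near zero, and the loss grows as the predicted box drifts away from the ground truth, with the angle cost steering the center of the prediction toward the nearest axis first.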