Optimized Loss Functions for Object detection: A Case Study on Nighttime
Vehicle Detection
- URL: http://arxiv.org/abs/2011.05523v2
- Date: Wed, 18 Nov 2020 10:34:48 GMT
- Title: Optimized Loss Functions for Object detection: A Case Study on Nighttime
Vehicle Detection
- Authors: Shang Jiang, Haoran Qin, Bingli Zhang, Jieyu Zheng
- Abstract summary: In this paper, we optimize the loss functions for classification and localization simultaneously.
Compared to existing studies, in which the correlation is only applied to improve the localization accuracy for positive samples, this paper utilizes the correlation to mine the truly hard negative samples.
A novel localization loss named MIoU is proposed by incorporating the Mahalanobis distance between the predicted box and the target box, which eliminates the gradient inconsistency problem in the DIoU loss.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The loss function is a crucial factor affecting detection precision
in object detection tasks. In this paper, we optimize the loss functions for
classification and localization simultaneously. Firstly, by multiplying an
IoU-based coefficient with the standard cross-entropy loss in the
classification loss function, the correlation between localization and
classification is established. Compared to existing studies, in which the
correlation is only applied to improve the localization accuracy for positive
samples, this paper utilizes the correlation to mine the truly hard negative
samples and aims to decrease the misclassification rate for negative samples.
Besides, a novel localization loss named MIoU is proposed by incorporating the
Mahalanobis distance between the predicted box and the target box, which
eliminates the gradient inconsistency problem in the DIoU loss and further
improves the localization accuracy. Finally, extensive experiments on nighttime
vehicle detection have been conducted on two datasets. Our results show that,
when trained with the proposed loss functions, the detection performance is
markedly improved. The source code and trained models are available at
https://github.com/therebellll/NegIoU-PosIoU-Miou.
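
The abstract describes the two ideas without reproducing their formulas: an IoU-based coefficient that re-weights the cross-entropy of negative samples so that high-overlap negatives are treated as hard negatives, and an MIoU regression loss that replaces the Euclidean center distance of DIoU (L_DIoU = 1 - IoU + rho^2(b, b_gt)/c^2, with c the diagonal of the smallest enclosing box) by a Mahalanobis distance. The sketch below is a hypothetical PyTorch rendering of those ideas; the function names, the (1 + IoU)^gamma coefficient, and the diagonal covariance built from the target box size are illustrative assumptions, not the authors' exact definitions.

import torch

def pairwise_iou(pred, target, eps=1e-7):
    # IoU between corresponding boxes, both given as (x1, y1, x2, y2) tensors of shape (N, 4).
    x1 = torch.max(pred[:, 0], target[:, 0])
    y1 = torch.max(pred[:, 1], target[:, 1])
    x2 = torch.min(pred[:, 2], target[:, 2])
    y2 = torch.min(pred[:, 3], target[:, 3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    return inter / (area_p + area_t - inter + eps)

def iou_weighted_neg_ce(neg_scores, neg_iou, gamma=1.0, eps=1e-7):
    # Cross entropy for negative samples (background target), up-weighted by the IoU each
    # negative anchor has with its best-matching ground-truth box, so that high-overlap
    # negatives -- the "truly hard" ones -- dominate the classification loss.
    # The coefficient (1 + IoU)^gamma is an assumed form, not the paper's definition.
    weight = (1.0 + neg_iou) ** gamma
    ce = -torch.log((1.0 - neg_scores).clamp(min=eps))  # neg_scores: predicted foreground prob.
    return (weight * ce).mean()

def miou_like_loss(pred, target, eps=1e-7):
    # DIoU-style regression loss where the Euclidean center distance is replaced by a
    # Mahalanobis-like distance with a diagonal covariance built from the target box
    # width and height (an assumed stand-in for the paper's MIoU formulation).
    iou = pairwise_iou(pred, target, eps)
    cxp, cyp = (pred[:, 0] + pred[:, 2]) / 2, (pred[:, 1] + pred[:, 3]) / 2
    cxt, cyt = (target[:, 0] + target[:, 2]) / 2, (target[:, 1] + target[:, 3]) / 2
    wt = (target[:, 2] - target[:, 0]).clamp(min=eps)
    ht = (target[:, 3] - target[:, 1]).clamp(min=eps)
    d2 = ((cxp - cxt) / wt) ** 2 + ((cyp - cyt) / ht) ** 2
    return (1.0 - iou + d2).mean()

In practice these terms would be added to a detector's standard positive-sample classification and regression losses; the official implementation in the repository above should be treated as authoritative.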
Related papers
- YOLO-ELA: Efficient Local Attention Modeling for High-Performance Real-Time Insulator Defect Detection [0.0]
Existing detection methods for insulator defect identification from unmanned aerial vehicles struggle with complex background scenes and small objects.
This paper proposes a new attention-based foundation architecture, YOLO-ELA, to address this issue.
Experimental results on high-resolution UAV images show that our method achieved a state-of-the-art performance of 96.9% mAP0.5 and a real-time detection speed of 74.63 frames per second.
arXiv Detail & Related papers (2024-10-15T16:00:01Z)
- Binary Losses for Density Ratio Estimation [2.512309434783062]
Estimating the ratio of two probability densities is a central problem in machine learning and statistics.
We provide a simple recipe for constructing loss functions with certain properties, such as loss functions that prioritize an accurate estimation of large values.
This contrasts with classical loss functions, such as the logistic loss or boosting loss, which prioritize accurate estimation of small values.
arXiv Detail & Related papers (2024-07-01T15:24:34Z)
- Domain Adaptive Synapse Detection with Weak Point Annotations [63.97144211520869]
We present AdaSyn, a framework for domain adaptive synapse detection with weak point annotations.
In the WASPSYN challenge at ISBI 2023, our method ranked first place.
arXiv Detail & Related papers (2023-08-31T05:05:53Z)
- Bridging Precision and Confidence: A Train-Time Loss for Calibrating Object Detection [58.789823426981044]
We propose a novel auxiliary loss formulation that aims to align the class confidence of bounding boxes with the accuracy of predictions.
Our results reveal that our train-time loss surpasses strong calibration baselines in reducing calibration error for both in and out-domain scenarios.
arXiv Detail & Related papers (2023-03-25T08:56:21Z)
- Adaptive Self-supervision Algorithms for Physics-informed Neural Networks [59.822151945132525]
Physics-informed neural networks (PINNs) incorporate physical knowledge from the problem domain as a soft constraint on the loss function.
We study the impact of the location of the collocation points on the trainability of these models.
We propose a novel adaptive collocation scheme which progressively allocates more collocation points to areas where the model is making higher errors.
arXiv Detail & Related papers (2022-07-08T18:17:06Z)
- SIoU Loss: More Powerful Learning for Bounding Box Regression [0.0]
The SIoU loss function was suggested, in which the penalty metrics were redefined considering the angle of the vector between the desired regression.
Applied to conventional neural networks and datasets, SIoU is shown to improve both the speed of training and the accuracy of inference.
arXiv Detail & Related papers (2022-05-25T12:46:21Z)
- The KFIoU Loss for Rotated Object Detection [115.334070064346]
In this paper, we argue that one effective alternative is to devise an approximate loss that can achieve trend-level alignment with the SkewIoU loss.
Specifically, we model the objects as Gaussian distributions and adopt a Kalman filter to inherently mimic the mechanism of SkewIoU.
The resulting new loss called KFIoU is easier to implement and works better compared with exact SkewIoU.
arXiv Detail & Related papers (2022-01-29T10:54:57Z)
- Reconcile Prediction Consistency for Balanced Object Detection [10.61438063305309]
We propose a Harmonic loss to harmonize the optimization of the classification branch and the localization branch.
The Harmonic loss enables these two branches to supervise and promote each other during training.
To prevent the localization loss from being dominated by outliers during the training phase, a Harmonic IoU loss is proposed to harmonize the weights of the localization loss for samples at different IoU levels.
arXiv Detail & Related papers (2021-08-24T15:52:11Z)
- Loss Function Discovery for Object Detection via Convergence-Simulation Driven Search [101.73248560009124]
We propose an effective convergence-simulation driven evolutionary search algorithm, CSE-Autoloss, for speeding up the search progress.
We conduct extensive evaluations of loss function search on popular detectors and validate the good generalization capability of searched losses.
Our experiments show that the best-discovered loss function combinations outperform default combinations by 1.1% and 0.8% in terms of mAP for two-stage and one-stage detectors.
arXiv Detail & Related papers (2021-02-09T08:34:52Z)
- Learning a Unified Sample Weighting Network for Object Detection [113.98404690619982]
Region sampling or weighting is significantly important to the success of modern region-based object detectors.
We argue that sample weighting should be data-dependent and task-dependent.
We propose a unified sample weighting network to predict a sample's task weights.
arXiv Detail & Related papers (2020-06-11T16:19:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.