Balance-Oriented Focal Loss with Linear Scheduling for Anchor Free
Object Detection
- URL: http://arxiv.org/abs/2012.13763v1
- Date: Sat, 26 Dec 2020 15:24:03 GMT
- Title: Balance-Oriented Focal Loss with Linear Scheduling for Anchor Free
Object Detection
- Authors: Hopyong Gil, Sangwoo Park, Yusang Park, Wongoo Han, Juyean Hong,
Juneyoung Jung
- Abstract summary: We propose Balance-oriented focal loss that can induce balanced learning by considering both background and foreground balance.
By improving the focal loss in terms of balancing foreground classes, our method achieves an AP gain of +1.2 on MS-COCO for an anchor-free real-time detector.
- Score: 1.69146632099647
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most existing object detectors suffer from class imbalance problems that
hinder balanced performance. Anchor-free object detectors in particular must
simultaneously solve the background imbalance problem, which stems from their
per-pixel prediction scheme, and the foreground imbalance problem. In this work,
we propose Balance-oriented focal loss, which induces balanced learning by
considering background and foreground balance comprehensively. We target the
common setting of an anchor-free detector trained with focal loss on generally
unbalanced data with a non-extreme distribution, excluding few-shot regimes. To
handle this imbalance carefully, we use a batch-wise alpha-balanced variant of
the focal loss: a simple, practical solution for general unbalanced data that
relies on re-weighting alone. It requires neither additional training cost nor
structural changes at inference time, and no grouping of classes is needed.
Through extensive experiments, we show the performance improvement contributed
by each component and analyze the effect of linearly scheduling the
re-weighting applied to the loss. By improving the focal loss in terms of
balancing foreground classes, our method achieves an AP gain of +1.2 on MS-COCO
for an anchor-free real-time detector.
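To make the recipe concrete, here is a minimal PyTorch sketch of a batch-wise alpha-balanced focal loss with linear scheduling. It follows the abstract only: the inverse-frequency form of the per-class alpha, the mean-1 normalisation, and the epoch-linear ramp are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def batchwise_alpha_focal_loss(logits, targets, num_classes,
                               epoch, total_epochs, gamma=2.0):
    """Sigmoid focal loss whose per-class alpha weights are recomputed from
    the current batch and blended in with a linear schedule over training.

    logits:  (N, C) raw class scores for N foreground samples
    targets: (N,)   integer class labels
    """
    onehot = F.one_hot(targets, num_classes).float()

    # Batch-wise foreground balance: inverse-frequency alpha per class,
    # normalised so the average weight stays at 1.
    freq = onehot.sum(dim=0) + 1.0                 # +1 avoids division by zero
    alpha = freq.sum() / (num_classes * freq)
    alpha = alpha / alpha.mean()

    # Linear scheduling: start from uniform weights and ramp the
    # re-weighting in as training progresses (t goes from 0 to 1).
    t = min(epoch / max(total_epochs, 1), 1.0)
    alpha = (1.0 - t) * torch.ones_like(alpha) + t * alpha

    # Standard focal term handles the background (easy/hard) balance.
    probs = torch.sigmoid(logits)
    pt = probs * onehot + (1.0 - probs) * (1.0 - onehot)
    ce = F.binary_cross_entropy_with_logits(logits, onehot, reduction="none")
    return (alpha.unsqueeze(0) * (1.0 - pt) ** gamma * ce).mean()
```

A call such as `batchwise_alpha_focal_loss(cls_logits, labels, num_classes=80, epoch=e, total_epochs=90)` would drop in wherever the plain focal loss is computed. Because everything is derived from the batch itself, there is no extra learnable state, no class grouping, and no change at inference time, consistent with the abstract's claims.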
Related papers
- Rethinking the Bias of Foundation Model under Long-tailed Distribution [18.80942166783087]
We identify two imbalance biases that foundation models inherit on downstream tasks: parameter imbalance and data imbalance.
During fine-tuning, we observe that parameter imbalance plays a more critical role, while data imbalance can be mitigated using existing re-balancing strategies.
We propose a novel backdoor adjustment method that learns the true causal effect between input samples and labels.
arXiv Detail & Related papers (2025-01-27T11:00:19Z)
- Gradient-based Sampling for Class Imbalanced Semi-supervised Object Detection [111.0991686509715]
We study the class imbalance problem for semi-supervised object detection (SSOD) under more challenging scenarios.
We propose a simple yet effective gradient-based sampling framework that tackles the class imbalance problem from the perspective of two types of confirmation biases.
Experiments on three proposed sub-tasks, namely MS-COCO, MS-COCO to Object365 and LVIS, suggest that our method outperforms current class imbalanced object detectors by clear margins.
arXiv Detail & Related papers (2024-03-22T11:30:10Z)
- Class Imbalance in Object Detection: An Experimental Diagnosis and Study of Mitigation Strategies [0.5439020425818999]
This study introduces a benchmarking framework utilizing the YOLOv5 single-stage detector to address the problem of foreground-foreground class imbalance.
We scrutinized three established techniques: sampling, loss weighting, and data augmentation.
Our comparative analysis reveals that sampling and loss re-weighting methods, while beneficial in two-stage detector settings, do not translate as effectively into improving YOLOv5's performance.
arXiv Detail & Related papers (2024-03-11T19:06:04Z)
- Few-shot $\mathbf{1/a}$ Anomalies Feedback : Damage Vision Mining Opportunity and Embedding Feature Imbalance [0.0]
Imbalanced data problems can be categorised into four types: a missing range of target and label variables, majority-minority class imbalance, foreground-background spatial imbalance, and long-tailed pixel-wise class imbalance.
In this study, we highlight a one-class anomaly detection application, deciding whether an instance is anomalous or not, and demonstrate clear examples of imbalanced vision datasets.
arXiv Detail & Related papers (2023-07-24T10:30:54Z)
- Learning to Re-weight Examples with Optimal Transport for Imbalanced Classification [74.62203971625173]
Imbalanced data pose challenges for deep learning based classification models.
One of the most widely-used approaches for tackling imbalanced data is re-weighting.
We propose a novel re-weighting method based on optimal transport (OT) from a distributional point of view.
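As a rough illustration of the distributional idea, one can learn sample weights by pushing the weighted training batch toward a balanced reference set under an entropic OT cost. The features, L2 cost, Adam optimiser, and simplex parameterisation below are assumptions for the sketch, not the paper's exact algorithm.

```python
import torch

def sinkhorn_cost(cost, a, b, eps=0.1, n_iters=50):
    """Entropic OT cost <plan, cost> for marginals a, b; differentiable in a."""
    K = torch.exp(-cost / eps)
    u = torch.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.t() @ u + 1e-9)
        u = a / (K @ v + 1e-9)
    plan = u.unsqueeze(1) * K * v.unsqueeze(0)   # diag(u) K diag(v)
    return (plan * cost).sum()

def learn_ot_weights(train_feats, balanced_feats, n_steps=200, lr=0.1):
    """Learn per-sample weights so the weighted training batch moves close,
    in OT distance, to a small balanced reference batch."""
    n, m = train_feats.size(0), balanced_feats.size(0)
    cost = torch.cdist(train_feats, balanced_feats)   # pairwise L2 costs
    logits = torch.zeros(n, requires_grad=True)       # weight parameters
    b = torch.full((m,), 1.0 / m)                     # balanced target mass
    opt = torch.optim.Adam([logits], lr=lr)
    for _ in range(n_steps):
        a = torch.softmax(logits, dim=0)              # weights on a simplex
        loss = sinkhorn_cost(cost, a, b)
        opt.zero_grad()
        loss.backward()                               # unrolled Sinkhorn loop
        opt.step()
    return (torch.softmax(logits, dim=0) * n).detach()  # mean-1 weights
```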
arXiv Detail & Related papers (2022-08-05T01:23:54Z)
- Phased Progressive Learning with Coupling-Regulation-Imbalance Loss for Imbalanced Classification [11.673344551762822]
Deep neural networks generally perform poorly with datasets that suffer from quantity imbalance and classification difficulty imbalance between different classes.
We propose a phased progressive learning schedule that smoothly transfers the training emphasis from representation learning to upper-classifier training.
Our code will be open source soon.
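Until that code is released, the phase transfer can be pictured with a small sketch: blend a plain cross-entropy (representation phase) into a class-balanced one (classifier phase) with a ramped coefficient. The linear ramp and inverse-frequency weights are assumptions for illustration, not the paper's coupling-regulation-imbalance loss.

```python
import torch
import torch.nn.functional as F

def phased_progressive_loss(logits, targets, class_counts,
                            epoch, e_start, e_end):
    """Blend plain cross-entropy (representation phase) with class-balanced
    cross-entropy (classifier phase); the coefficient t ramps from 0 to 1
    between epochs e_start and e_end."""
    t = min(max((epoch - e_start) / max(e_end - e_start, 1), 0.0), 1.0)
    plain = F.cross_entropy(logits, targets)
    # Inverse-frequency class weights for the re-balanced phase.
    weights = class_counts.sum() / (class_counts.numel() * class_counts.float())
    balanced = F.cross_entropy(logits, targets, weight=weights)
    return (1.0 - t) * plain + t * balanced
```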
arXiv Detail & Related papers (2022-05-24T14:46:39Z)
- Neural Collapse Inspired Attraction-Repulsion-Balanced Loss for Imbalanced Learning [97.81549071978789]
We propose Attraction-Repulsion-Balanced Loss (ARB-Loss) to balance the different components of the gradients.
We perform experiments on the large-scale classification and segmentation datasets and our ARB-Loss can achieve state-of-the-art performance.
arXiv Detail & Related papers (2022-04-19T08:23:23Z)
- Scale-Equivalent Distillation for Semi-Supervised Object Detection [57.59525453301374]
Recent Semi-Supervised Object Detection (SS-OD) methods are mainly based on self-training, in which a teacher model generates hard pseudo-labels on unlabeled data as supervisory signals.
We analyze the challenges these methods meet with the empirical experiment results.
We introduce a novel approach, Scale-Equivalent Distillation (SED), which is a simple yet effective end-to-end knowledge distillation framework robust to large object size variance and class imbalance.
arXiv Detail & Related papers (2022-03-23T07:33:37Z)
- AutoBalance: Optimized Loss Functions for Imbalanced Data [38.64606886588534]
We propose AutoBalance, a bi-level optimization framework that automatically designs a training loss function to optimize a blend of accuracy and fairness-seeking objectives.
Specifically, a lower-level problem trains the model weights, and an upper-level problem tunes the loss function by monitoring and optimizing the desired objective over the validation data.
Our loss design enables personalized treatment for classes/groups by employing a parametric cross-entropy loss and individualized data augmentation schemes.
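The two levels can be sketched on a linear classifier with a one-step (first-order) virtual-step approximation; this trick and the per-class softmax parameterisation of the loss are simplifications for illustration, not AutoBalance's exact method.

```python
import torch
import torch.nn.functional as F

def bilevel_step(W, loss_params, x_tr, y_tr, x_va, y_va,
                 inner_lr=0.1, outer_lr=0.05):
    """One round of the two-level recipe on a linear classifier W (d x C).

    Lower level: virtual SGD step on the train loss under a parametric
    per-class weighted cross-entropy. Upper level: gradient step on the
    loss parameters using the validation loss at the virtual weights.
    """
    w = torch.softmax(loss_params, dim=0) * loss_params.numel()
    train_loss = F.cross_entropy(x_tr @ W, y_tr, weight=w)
    (gW,) = torch.autograd.grad(train_loss, [W], create_graph=True)
    W_virtual = W - inner_lr * gW            # still differentiable in loss_params

    val_loss = F.cross_entropy(x_va @ W_virtual, y_va)  # upper-level objective
    (g_lp,) = torch.autograd.grad(val_loss, [loss_params])
    with torch.no_grad():
        loss_params -= outer_lr * g_lp       # tune the loss function
        W.copy_(W_virtual)                   # commit the lower-level update
```

Here `W` and `loss_params` are leaf tensors created with `requires_grad=True`, and the validation split should be (near-)balanced so the upper level actually optimises the fairness-seeking objective.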
arXiv Detail & Related papers (2022-01-04T15:53:23Z)
- Towards Balanced Learning for Instance Recognition [149.76724446376977]
We propose Libra R-CNN, a framework towards balanced learning for instance recognition.
It integrates IoU-balanced sampling (sketched below), a balanced feature pyramid, and objective re-weighting to reduce imbalance at the sample, feature, and objective levels, respectively.
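Of the three components, IoU-balanced sampling is the easiest to sketch: split the negative proposals into IoU bins and draw evenly from each, so hard negatives (higher IoU) are not drowned out by the many easy ones. The bin count and threshold below are illustrative defaults, not the paper's tuned values.

```python
import torch

def iou_balanced_sample(ious, num_samples, num_bins=3, neg_thresh=0.5):
    """Sample negative proposals evenly across IoU bins in [0, neg_thresh),
    rather than uniformly over all negatives."""
    per_bin = max(num_samples // num_bins, 1)
    edges = torch.linspace(0.0, neg_thresh, num_bins + 1)
    picked = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = torch.nonzero((ious >= lo) & (ious < hi)).flatten()
        if in_bin.numel() > 0:
            perm = torch.randperm(in_bin.numel())[:per_bin]
            picked.append(in_bin[perm])
    return torch.cat(picked) if picked else ious.new_empty(0, dtype=torch.long)
```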
arXiv Detail & Related papers (2021-08-23T13:40:45Z)
- Enhanced Principal Component Analysis under A Collaborative-Robust Framework [89.28334359066258]
We introduce a general collaborative-robust weight learning framework that combines weight learning and robust loss in a non-trivial way.
Under the proposed framework, only a subset of well-fitting samples is activated and treated as more important during training, while the remaining samples, whose errors are large, are not simply ignored.
In particular, the negative effects of the inactivated samples are alleviated by the robust loss function.
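A loose reading of this scheme in code might look as follows; the top-k activation rule, inverse-error weights, and Huber penalty are guesses at the general shape, not the paper's actual weight-learning formulation.

```python
import torch

def collaborative_robust_loss(errors, k, delta=1.0):
    """Activate the k best-fitting samples with inverse-error weights; keep
    the rest under a Huber penalty so large errors cannot dominate."""
    order = torch.argsort(errors)
    active, inactive = order[:k], order[k:]

    # Activated samples: well-fitting ones get larger (detached) weights,
    # mimicking an alternating weight-learning step.
    w = 1.0 / (errors[active].detach() + 1e-6)
    w = w / w.sum()
    loss = (w * errors[active]).sum()

    if inactive.numel() == 0:
        return loss
    # Inactivated samples: a robust (Huber) penalty caps their influence.
    e = errors[inactive]
    robust = torch.where(e <= delta, 0.5 * e ** 2, delta * (e - 0.5 * delta))
    return loss + robust.mean()
```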
arXiv Detail & Related papers (2021-03-22T15:17:37Z)
- Few-Shot Learning with Class Imbalance [13.60699610822265]
Few-shot learning aims to train models on a limited number of labeled samples given in a support set in order to generalize to unseen samples from a query set.
In the standard setup, the support set contains an equal amount of data points for each class.
We present a detailed study of few-shot class-imbalance along three axes: meta-dataset vs. task imbalance, effect of different imbalance distributions (linear, step, random), and effect of rebalancing techniques.
arXiv Detail & Related papers (2021-01-07T12:54:32Z)