Semi-Supervised Object Detection with Adaptive Class-Rebalancing Self-Training
- URL: http://arxiv.org/abs/2107.05031v1
- Date: Sun, 11 Jul 2021 12:14:42 GMT
- Title: Semi-Supervised Object Detection with Adaptive Class-Rebalancing Self-Training
- Authors: Fangyuan Zhang, Tianxiang Pan, Bin Wang
- Abstract summary: This study delves into semi-supervised object detection to improve detector performance with additional unlabeled data.
We propose a novel two-stage filtering algorithm to generate accurate pseudo-labels.
Our method achieves satisfactory improvements on MS-COCO and VOC benchmarks.
- Score: 5.874575666947381
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study delves into semi-supervised object detection (SSOD) to improve
detector performance with additional unlabeled data. State-of-the-art SSOD
performance has been achieved recently by self-training, in which training
supervision consists of ground truths and pseudo-labels. In current studies, we
observe that class imbalance in SSOD severely impedes the effectiveness of
self-training. To address the class imbalance, we propose adaptive
class-rebalancing self-training (ACRST) with a novel memory module called
CropBank. ACRST adaptively rebalances the training data with foreground
instances extracted from the CropBank, thereby alleviating the class imbalance.
Owing to the high complexity of detection tasks, we observe that both
self-training and data-rebalancing suffer from noisy pseudo-labels in SSOD.
Therefore, we propose a novel two-stage filtering algorithm to generate
accurate pseudo-labels. Our method achieves satisfactory improvements on
MS-COCO and VOC benchmarks. When using only 1% labeled data in MS-COCO, our
method achieves 17.02 mAP improvement over supervised baselines, and 5.32 mAP
improvement compared with state-of-the-art methods.
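As a rough illustration of the rebalancing idea, the sketch below implements a per-class memory of foreground crops that is sampled inversely to class frequency, so rare-class instances can be pasted back into training images. Only the CropBank concept comes from the abstract; the class interface, FIFO eviction, and inverse-frequency sampling are illustrative assumptions, not the paper's actual design.

```python
import random
from collections import defaultdict

class CropBank:
    """Per-class memory of foreground instance crops (illustrative sketch;
    method names and eviction policy are assumptions, not the paper's API)."""

    def __init__(self, max_per_class=200):
        self.bank = defaultdict(list)   # class_id -> stored crops
        self.max_per_class = max_per_class

    def store(self, class_id, crop):
        """Save a foreground crop cut from a ground-truth or pseudo-labeled box."""
        entries = self.bank[class_id]
        entries.append(crop)
        if len(entries) > self.max_per_class:
            entries.pop(0)              # FIFO eviction keeps the bank fresh

    def sample_for_rebalancing(self, class_freq, k=2):
        """Draw crops with probability inversely proportional to class
        frequency, so rare classes are pasted into images more often."""
        classes = [c for c in self.bank if self.bank[c]]
        if not classes:
            return []
        weights = [1.0 / max(class_freq.get(c, 1), 1) for c in classes]
        chosen = random.choices(classes, weights=weights, k=k)
        return [(c, random.choice(self.bank[c])) for c in chosen]
```

In a training loop, the crops returned by sample_for_rebalancing would be pasted into the current batch's images, with their boxes added as labels, before computing the detection loss.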
Related papers
- Co-training for Low Resource Scientific Natural Language Inference [65.37685198688538]
We propose a novel co-training method that assigns weights to the distantly supervised labels based on the training dynamics of the classifiers.
By assigning importance weights instead of filtering out examples based on an arbitrary threshold on the predicted confidence, we maximize the usage of automatically labeled data.
The proposed method obtains an improvement of 1.5% in Macro F1 over the distant supervision baseline, and substantial improvements over several other strong SSL baselines.
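A minimal sketch of the importance-weighting idea, assuming the weights are derived from per-epoch confidence trajectories; the paper's exact functional form may differ:

```python
import numpy as np

def importance_weights(confidence_history):
    """confidence_history: (num_epochs, num_examples) array holding the
    probability each classifier assigned to an example's distant label at
    each epoch. Averaging over epochs is an illustrative choice only."""
    return confidence_history.mean(axis=0)   # shape: (num_examples,)

def weighted_loss(per_example_losses, weights):
    # Soft weighting keeps every automatically labeled example in play,
    # rather than discarding those below a hard confidence threshold.
    return np.sum(weights * per_example_losses) / np.sum(weights)
```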
arXiv Detail & Related papers (2024-06-20T18:35:47Z)
- Adaptive Retention & Correction for Continual Learning [114.5656325514408]
A common problem in continual learning is the classification layer's bias towards the most recent task.
We name our approach Adaptive Retention & Correction (ARC)
ARC achieves average performance increases of 2.7% and 2.6% on the CIFAR-100 and ImageNet-R datasets, respectively.
arXiv Detail & Related papers (2024-05-23T08:43:09Z)
- Incremental Self-training for Semi-supervised Learning [56.57057576885672]
IST is simple yet effective and fits existing self-training-based semi-supervised learning methods.
We verify the proposed IST on five datasets and two types of backbones, effectively improving recognition accuracy and learning speed.
arXiv Detail & Related papers (2024-04-14T05:02:00Z)
- Gradient-based Sampling for Class Imbalanced Semi-supervised Object Detection [111.0991686509715]
We study the class imbalance problem for semi-supervised object detection (SSOD) under more challenging scenarios.
We propose a simple yet effective gradient-based sampling framework that tackles the class imbalance problem from the perspective of two types of confirmation biases.
Experiments on three proposed sub-tasks, namely MS-COCO, MS-COCO to Object365 and LVIS, suggest that our method outperforms current class imbalanced object detectors by clear margins.
arXiv Detail & Related papers (2024-03-22T11:30:10Z)
- Pseudo-label Correction and Learning For Semi-Supervised Object Detection [34.030270359567204]
We propose two strategies, namely pseudo-label correction and noise-unaware learning.
For pseudo-label correction, we introduce a multi-round refining method and a multi-vote weighting method.
For noise-unaware learning, we introduce a loss weight function that is negatively correlated with the Intersection over Union (IoU) in the regression task.
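The noise-unaware regression weight can be sketched directly from the description above; the (1 - IoU)^alpha form is one illustrative decreasing function, not the paper's exact one:

```python
import torch
import torch.nn.functional as F

def regression_loss_weight(iou, alpha=1.0):
    """Weight that decreases as IoU increases, i.e. negatively correlated
    with IoU as the summary states. The (1 - IoU)^alpha form is an
    illustrative assumption."""
    return (1.0 - iou).clamp(min=0.0) ** alpha

def weighted_smooth_l1(pred_boxes, target_boxes, iou):
    # Per-box Smooth L1 regression loss, reweighted by the IoU-based weight.
    per_box = F.smooth_l1_loss(
        pred_boxes, target_boxes, reduction="none"
    ).sum(dim=-1)
    return (regression_loss_weight(iou) * per_box).mean()
```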
arXiv Detail & Related papers (2023-03-06T09:54:15Z)
- Revisiting Pretraining for Semi-Supervised Learning in the Low-Label Regime [15.863530936691157]
Semi-supervised learning (SSL) addresses the lack of labeled data by exploiting large unlabeled data through pseudo-labeling.
Recent studies combined finetuning (FT) from pretrained weights with SSL to mitigate the challenges and claimed superior results in the low-label regime.
arXiv Detail & Related papers (2022-05-06T03:53:25Z)
- Rethinking Pseudo Labels for Semi-Supervised Object Detection [84.697097472401]
We introduce certainty-aware pseudo labels tailored for object detection.
We dynamically adjust the thresholds used to generate pseudo labels and reweight loss functions for each category to alleviate the class imbalance problem.
Our approach improves supervised baselines by up to 10% AP using only 1-10% labeled data from COCO.
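A minimal sketch of per-class dynamic thresholding, assuming an EMA-style update rule; the paper's exact adjustment scheme is not specified in this summary:

```python
import numpy as np

class DynamicClassThresholds:
    """Per-class confidence thresholds adapted online. The EMA update rule
    and the default values are assumptions for illustration."""

    def __init__(self, num_classes, base=0.7, momentum=0.99):
        self.thresholds = np.full(num_classes, base)
        self.momentum = momentum

    def update(self, class_id, confidence):
        # Track recent confidences per class: frequently high-scoring classes
        # drift toward stricter thresholds, while rare classes stay reachable.
        t = self.thresholds[class_id]
        self.thresholds[class_id] = (
            self.momentum * t + (1.0 - self.momentum) * confidence
        )

    def accept(self, class_id, confidence):
        return confidence >= self.thresholds[class_id]
```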
arXiv Detail & Related papers (2021-06-01T01:32:03Z)
- Unbiased Teacher for Semi-Supervised Object Detection [50.0087227400306]
We revisit Semi-Supervised Object Detection (SS-OD) and identify the pseudo-labeling bias issue in SS-OD.
We introduce Unbiased Teacher, a simple yet effective approach that jointly trains a student and a gradually progressing teacher in a mutually-beneficial manner.
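The "gradually progressing teacher" is conventionally realized as an exponential moving average (EMA) of the student's weights; a minimal sketch, with 0.999 as a typical (assumed) keep rate:

```python
import torch

@torch.no_grad()
def ema_update(teacher, student, keep_rate=0.999):
    """EMA update: the teacher gradually progresses by slowly tracking the
    student's weights. EMA is the standard mechanism for such teacher-student
    pairs; 0.999 is a common default, not necessarily the paper's value."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(keep_rate).add_(s_param, alpha=1.0 - keep_rate)
```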
arXiv Detail & Related papers (2021-02-18T17:02:57Z)
- Semi-supervised ASR by End-to-end Self-training [18.725686837244265]
We propose a self-training method with an end-to-end system for semi-supervised ASR.
We iteratively generate pseudo-labels on a mini-batch of unsupervised utterances with the current model, and use the pseudo-labels to augment the supervised data for immediate model update.
Our method gives 14.4% relative WER improvement over a carefully-trained base system with data augmentation, reducing the performance gap between the base system and the oracle system by 50%.
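A schematic of the per-mini-batch loop described above; all names here (model.decode, the batch fields) are hypothetical placeholders rather than the paper's code:

```python
def self_training_step(model, optimizer, loss_fn, labeled_batch, unlabeled_batch):
    """One update mixing supervised utterances with freshly pseudo-labeled
    ones, following the per-mini-batch scheme the summary describes."""
    # 1. Decode the unlabeled mini-batch with the current model.
    pseudo_transcripts = model.decode(unlabeled_batch.audio)   # assumed helper

    # 2. Use the hypotheses as targets alongside the real transcripts.
    audio = labeled_batch.audio + unlabeled_batch.audio
    targets = labeled_batch.transcripts + pseudo_transcripts

    # 3. Immediate model update on the augmented batch.
    optimizer.zero_grad()
    loss = loss_fn(model(audio), targets)
    loss.backward()
    optimizer.step()
    return loss.item()
```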
arXiv Detail & Related papers (2020-01-24T18:22:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.