Rethinking Pseudo Labels for Semi-Supervised Object Detection
- URL: http://arxiv.org/abs/2106.00168v1
- Date: Tue, 1 Jun 2021 01:32:03 GMT
- Title: Rethinking Pseudo Labels for Semi-Supervised Object Detection
- Authors: Hengduo Li, Zuxuan Wu, Abhinav Shrivastava, Larry S. Davis
- Abstract summary: We introduce certainty-aware pseudo labels tailored for object detection.
We dynamically adjust the thresholds used to generate pseudo labels and reweight loss functions for each category to alleviate the class imbalance problem.
Our approach improves supervised baselines by up to 10% AP using only 1-10% labeled data from COCO.
- Score: 84.697097472401
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in semi-supervised object detection (SSOD) are largely driven
by consistency-based pseudo-labeling methods for image classification tasks,
producing pseudo labels as supervisory signals. However, such pseudo labels
neglect localization precision and amplify class imbalance, both of which are
critical for detection tasks. In
this paper, we introduce certainty-aware pseudo labels tailored for object
detection, which can effectively estimate the classification and localization
quality of derived pseudo labels. This is achieved by converting conventional
localization into a classification task followed by refinement. Conditioned on
classification and localization quality scores, we dynamically adjust the
thresholds used to generate pseudo labels and reweight loss functions for each
category to alleviate the class imbalance problem. Extensive experiments
demonstrate that our method improves state-of-the-art SSOD performance by 1-2%
and 4-6% AP on COCO and PASCAL VOC, respectively. In the limited-annotation
regime, our approach improves supervised baselines by up to 10% AP using only
1-10% labeled data from COCO.
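To make the mechanism concrete, below is a minimal, hypothetical sketch of certainty-aware pseudo labeling with dynamic per-category thresholds and class-balanced loss reweighting. The function names, the EMA threshold update, and the inverse-frequency weighting are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np

def make_pseudo_labels(boxes, labels, cls_scores, loc_scores, thresholds):
    """Keep detections whose fused certainty clears their category's threshold."""
    certainty = np.sqrt(cls_scores * loc_scores)   # fuse classification and localization quality
    keep = certainty >= thresholds[labels]         # per-category dynamic cutoff
    return boxes[keep], labels[keep], certainty[keep]

def update_thresholds(thresholds, labels, certainty, momentum=0.99):
    """EMA update so each category's threshold tracks its own certainty distribution."""
    new_t = thresholds.copy()
    for c in range(len(thresholds)):
        scores_c = certainty[labels == c]
        if scores_c.size:
            new_t[c] = momentum * thresholds[c] + (1 - momentum) * scores_c.mean()
    return new_t

def class_loss_weights(labels, num_classes):
    """Weight each category's loss inversely to its pseudo-label frequency."""
    counts = np.bincount(labels, minlength=num_classes).astype(float) + 1.0
    return counts.sum() / (num_classes * counts)
```

In a teacher-student setup, the teacher's predictions on unlabeled images would pass through make_pseudo_labels, and the resulting per-class weights would scale the student's loss so that rare categories are not drowned out.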
Related papers
- Towards Adaptive Pseudo-label Learning for Semi-Supervised Temporal Action Localization [10.233225586034665]
Existing methods often filter pseudo labels based on strict conditions, leading to suboptimal pseudo-label ranking and selection.
We propose a novel Adaptive Pseudo-label Learning framework to facilitate better pseudo-label selection.
Our method achieves state-of-the-art performance under various semi-supervised settings.
arXiv Detail & Related papers (2024-07-10T14:00:19Z)
- LayerMatch: Do Pseudo-labels Benefit All Layers? [77.59625180366115]
Semi-supervised learning offers a promising solution to mitigate the dependency on labeled data.
We develop two layer-specific pseudo-label strategies, termed Grad-ReLU and Avg-Clustering.
Our approach consistently demonstrates exceptional performance on standard semi-supervised learning benchmarks.
arXiv Detail & Related papers (2024-06-20T11:25:50Z)
- Revisiting Class Imbalance for End-to-end Semi-Supervised Object Detection [1.6249267147413524]
Semi-supervised object detection (SSOD) has made significant progress with the development of pseudo-label-based end-to-end methods.
Many methods face challenges due to class imbalance, which hinders the effectiveness of the pseudo-label generator.
In this paper, we examine the root causes of low-quality pseudo-labels and present novel learning mechanisms to improve the label generation quality.
arXiv Detail & Related papers (2023-06-04T06:01:53Z)
- Ambiguity-Resistant Semi-Supervised Learning for Dense Object Detection [98.66771688028426]
We propose Ambiguity-Resistant Semi-supervised Learning (ARSL) for one-stage detectors.
Joint-Confidence Estimation (JCE) is proposed to quantify the classification and localization quality of pseudo labels.
ARSL effectively mitigates the ambiguities and achieves state-of-the-art SSOD performance on MS COCO and PASCAL VOC.
arXiv Detail & Related papers (2023-03-27T07:46:58Z)
- Refined Pseudo labeling for Source-free Domain Adaptive Object Detection [9.705172026751294]
Source-free domain adaptive object detection is proposed to adapt source-trained detectors to target domains using only unlabeled target data.
Existing source-free methods typically rely on pseudo labeling, where performance depends heavily on the choice of confidence threshold.
We present a category-aware adaptive threshold estimation module, which adaptively provides the appropriate threshold for each category.
arXiv Detail & Related papers (2023-03-07T08:31:42Z)
- Exploiting Completeness and Uncertainty of Pseudo Labels for Weakly Supervised Video Anomaly Detection [149.23913018423022]
Weakly supervised video anomaly detection aims to identify abnormal events in videos using only video-level labels.
Two-stage self-training methods have achieved significant improvements by self-generating pseudo labels.
We propose an enhancement framework by exploiting completeness and uncertainty properties for effective self-training.
arXiv Detail & Related papers (2022-12-08T05:53:53Z)
- PseCo: Pseudo Labeling and Consistency Training for Semi-Supervised Object Detection [42.75316070378037]
We propose Noisy Pseudo box Learning (NPL), which includes Prediction-guided Label Assignment (PLA) and Positive-proposal Consistency Voting (PCV).
On the COCO benchmark, our method, PSEudo labeling and COnsistency training (PseCo), outperforms the SOTA (Soft Teacher) by 2.0, 1.8, and 2.0 points under 1%, 5%, and 10% labelling ratios, respectively.
arXiv Detail & Related papers (2022-03-30T13:59:22Z)
- S3: Supervised Self-supervised Learning under Label Noise [53.02249460567745]
In this paper we address the problem of classification in the presence of label noise.
In the heart of our method is a sample selection mechanism that relies on the consistency between the annotated label of a sample and the distribution of the labels in its neighborhood in the feature space.
Our method significantly surpasses previous methods on both CIFAR10 and CIFAR100 with artificial noise and on real-world noisy datasets such as WebVision and ANIMAL-10N.
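As a rough illustration of this neighborhood-consistency idea, the hypothetical sketch below keeps a sample as clean when its annotated label agrees with most of its nearest neighbors in feature space; the cosine-similarity metric, the value of k, and the agreement ratio are assumptions for illustration rather than the paper's exact criterion.

```python
import numpy as np

def select_clean_samples(features, labels, k=10, agreement=0.5):
    """Return indices of samples whose label matches at least `agreement`
    of their k nearest neighbors (cosine similarity in feature space)."""
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = feats @ feats.T
    np.fill_diagonal(sims, -np.inf)               # exclude self-matches
    neighbors = np.argsort(-sims, axis=1)[:, :k]  # k most similar samples
    neighbor_labels = labels[neighbors]           # shape (N, k)
    agree = (neighbor_labels == labels[:, None]).mean(axis=1)
    return np.flatnonzero(agree >= agreement)
```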
arXiv Detail & Related papers (2021-11-22T15:49:20Z)
- Semi-supervised Relation Extraction via Incremental Meta Self-Training [56.633441255756075]
Semi-Supervised Relation Extraction methods aim to leverage unlabeled data in addition to learning from limited samples.
Existing self-training methods suffer from the gradual drift problem, where noisy pseudo labels on unlabeled data are incorporated during training.
We propose a method called MetaSRE, where a Relation Label Generation Network generates quality assessment on pseudo labels by (meta) learning from the successful and failed attempts on Relation Classification Network as an additional meta-objective.
arXiv Detail & Related papers (2020-10-06T03:54:11Z)