Improving Localization for Semi-Supervised Object Detection
- URL: http://arxiv.org/abs/2206.10186v1
- Date: Tue, 21 Jun 2022 08:39:38 GMT
- Title: Improving Localization for Semi-Supervised Object Detection
- Authors: Leonardo Rossi, Akbar Karimi, Andrea Prati
- Abstract summary: We introduce an additional classification task for bounding box localization to improve the filtering of predicted bounding boxes.
Our experiments show that our IL-net increases SSOD performance by 1.14% AP on the COCO dataset in the limited-annotation regime.
- Score: 3.5493798890908104
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Semi-Supervised Object Detection (SSOD) is currently a hot topic: while it is
rather easy to collect images for creating a new dataset, labeling them is
still an expensive and time-consuming task. One of the successful methods for
taking advantage of raw images in a Semi-Supervised Learning (SSL) setting is
the Mean Teacher technique, in which pseudo-labeling by the Teacher and
knowledge transfer from the Student to the Teacher take place simultaneously.
However, pseudo-labeling by thresholding is not an ideal solution, since the
confidence value is not strictly related to the prediction uncertainty and
therefore does not allow predictions to be filtered safely. In this paper, we
introduce an additional classification task for bounding box localization to
improve the filtering of the predicted bounding boxes and obtain higher-quality
Student training. Furthermore, we empirically show that bounding box regression
on the unsupervised part can contribute to the training as much as category
classification does. Our experiments show that our IL-net (Improving
Localization net) increases SSOD performance by 1.14% AP on the COCO dataset in
the limited-annotation regime. The code is available at
https://github.com/IMPLabUniPr/unbiased-teacher/tree/ilnet
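As a rough illustration of the filtering idea described above (not the exact IL-net heads, which are defined in the paper and the linked repository), the sketch below keeps a teacher pseudo-box only if both its classification confidence and a separately predicted localization-quality score clear their thresholds. The data structure, field names, and threshold values are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the IL-net implementation):
# keep a teacher pseudo-box only when both its class confidence and a
# hypothetical localization-quality score pass their thresholds.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PseudoBox:
    box: Tuple[float, float, float, float]  # (x1, y1, x2, y2)
    label: int                              # predicted category
    cls_score: float                        # classification confidence
    loc_score: float                        # assumed localization-quality score


def filter_pseudo_boxes(
    boxes: List[PseudoBox],
    cls_thr: float = 0.7,   # illustrative thresholds, not values from the paper
    loc_thr: float = 0.5,
) -> List[PseudoBox]:
    """Keep boxes that are confident in both category and localization."""
    return [b for b in boxes if b.cls_score >= cls_thr and b.loc_score >= loc_thr]


if __name__ == "__main__":
    candidates = [
        PseudoBox((10, 10, 50, 80), label=1, cls_score=0.92, loc_score=0.81),  # kept
        PseudoBox((12, 15, 48, 90), label=1, cls_score=0.88, loc_score=0.30),  # dropped: poor localization
        PseudoBox((60, 20, 90, 70), label=3, cls_score=0.55, loc_score=0.75),  # dropped: low class confidence
    ]
    print(filter_pseudo_boxes(candidates))
```

The point of the extra localization score is that classification confidence alone says little about how well a box is localized, so thresholding on it alone can let poorly localized pseudo-boxes through to the Student.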
Related papers
- Context Matters: Leveraging Spatiotemporal Metadata for Semi-Supervised Learning on Remote Sensing Images [2.518656729567209]
Current approaches generate pseudo-labels from model predictions for unlabeled samples.
We propose exploiting spatiotemporal metainformation in SSL to improve the quality of pseudo-labels.
We show that adding the available metadata to the input of the predictor at test time degrades the prediction quality for metadata outside the spatiotemporal distribution of the training set.
arXiv Detail & Related papers (2024-04-29T10:47:37Z) - Cyclic-Bootstrap Labeling for Weakly Supervised Object Detection [134.05510658882278]
Cyclic-Bootstrap Labeling (CBL) is a novel weakly supervised object detection pipeline.
CBL uses a weighted exponential moving average strategy to take advantage of various refinement modules.
A novel class-specific ranking distillation algorithm is proposed to leverage the output of the weighted ensembled teacher network.
arXiv Detail & Related papers (2023-08-11T07:57:17Z) - Hierarchical Supervision and Shuffle Data Augmentation for 3D
Semi-Supervised Object Detection [90.32180043449263]
State-of-the-art 3D object detectors are usually trained on large-scale datasets with high-quality 3D annotations.
A natural remedy is to adopt semi-supervised learning (SSL) by leveraging a limited amount of labeled samples and abundant unlabeled samples.
This paper introduces a novel approach of Hierarchical Supervision and Shuffle Data Augmentation (HSSDA), which is a simple yet effective teacher-student framework.
arXiv Detail & Related papers (2023-04-04T02:09:32Z) - Adaptive Self-Training for Object Detection [13.07105239116411]
We introduce our method, Adaptive Self-Training for Object Detection (ASTOD).
ASTOD determines without cost a threshold value based directly on the ground value of the score histogram.
We use different views of the unlabeled images during the pseudo-labeling step to reduce the number of missed predictions.
arXiv Detail & Related papers (2022-12-07T15:10:40Z) - Semi-Supervised Object Detection with Object-wise Contrastive Learning
and Regression Uncertainty [46.21528260727673]
We propose a two-step pseudo-label filtering for the classification and regression heads in a teacher-student framework.
By jointly filtering the pseudo-labels for the classification and regression heads, the student network receives better guidance from the teacher network for the object detection task.
arXiv Detail & Related papers (2022-12-06T04:37:51Z) - Label Matching Semi-Supervised Object Detection [85.99282969977541]
Semi-supervised object detection has made significant progress with the development of mean teacher driven self-training.
The label mismatch problem has not yet been fully explored in previous works, leading to severe confirmation bias during self-training.
We propose a simple yet effective LabelMatch framework from two different yet complementary perspectives.
arXiv Detail & Related papers (2022-06-14T05:59:41Z) - Boosting Weakly Supervised Object Detection via Learning Bounding Box
Adjusters [76.36104006511684]
Weakly-supervised object detection (WSOD) has emerged as an inspiring recent topic to avoid expensive instance-level object annotations.
We defend the problem setting for improving localization performance by leveraging the bounding box regression knowledge from a well-annotated auxiliary dataset.
Our method performs favorably against state-of-the-art WSOD methods and knowledge transfer models with a similar problem setting.
arXiv Detail & Related papers (2021-08-03T13:38:20Z) - Unbiased Teacher for Semi-Supervised Object Detection [50.0087227400306]
We revisit the Semi-Supervised Object Detection (SS-OD) and identify the pseudo-labeling bias issue in SS-OD.
We introduce Unbiased Teacher, a simple yet effective approach that jointly trains a student and a gradually progressing teacher in a mutually-beneficial manner; an illustrative sketch of the EMA teacher update underlying this kind of training appears after this list.
arXiv Detail & Related papers (2021-02-18T17:02:57Z) - Iterative label cleaning for transductive and semi-supervised few-shot
learning [16.627512688664513]
Few-shot learning amounts to learning representations and acquiring knowledge such that novel tasks may be solved with both supervision and data being limited.
We introduce a new algorithm that leverages the manifold structure of the labeled and unlabeled data distribution to predict pseudo-labels.
Our solution surpasses or matches the state of the art results on four benchmark datasets.
arXiv Detail & Related papers (2020-12-14T21:54:11Z) - Deep Semi-supervised Knowledge Distillation for Overlapping Cervical
Cell Instance Segmentation [54.49894381464853]
We propose to leverage both labeled and unlabeled data for instance segmentation with improved accuracy by knowledge distillation.
We propose a novel Mask-guided Mean Teacher framework with Perturbation-sensitive Sample Mining.
Experiments show that the proposed method improves the performance significantly compared with the supervised method learned from labeled data only.
arXiv Detail & Related papers (2020-07-21T13:27:09Z) - Task-Adaptive Clustering for Semi-Supervised Few-Shot Classification [23.913195015484696]
Few-shot learning aims to handle previously unseen tasks using only a small amount of new training data.
In preparing (or meta-training) a few-shot learner, however, massive labeled data are necessary.
In this work, we propose a few-shot learner that can work well under the semi-supervised setting where a large portion of training data is unlabeled.
arXiv Detail & Related papers (2020-03-18T13:50:19Z)
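Several of the entries above, as well as the Mean Teacher mechanism described in the abstract (and Unbiased Teacher, on which the IL-net code builds), rely on a teacher whose weights are an exponential moving average (EMA) of the student's. Below is a minimal, generic PyTorch-style sketch of that update; the decay value and the toy models are assumptions, and this is not the specific training loop of any paper listed here.

```python
# Generic EMA teacher update used by Mean-Teacher-style SSOD methods
# (illustrative sketch; decay value and model definitions are assumptions).
import torch


@torch.no_grad()
def update_teacher(teacher: torch.nn.Module,
                   student: torch.nn.Module,
                   decay: float = 0.999) -> None:
    """teacher = decay * teacher + (1 - decay) * student, parameter-wise."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(decay).add_(s_param, alpha=1.0 - decay)


if __name__ == "__main__":
    # Tiny stand-in models just to exercise the update.
    student = torch.nn.Linear(4, 2)
    teacher = torch.nn.Linear(4, 2)
    teacher.load_state_dict(student.state_dict())  # common initialization choice
    # ... the student takes a gradient step elsewhere in the training loop ...
    update_teacher(teacher, student)
```

With a decay close to 1, the teacher changes slowly and effectively ensembles many recent student checkpoints, which is why its pseudo-labels are typically more stable than the student's own predictions.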
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.