Calibrated Teacher for Sparsely Annotated Object Detection
- URL: http://arxiv.org/abs/2303.07582v1
- Date: Tue, 14 Mar 2023 02:02:39 GMT
- Title: Calibrated Teacher for Sparsely Annotated Object Detection
- Authors: Haohan Wang, Liang Liu, Boshen Zhang, Jiangning Zhang, Wuhao Zhang,
Zhenye Gan, Yabiao Wang, Chengjie Wang, Haoqian Wang
- Abstract summary: Fully supervised object detection requires training images in which all instances are annotated.
This is actually impractical due to the high labor and time costs and the unavoidable missing annotations.
Recent works on sparsely annotated object detection alleviate this problem by generating pseudo labels for the missing annotations.
We propose a Calibrated Teacher, whose confidence estimates are well calibrated to match their real precision.
- Score: 35.74904852812749
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Fully supervised object detection requires training images in which all
instances are annotated. In practice this is infeasible: annotation incurs high
labor and time costs, and missing annotations are unavoidable. As a result, the
incomplete annotation in each image provides misleading supervision and harms
training. Recent works on sparsely annotated object detection alleviate this
problem by generating pseudo labels for the missing annotations. Such a
mechanism is sensitive to the threshold of the pseudo label score. However, the
effective threshold is different in different training stages and among
different object detectors. Therefore, current methods with fixed thresholds
have sub-optimal performance and are difficult to apply to other detectors. To
resolve this obstacle, we propose a Calibrated Teacher, whose confidence
estimates are well calibrated to match their real precision. In this way,
different detectors in
different training stages would share a similar distribution of the output
confidence, so that multiple detectors could share the same fixed threshold and
achieve better performance. Furthermore, we present a simple but effective
Focal IoU Weight (FIoU) for the classification loss. FIoU reduces the loss
weight of false-negative samples caused by missing annotations and thus
complements the teacher-student paradigm. Extensive experiments show that our
methods set a new state of the art under all sparse settings on COCO. Code will
be available at
https://github.com/Whileherham/CalibratedTeacher.
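
The key idea is that a calibrated confidence lets a single fixed score threshold select pseudo labels reliably across detectors and training stages, while FIoU down-weights the classification loss of likely false negatives. The sketch below illustrates both pieces in a generic form; temperature scaling, the function names, and the particular weighting formula are illustrative assumptions, not the paper's exact formulation.

    # Illustrative sketch only: temperature scaling stands in for a calibration
    # module, and the FIoU-style weight is an assumed form, not the paper's.
    import torch

    def fit_temperature(logits, is_correct, lr=0.01, steps=200):
        # Fit a single temperature T so that sigmoid(logit / T) tracks the
        # empirical precision of teacher detections (is_correct is a float
        # tensor in {0., 1.} marking whether each detection was correct).
        log_t = torch.zeros(1, requires_grad=True)
        opt = torch.optim.Adam([log_t], lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            probs = torch.sigmoid(logits / log_t.exp())
            loss = torch.nn.functional.binary_cross_entropy(probs, is_correct)
            loss.backward()
            opt.step()
        return log_t.exp().item()

    def select_pseudo_labels(boxes, logits, temperature, threshold=0.5):
        # Keep teacher detections whose calibrated confidence clears one fixed
        # threshold shared across detectors and training stages.
        calibrated = torch.sigmoid(logits / temperature)
        keep = calibrated >= threshold
        return boxes[keep], calibrated[keep]

    def fiou_style_weight(cls_loss, iou_with_teacher_boxes, gamma=2.0):
        # Down-weight the classification loss of negative samples that overlap
        # heavily with some detected box (possibly an unannotated object), so
        # false negatives caused by missing annotations hurt training less.
        weight = (1.0 - iou_with_teacher_boxes).clamp(min=0.0) ** gamma
        return weight * cls_loss

In this sketch the logits and boxes would come from the teacher, the temperature would be fitted on held-out teacher predictions, and the same fitted temperature would be reused whenever pseudo labels are filtered for the student.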
Related papers
- One-bit Supervision for Image Classification: Problem, Solution, and
Beyond [114.95815360508395]
This paper presents one-bit supervision, a novel setting of learning with fewer labels, for image classification.
We propose a multi-stage training paradigm and incorporate negative label suppression into an off-the-shelf semi-supervised learning algorithm.
On multiple benchmarks, the learning efficiency of the proposed approach surpasses that of full-bit, semi-supervised supervision.
arXiv Detail & Related papers (2023-11-26T07:39:00Z)
- Tackling the Incomplete Annotation Issue in Universal Lesion Detection
Task By Exploratory Training [10.627977735890191]
Universal lesion detection has great value for clinical practice as it aims to detect lesions in multiple organs on medical images.
Deep learning methods have shown promising results but demand large volumes of annotated data for training.
We introduce a teacher-student detection model as the basis, where the teacher's predictions are combined with incomplete annotations to train the student.
arXiv Detail & Related papers (2023-09-23T08:44:07Z)
- Shrinking Class Space for Enhanced Certainty in Semi-Supervised Learning [59.44422468242455]
We propose a novel method dubbed ShrinkMatch to learn from uncertain samples.
For each uncertain sample, it adaptively seeks a shrunk class space, which merely contains the original top-1 class.
We then impose a consistency regularization between a pair of strongly and weakly augmented samples in the shrunk space to strive for discriminative representations.
arXiv Detail & Related papers (2023-08-13T14:05:24Z)
- An End-to-End Framework For Universal Lesion Detection With Missing
Annotations [24.902835211573628]
We present a novel end-to-end framework for mining unlabeled lesions while simultaneously training the detector.
Our framework follows the teacher-student paradigm. High-confidence predictions are combined with partially-labeled ground truth for training the student model.
arXiv Detail & Related papers (2023-03-27T09:16:10Z)
- MixTeacher: Mining Promising Labels with Mixed Scale Teacher for
Semi-Supervised Object Detection [22.047246997864143]
Scale variation across object instances remains a key challenge in object detection.
We propose a novel framework that addresses the scale variation problem by introducing a mixed scale teacher.
Our experiments on MS COCO and PASCAL VOC benchmarks under various semi-supervised settings demonstrate that our method achieves new state-of-the-art performance.
arXiv Detail & Related papers (2023-03-16T03:37:54Z)
- Confidence-aware Training of Smoothed Classifiers for Certified
Robustness [75.95332266383417]
We use "accuracy under Gaussian noise" as an easy-to-compute proxy of adversarial robustness for an input.
Our experiments show that the proposed method consistently exhibits improved certified robustness upon state-of-the-art training methods.
arXiv Detail & Related papers (2022-12-18T03:57:12Z)
- Co-mining: Self-Supervised Learning for Sparsely Annotated Object
Detection [29.683119976550007]
We propose a simple but effective mechanism, called Co-mining, for sparsely annotated object detection.
In our Co-mining, two branches of a Siamese network predict the pseudo-label sets for each other.
Experiments are performed on the MS COCO dataset with three different sparsely annotated settings (a generic sketch of this cross-supervision idea appears after this list).
arXiv Detail & Related papers (2020-12-03T14:23:43Z)
- One-bit Supervision for Image Classification [121.87598671087494]
One-bit supervision is a novel setting of learning from incomplete annotations.
We propose a multi-stage training paradigm which incorporates negative label suppression into an off-the-shelf semi-supervised learning algorithm.
arXiv Detail & Related papers (2020-09-14T03:06:23Z)
- Detection as Regression: Certified Object Detection by Median Smoothing [50.89591634725045]
This work is motivated by recent progress on certified classification by randomized smoothing.
We obtain the first model-agnostic, training-free, and certified defense for object detection against $\ell$-bounded attacks.
arXiv Detail & Related papers (2020-07-07T18:40:19Z)
- Learning a Unified Sample Weighting Network for Object Detection [113.98404690619982]
Region sampling or weighting is critically important to the success of modern region-based object detectors.
We argue that sample weighting should be data-dependent and task-dependent.
We propose a unified sample weighting network to predict a sample's task weights.
arXiv Detail & Related papers (2020-06-11T16:19:16Z)
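
The Co-mining entry above describes two branches of a Siamese network that predict pseudo-label sets for each other. As a rough illustration of that cross-supervision pattern (not the paper's actual implementation), each branch could keep the other's confident detections as extra training targets; the threshold, data layout, and function name below are assumptions.

    # Toy cross-branch pseudo-labeling in the spirit of the Co-mining entry;
    # the score threshold and data layout are illustrative assumptions.
    import torch

    def cross_pseudo_labels(boxes_a, scores_a, boxes_b, scores_b, threshold=0.7):
        # Branch B's confident boxes become extra labels for branch A, and
        # vice versa; each set would then be merged with the sparse ground truth.
        boxes_for_a = boxes_b[scores_b >= threshold]
        boxes_for_b = boxes_a[scores_a >= threshold]
        return boxes_for_a, boxes_for_b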
This list is automatically generated from the titles and abstracts of the papers on this site.