Bridging Non Co-occurrence with Unlabeled In-the-wild Data for
Incremental Object Detection
- URL: http://arxiv.org/abs/2110.15017v1
- Date: Thu, 28 Oct 2021 10:57:25 GMT
- Authors: Na Dong, Yongqiang Zhang, Mingli Ding, Gim Hee Lee
- Abstract summary: Several incremental learning methods have been proposed to mitigate catastrophic forgetting for object detection.
Despite their effectiveness, these methods require co-occurrence of the unlabeled base classes in the training data of the novel classes.
We propose the use of unlabeled in-the-wild data to bridge the non co-occurrence caused by the missing base classes during the training of additional novel classes.
- Score: 56.22467011292147
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep networks have shown remarkable results in the task of object detection.
However, their performance suffers critical drops when they are subsequently
trained on novel classes without any sample from the base classes originally
used to train the model. This phenomenon is known as catastrophic forgetting.
Recently, several incremental learning methods have been proposed to mitigate
catastrophic forgetting for object detection. Despite their effectiveness, these
methods require co-occurrence of the unlabeled base classes in the training
data of the novel classes. This requirement is impractical in many real-world
settings since the base classes do not necessarily co-occur with the novel
classes. In view of this limitation, we consider a more practical setting of
complete absence of co-occurrence of the base and novel classes for the object
detection task. We propose the use of unlabeled in-the-wild data to bridge the
non co-occurrence caused by the missing base classes during the training of
additional novel classes. To this end, we introduce a blind sampling strategy
based on the responses of the base-class model and pre-trained novel-class
model to select a smaller relevant dataset from the large in-the-wild dataset
for incremental learning. We then design a dual-teacher distillation framework
to transfer the knowledge distilled from the base- and novel-class teacher
models to the student model using the sampled in-the-wild data. Experimental
results on the PASCAL VOC and MS COCO datasets show that our proposed method
significantly outperforms other state-of-the-art class-incremental object
detection methods when there is no co-occurrence between the base and novel
classes during training.
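The two key components of the method can be illustrated with a minimal sketch: a relevance-based selection of in-the-wild images from the two teachers' responses, and a distillation loss in which the student's base-class outputs track the base-class teacher while its novel-class outputs track the novel-class teacher. All function names, the max-score relevance criterion, and the KL-based loss form below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def blind_sample(base_scores, novel_scores, k):
    """Select the k in-the-wild images on which either teacher responds
    most confidently, using the max per-image class score as a cheap
    relevance proxy (hypothetical criterion)."""
    relevance = np.maximum(base_scores.max(axis=1), novel_scores.max(axis=1))
    return np.argsort(-relevance)[:k]

def dual_teacher_distill_loss(student_logits, base_logits, novel_logits,
                              n_base, temperature=2.0):
    """KL-style distillation from two teachers: the student's first
    n_base logits imitate the base teacher, the rest imitate the novel
    teacher."""
    p_base = softmax(base_logits, temperature)
    p_novel = softmax(novel_logits, temperature)
    q_base = softmax(student_logits[:, :n_base], temperature)
    q_novel = softmax(student_logits[:, n_base:], temperature)
    eps = 1e-12
    kl_base = (p_base * (np.log(p_base + eps) - np.log(q_base + eps))).sum(axis=1)
    kl_novel = (p_novel * (np.log(p_novel + eps) - np.log(q_novel + eps))).sum(axis=1)
    return float((kl_base + kl_novel).mean())
```

In this sketch, a student that exactly reproduces both teachers' logits drives the loss to zero, which is the intended behavior of distillation on the sampled in-the-wild data.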
Related papers
- Open-Vocabulary Object Detection with Meta Prompt Representation and Instance Contrastive Optimization [63.66349334291372]
We propose a framework with Meta prompt and Instance Contrastive learning (MIC) schemes.
Firstly, we simulate a novel-class-emerging scenario to help the learned class and background prompts generalize to novel classes.
Secondly, we design an instance-level contrastive strategy to promote intra-class compactness and inter-class separation, which benefits generalization of the detector to novel class objects.
arXiv Detail & Related papers (2024-03-14T14:25:10Z) - Class-incremental Novel Class Discovery [76.35226130521758]
We study the new task of class-incremental Novel Class Discovery (class-iNCD).
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-07-18T13:49:27Z) - Multi-Granularity Regularized Re-Balancing for Class Incremental
Learning [32.52884416761171]
Deep learning models suffer from catastrophic forgetting when learning new tasks.
Data imbalance between old and new classes is a key issue that leads to performance degradation of the model.
We propose an assumption-agnostic method, Multi-Granularity Regularized re-Balancing, to address this problem.
arXiv Detail & Related papers (2022-06-30T11:04:51Z) - Class Impression for Data-free Incremental Learning [20.23329169244367]
Deep learning-based classification approaches require collecting all samples from all classes in advance and are trained offline.
This paradigm may not be practical in real-world clinical applications, where new classes are incrementally introduced through the addition of new data.
We propose a novel data-free class incremental learning framework that first synthesizes data from the model trained on previous classes to generate a class impression.
arXiv Detail & Related papers (2022-06-26T06:20:17Z) - Label Hallucination for Few-Shot Classification [40.43730385915566]
Few-shot classification requires adapting knowledge learned from a large annotated base dataset to recognize novel unseen classes.
We propose an alternative approach to both of these two popular strategies.
We show that our method outperforms the state-of-the-art on four well-established few-shot classification benchmarks.
arXiv Detail & Related papers (2021-12-06T20:18:41Z) - Open-World Semi-Supervised Learning [66.90703597468377]
We introduce a new open-world semi-supervised learning setting in which the model is required to recognize previously seen classes as well as discover novel classes.
We propose ORCA, an approach that learns to simultaneously classify and cluster the data.
We demonstrate that ORCA accurately discovers novel classes and assigns samples to previously seen classes on benchmark image classification datasets.
arXiv Detail & Related papers (2021-02-06T07:11:07Z) - Learning Adaptive Embedding Considering Incremental Class [55.21855842960139]
Class-Incremental Learning (CIL) aims to train a reliable model on streaming data in which unknown classes emerge sequentially.
Different from traditional closed-set learning, CIL has two main challenges: 1) novel class detection; and 2) model update: after the novel classes are detected, the model needs to be updated without re-training on the entire previous data.
arXiv Detail & Related papers (2020-08-31T04:11:24Z) - Two-Level Residual Distillation based Triple Network for Incremental
Object Detection [21.725878050355824]
We propose a novel incremental object detector based on Faster R-CNN to continuously learn from new object classes without using old data.
It is a triple network where an old model and a residual model serve as assistants, helping the incremental model learn new classes without forgetting previously learned knowledge.
arXiv Detail & Related papers (2020-07-27T11:04:57Z) - Incremental Few-Shot Object Detection [96.02543873402813]
OpeN-ended Centre nEt (ONCE) is a detector for incrementally learning to detect novel class objects with few examples.
ONCE fully respects the incremental learning paradigm, with novel class registration requiring only a single forward pass of few-shot training samples.
arXiv Detail & Related papers (2020-03-10T12:56:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.