Class-incremental Novel Class Discovery
- URL: http://arxiv.org/abs/2207.08605v1
- Date: Mon, 18 Jul 2022 13:49:27 GMT
- Title: Class-incremental Novel Class Discovery
- Authors: Subhankar Roy, Mingxuan Liu, Zhun Zhong, Nicu Sebe, Elisa Ricci
- Abstract summary: We study the new task of class-incremental Novel Class Discovery (class-iNCD).
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
- Score: 76.35226130521758
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the new task of class-incremental Novel Class Discovery
(class-iNCD), which refers to the problem of discovering novel categories in an
unlabelled data set by leveraging a pre-trained model that has been trained on
a labelled data set containing disjoint yet related categories. Apart from
discovering novel classes, we also aim at preserving the ability of the model
to recognize previously seen base categories. Inspired by rehearsal-based
incremental learning methods, in this paper we propose a novel approach for
class-iNCD which prevents forgetting of past information about the base classes
by jointly exploiting base class feature prototypes and feature-level knowledge
distillation. We also propose a self-training clustering strategy that
simultaneously clusters novel categories and trains a joint classifier for both
the base and novel classes. This makes our method able to operate in a
class-incremental setting. Our experiments, conducted on three common
benchmarks, demonstrate that our method significantly outperforms
state-of-the-art approaches. Code is available at
https://github.com/OatmealLiu/class-iNCD
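The abstract describes two ingredients: base-class feature prototypes combined with feature-level knowledge distillation to prevent forgetting, and a self-training clustering strategy that trains a joint classifier over base and novel classes. The sketch below is a minimal, illustrative PyTorch rendering of how such pieces could fit together; the function names, the exact loss forms, and the training_step wrapper are assumptions made for exposition, not code taken from the repository linked above.

```python
# Illustrative sketch (not the authors' released code): prototype replay +
# feature-level knowledge distillation + self-training over a joint head.
import torch
import torch.nn.functional as F

def feature_kd_loss(feats, frozen_feats):
    # Feature-level knowledge distillation: keep the current encoder's
    # features close to those of the frozen pre-trained base encoder.
    return F.mse_loss(feats, frozen_feats.detach())

def prototype_replay_loss(joint_head, base_prototypes, base_labels):
    # Replay stored base-class feature prototypes (e.g. class-mean features)
    # through the joint base+novel classifier to counter forgetting.
    logits = joint_head(base_prototypes)
    return F.cross_entropy(logits, base_labels)

def self_training_loss(joint_head, novel_feats, num_base):
    # Self-training on unlabelled novel data: take the model's own most
    # confident novel-class assignment as a pseudo-label and train on it.
    logits = joint_head(novel_feats)
    pseudo = logits[:, num_base:].argmax(dim=1).detach() + num_base
    return F.cross_entropy(logits, pseudo)

def training_step(encoder, frozen_encoder, joint_head,
                  x_novel, base_prototypes, base_labels, num_base,
                  w_kd=1.0, w_proto=1.0):
    # encoder / frozen_encoder: current and frozen base feature extractors;
    # joint_head: a linear classifier over num_base + num_novel classes;
    # x_novel: a batch of unlabelled images from the novel set.
    novel_feats = encoder(x_novel)
    with torch.no_grad():
        frozen_feats = frozen_encoder(x_novel)
    loss = (self_training_loss(joint_head, novel_feats, num_base)
            + w_kd * feature_kd_loss(novel_feats, frozen_feats)
            + w_proto * prototype_replay_loss(joint_head, base_prototypes, base_labels))
    return loss
```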
Related papers
- NC-NCD: Novel Class Discovery for Node Classification [28.308556235456766]
Novel Class Discovery (NCD) involves identifying new categories within unlabeled data by utilizing knowledge acquired from previously established categories.
Existing NCD methods often struggle to maintain a balance between the performance of old and new categories.
We introduce, for the first time, a more practical NCD scenario for node classification (i.e., NC-NCD).
We propose SWORD, a novel self-training framework with prototype replay and distillation, adapted to our NC-NCD setting.
arXiv Detail & Related papers (2024-07-25T07:10:08Z) - Organizing Background to Explore Latent Classes for Incremental Few-shot Semantic Segmentation [7.570798966278471]
The goal of incremental Few-shot Semantic Segmentation (iFSS) is to extend pre-trained segmentation models to new classes via a few annotated images.
We propose a network called OINet, i.e., the background embedding space Organization and prototype Inherit Network.
arXiv Detail & Related papers (2024-05-29T23:22:12Z) - ProxyDet: Synthesizing Proxy Novel Classes via Classwise Mixup for Open-Vocabulary Object Detection [7.122652901894367]
Open-vocabulary object detection (OVOD) aims to recognize novel objects whose categories are not included in the training set.
We present a novel, yet simple technique that helps generalization on the overall distribution of novel classes.
arXiv Detail & Related papers (2023-12-12T13:45:56Z) - Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration [67.69532794049445]
We find that existing methods tend to misclassify samples of new classes into base classes, which leads to poor performance on the new classes.
We propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes.
arXiv Detail & Related papers (2023-12-08T18:24:08Z) - Class-relation Knowledge Distillation for Novel Class Discovery [16.461242381109276]
The key challenge lies in transferring the knowledge in the known-class data to the learning of novel classes.
We introduce a class relation representation for the novel classes based on the predicted class distribution of a model trained on known classes.
We propose a novel knowledge distillation framework, which utilizes our class-relation representation to regularize the learning of novel classes.
arXiv Detail & Related papers (2023-07-18T11:35:57Z) - Automatically Discovering Novel Visual Categories with Self-supervised Prototype Learning [68.63910949916209]
This paper tackles the problem of novel category discovery (NCD), which aims to discriminate unknown categories in large-scale image collections.
We propose a novel adaptive prototype learning method consisting of two main stages: prototypical representation learning and prototypical self-training.
We conduct extensive experiments on four benchmark datasets and demonstrate the effectiveness and robustness of the proposed method with state-of-the-art performance.
arXiv Detail & Related papers (2022-08-01T16:34:33Z) - Class-Incremental Learning with Strong Pre-trained Models [97.84755144148535]
Class-incremental learning (CIL) has been widely studied under the setting of starting from a small number of classes (base classes).
We explore an understudied real-world setting of CIL that starts with a strong model pre-trained on a large number of base classes.
Our proposed method is robust and generalizes to all analyzed CIL settings.
arXiv Detail & Related papers (2022-04-07T17:58:07Z) - Bridging Non Co-occurrence with Unlabeled In-the-wild Data for Incremental Object Detection [56.22467011292147]
Several incremental learning methods are proposed to mitigate catastrophic forgetting for object detection.
Despite their effectiveness, these methods require co-occurrence of the unlabeled base classes in the training data of the novel classes.
We propose the use of unlabeled in-the-wild data to bridge the non-co-occurrence caused by the missing base classes during the training of additional novel classes.
arXiv Detail & Related papers (2021-10-28T10:57:25Z) - Learning Adaptive Embedding Considering Incremental Class [55.21855842960139]
Class-Incremental Learning (CIL) aims to train a reliable model from streaming data in which unknown classes emerge sequentially.
Different from traditional closed-set learning, CIL faces two main challenges: 1) detecting novel classes, and 2) updating the model once novel classes are detected without re-training on the entire previous data.
arXiv Detail & Related papers (2020-08-31T04:11:24Z)