Adaptive Discovering and Merging for Incremental Novel Class Discovery
- URL: http://arxiv.org/abs/2403.03382v1
- Date: Wed, 6 Mar 2024 00:17:03 GMT
- Title: Adaptive Discovering and Merging for Incremental Novel Class Discovery
- Authors: Guangyao Chen, Peixi Peng, Yangru Huang, Mengyue Geng, Yonghong Tian
- Abstract summary: We introduce a new paradigm called Adaptive Discovering and Merging (ADM) to discover novel categories adaptively in the incremental stage.
Our AMM also benefits the class-incremental learning (class-IL) task by alleviating the catastrophic forgetting problem.
- Score: 37.54881512688964
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One important desideratum of lifelong learning is to discover novel classes
from unlabelled data in a continuous manner. The central challenge is twofold:
discovering and learning novel classes while mitigating the issue of
catastrophic forgetting of established knowledge. To this end, we introduce a
new paradigm called Adaptive Discovering and Merging (ADM) to discover novel
categories adaptively in the incremental stage and integrate novel knowledge
into the model without affecting the original knowledge. To discover novel
classes adaptively, we decouple representation learning and novel class
discovery, and use Triple Comparison (TC) and Probability Regularization (PR)
to constrain the probability discrepancy and diversity for adaptive category
assignment. To merge the learned novel knowledge adaptively, we propose a
hybrid structure with base and novel branches named Adaptive Model Merging
(AMM), which reduces the interference of the novel branch on the old classes to
preserve the previous knowledge, and merges the novel branch to the base model
without performance loss and parameter growth. Extensive experiments on several
datasets show that ADM significantly outperforms existing class-incremental
Novel Class Discovery (class-iNCD) approaches. Moreover, our AMM also benefits
the class-incremental learning (class-IL) task by alleviating the catastrophic
forgetting problem.
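The branch-merging idea behind AMM can be illustrated with a minimal sketch (the paper's exact formulation is not reproduced here, and the layer sizes are made up): two parallel linear branches over the same features are equivalent to one linear layer whose weights are the sum, so a trained novel branch can be folded into the base classifier with no performance loss and no parameter growth.

```python
import torch
import torch.nn as nn

# Hypothetical hybrid structure: a base classifier plus a parallel novel branch.
# Sizes (16 features, 10 classes) are illustrative only.
base = nn.Linear(16, 10)
novel = nn.Linear(16, 10)

x = torch.randn(4, 16)
hybrid_out = base(x) + novel(x)  # hybrid output during the incremental stage

# Merge: because both branches are linear over the same input, summing their
# weights and biases yields a single equivalent layer of the original size.
merged = nn.Linear(16, 10)
with torch.no_grad():
    merged.weight.copy_(base.weight + novel.weight)
    merged.bias.copy_(base.bias + novel.bias)

assert torch.allclose(merged(x), hybrid_out, atol=1e-5)
```

This re-parameterization trick is what makes "merging without parameter growth" possible for linear (or convolutional) branches; non-linear branches would not fold this way.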
Related papers
- Continual Novel Class Discovery via Feature Enhancement and Adaptation [20.669216392440145]
We propose a novel Feature Enhancement and Adaptation method for Continual Novel Class Discovery (CNCD).
The guide-to-novel framework is established to continually discover novel classes under the guidance of prior distribution.
The centroid-to-samples similarity constraint (CSS) is designed to constrain the relationship between centroid-to-samples similarities of different classes.
The boundary-aware prototype constraint (BAP) is proposed to keep novel class features aware of the positions of other class prototypes.
arXiv Detail & Related papers (2024-05-10T10:52:22Z)
- Class-relation Knowledge Distillation for Novel Class Discovery [16.461242381109276]
The key challenge lies in transferring knowledge from the known-class data to the learning of novel classes.
We introduce a class relation representation for the novel classes based on the predicted class distribution of a model trained on known classes.
We propose a novel knowledge distillation framework, which utilizes our class-relation representation to regularize the learning of novel classes.
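A distillation term of this general shape can be sketched as follows (this is an illustrative regularizer, not the paper's exact loss; the function name and temperature are assumptions): the known-class model's softened predictions on novel-class inputs act as a relational target for the new model.

```python
import torch
import torch.nn.functional as F

def class_relation_kd(student_logits, teacher_logits, T=2.0):
    """Illustrative class-relation distillation: KL divergence between the
    new model's predictions and the known-class model's softened predictions
    on the same (novel-class) inputs. T is a softening temperature."""
    teacher = F.softmax(teacher_logits / T, dim=1)
    student = F.log_softmax(student_logits / T, dim=1)
    # kl_div expects log-probabilities as input and probabilities as target
    return F.kl_div(student, teacher, reduction="batchmean") * T * T

# Toy usage with random logits over 5 known classes.
loss = class_relation_kd(torch.randn(8, 5), torch.randn(8, 5))
```

The temperature-squared factor is the standard scaling that keeps gradient magnitudes comparable across temperatures in distillation losses.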
arXiv Detail & Related papers (2023-07-18T11:35:57Z)
- Memorizing Complementation Network for Few-Shot Class-Incremental Learning [109.4206979528375]
We propose a Memorizing Complementation Network (MCNet) to ensemble multiple models that complement each other's memorized knowledge in novel tasks.
We develop a Prototype Smoothing Hard-mining Triplet (PSHT) loss to push novel samples away not only from each other in the current task but also from the old distribution.
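The repulsion idea can be sketched with a simple margin-based loss (this is not the exact PSHT formulation; the function name, margin, and sizes are assumptions): novel embeddings are pushed at least a margin apart from each other and from prototypes of the old distribution.

```python
import torch
import torch.nn.functional as F

def repulsion_loss(novel_emb, old_prototypes, margin=0.5):
    """Illustrative margin loss: penalize novel samples that lie closer than
    `margin` to other novel samples or to old-class prototypes."""
    z = F.normalize(novel_emb, dim=1)
    p = F.normalize(old_prototypes, dim=1)
    # pairwise distances among novel samples, excluding self-pairs
    d_nn = torch.cdist(z, z)
    mask = ~torch.eye(z.size(0), dtype=torch.bool)
    push_novel = F.relu(margin - d_nn[mask]).mean()
    # distances from novel samples to prototypes of the old distribution
    d_np = torch.cdist(z, p)
    push_old = F.relu(margin - d_np).mean()
    return push_novel + push_old

loss = repulsion_loss(torch.randn(6, 32), torch.randn(4, 32))
```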
arXiv Detail & Related papers (2022-08-11T02:32:41Z)
- Novel Class Discovery without Forgetting [72.52222295216062]
We identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting.
We propose a machine learning model to incrementally discover novel categories of instances from unlabeled data.
We introduce experimental protocols based on CIFAR-10, CIFAR-100 and ImageNet-1000 to measure the trade-off between knowledge retention and novel class discovery.
arXiv Detail & Related papers (2022-07-21T17:54:36Z) - Class-incremental Novel Class Discovery [76.35226130521758]
We study the new task of class-incremental Novel Class Discovery (class-iNCD)
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-07-18T13:49:27Z) - Incremental Few-Shot Learning via Implanting and Compressing [13.122771115838523]
Incremental Few-Shot Learning requires a model to continually learn novel classes from only a few examples.
We propose a two-step learning strategy referred to as Implanting and Compressing.
Specifically, in the Implanting step, we propose to mimic the data distribution of novel classes with the assistance of the data-abundant base set.
In the Compressing step, we adapt the feature extractor to precisely represent each novel class, enhancing intra-class compactness.
arXiv Detail & Related papers (2022-03-19T11:04:43Z)
- Novel Class Discovery in Semantic Segmentation [104.30729847367104]
We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS).
It aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes.
In NCDSS, we need to distinguish objects from the background and handle the existence of multiple classes within an image.
We propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels.
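The entropy-based filtering idea can be sketched as follows (a minimal illustration, not the EUMS implementation; the function name and threshold are assumptions): low-entropy predictions keep their pseudo-labels, while high-entropy ones are set aside for further processing such as clustering or self-training.

```python
import torch
import torch.nn.functional as F

def split_by_entropy(logits, threshold=1.0):
    """Illustrative entropy-based split: keep pseudo-labels only for
    predictions whose entropy falls below `threshold` (a hypothetical
    value); the rest are flagged as unreliable."""
    probs = F.softmax(logits, dim=1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
    confident = entropy < threshold
    pseudo_labels = probs.argmax(dim=1)
    return pseudo_labels[confident], confident

# Toy usage: 10 predictions over 4 novel classes.
labels, mask = split_by_entropy(torch.randn(10, 4))
```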
arXiv Detail & Related papers (2021-12-03T13:31:59Z)
- Learning Adaptive Embedding Considering Incremental Class [55.21855842960139]
Class-Incremental Learning (CIL) aims to train a reliable model on streaming data in which unknown classes emerge sequentially.
Different from traditional closed-set learning, CIL has two main challenges: 1) novel class detection, and 2) model updating: after novel classes are detected, the model needs to be updated without re-training on the entire previous data.
arXiv Detail & Related papers (2020-08-31T04:11:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.