Composing Novel Classes: A Concept-Driven Approach to Generalized Category Discovery
- URL: http://arxiv.org/abs/2410.13285v1
- Date: Thu, 17 Oct 2024 07:30:20 GMT
- Title: Composing Novel Classes: A Concept-Driven Approach to Generalized Category Discovery
- Authors: Chuyu Zhang, Peiyan Gu, Xueyang Yu, Xuming He
- Abstract summary: We tackle the generalized category discovery problem, which aims to discover novel classes in unlabeled datasets.
We introduce a novel concept learning framework for GCD, named ConceptGCD, that categorizes concepts into two types: derivable and underivable.
Our framework first extracts known class concepts by a known class pre-trained model and then produces derivable concepts from them.
- Score: 13.68907640197364
- License:
- Abstract: We tackle the generalized category discovery (GCD) problem, which aims to discover novel classes in unlabeled datasets by leveraging the knowledge of known classes. Previous works utilize the known class knowledge through shared representation spaces. Despite their progress, our analysis shows that novel classes can achieve impressive clustering results on the feature space of a known class pre-trained model, suggesting that existing methods may not fully utilize known class knowledge. To address this, we introduce a novel concept learning framework for GCD, named ConceptGCD, that categorizes concepts into two types: derivable and underivable from known class concepts, and adopts a stage-wise learning strategy to learn them separately. Specifically, our framework first extracts known class concepts by a known class pre-trained model and then produces derivable concepts from them by a generator layer with a covariance-augmented loss. Subsequently, we expand the generator layer to learn underivable concepts in a balanced manner ensured by a concept score normalization strategy and integrate a contrastive loss to preserve previously learned concepts. Extensive experiments on various benchmark datasets demonstrate the superiority of our approach over the previous state-of-the-art methods. Code will be available soon.
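Below is a minimal, hedged sketch (PyTorch) of how the first stage described in the abstract could look: a generator layer produces derivable concepts from known-class concepts, trained with a known-class classification loss plus a covariance penalty that keeps the concepts decorrelated. All module names, dimensions, and the exact form of the penalty are assumptions for illustration, not the authors' implementation (which has not been released yet).

```python
# Illustrative sketch of a stage-1 "generator layer + covariance-augmented loss".
# Names, dimensions, and the penalty form are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConceptGenerator(nn.Module):
    """Maps known-class concept scores to a larger set of derivable concepts."""

    def __init__(self, num_known: int, num_derived: int):
        super().__init__()
        self.layer = nn.Linear(num_known, num_derived)

    def forward(self, known_concepts: torch.Tensor) -> torch.Tensor:
        return self.layer(known_concepts)


def covariance_penalty(concepts: torch.Tensor) -> torch.Tensor:
    """Penalize off-diagonal covariance so learned concepts stay decorrelated."""
    z = concepts - concepts.mean(dim=0, keepdim=True)
    cov = (z.T @ z) / (z.shape[0] - 1)
    off_diag = cov - torch.diag(torch.diag(cov))
    return off_diag.pow(2).sum() / concepts.shape[1]


# Stage 1: frozen known-class features -> generator -> known-class classifier.
generator = ConceptGenerator(num_known=128, num_derived=256)
classifier = nn.Linear(256, 100)      # 100 known classes (assumed)
lambda_cov = 1.0                      # weighting of the covariance term (assumed)

features = torch.randn(32, 128)       # dummy known-class concept scores
labels = torch.randint(0, 100, (32,))
derived = generator(features)
loss = F.cross_entropy(classifier(derived), labels) \
       + lambda_cov * covariance_penalty(derived)
loss.backward()
# Stage 2 (not shown) would widen the generator to add "underivable" concepts,
# normalize concept scores, and add a contrastive term to retain old concepts.
```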
Related papers
- SelEx: Self-Expertise in Fine-Grained Generalized Category Discovery [55.72840638180451]
Generalized Category Discovery aims to simultaneously uncover novel categories and accurately classify known ones.
Traditional methods, which lean heavily on self-supervision and contrastive learning, often fall short when distinguishing between fine-grained categories.
We introduce a novel concept called 'self-expertise', which enhances the model's ability to recognize subtle differences and uncover unknown categories.
arXiv Detail & Related papers (2024-08-26T15:53:50Z) - Discover-then-Name: Task-Agnostic Concept Bottlenecks via Automated Concept Discovery [52.498055901649025]
Concept Bottleneck Models (CBMs) have been proposed to address the 'black-box' problem of deep neural networks.
We propose a novel CBM approach -- called Discover-then-Name-CBM (DN-CBM) -- that inverts the typical paradigm.
Our concept extraction strategy is efficient, since it is agnostic to the downstream task, and uses concepts already known to the model.
arXiv Detail & Related papers (2024-07-19T17:50:11Z) - Self-Cooperation Knowledge Distillation for Novel Class Discovery [8.984031974257274]
Novel Class Discovery (NCD) aims to discover unknown and novel classes in an unlabeled set by leveraging knowledge already learned about known classes.
We propose a Self-Cooperation Knowledge Distillation (SCKD) method to utilize each training sample (whether known or novel, labeled or unlabeled) for both review and discovery.
arXiv Detail & Related papers (2024-07-02T03:49:48Z) - Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration [67.69532794049445]
We find that existing methods tend to misclassify samples of new classes into base classes, which leads to poor performance on new classes.
We propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes.
arXiv Detail & Related papers (2023-12-08T18:24:08Z) - Class-relation Knowledge Distillation for Novel Class Discovery [16.461242381109276]
The key challenge lies in transferring knowledge from the known-class data to the learning of novel classes.
We introduce a class relation representation for the novel classes based on the predicted class distribution of a model trained on known classes.
We propose a novel knowledge distillation framework, which utilizes our class-relation representation to regularize the learning of novel classes (see the sketch after this list).
arXiv Detail & Related papers (2023-07-18T11:35:57Z) - Multi-Faceted Distillation of Base-Novel Commonality for Few-shot Object Detection [58.48995335728938]
We learn three types of class-agnostic commonalities between base and novel classes explicitly.
Our method can be readily integrated into most of existing fine-tuning based methods and consistently improve the performance by a large margin.
arXiv Detail & Related papers (2022-07-22T16:46:51Z) - Class-incremental Novel Class Discovery [76.35226130521758]
We study the new task of class-incremental Novel Class Discovery (class-iNCD).
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-07-18T13:49:27Z) - Mutual Information-guided Knowledge Transfer for Novel Class Discovery [23.772336970389834]
We propose a principled and general method to transfer semantic knowledge between seen and unseen classes.
Our results show that the proposed method outperforms previous SOTA by a significant margin on several benchmarks.
arXiv Detail & Related papers (2022-06-24T03:52:25Z) - Open-Set Representation Learning through Combinatorial Embedding [62.05670732352456]
We are interested in identifying novel concepts in a dataset through representation learning based on the examples in both labeled and unlabeled classes.
We propose a learning approach that naturally clusters examples in unseen classes using the compositional knowledge given by multiple supervised meta-classifiers on heterogeneous label spaces.
The proposed algorithm discovers novel concepts via a joint optimization that enhances the discriminativeness of unseen classes while learning representations of known classes that generalize to novel ones (see the sketch below).
arXiv Detail & Related papers (2021-06-29T11:51:57Z)
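As a rough illustration of the compositional idea in "Open-Set Representation Learning through Combinatorial Embedding" above, the sketch below composes the posteriors of several supervised heads trained on heterogeneous label spaces into a single code and clusters unlabeled examples with it. The head counts, dimensions, and the use of k-means here are assumptions for illustration, not the paper's exact recipe.

```python
# Hedged sketch: compose outputs of multiple supervised heads (heterogeneous
# label spaces) into one embedding and cluster unlabeled examples with it.
# Requires scikit-learn; all sizes below are placeholders.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

backbone = nn.Sequential(nn.Linear(512, 256), nn.ReLU())        # stand-in feature extractor
heads = nn.ModuleList([nn.Linear(256, k) for k in (5, 8, 12)])  # heterogeneous label spaces


def combinatorial_embedding(x: torch.Tensor) -> torch.Tensor:
    """Concatenate the per-head class posteriors into a single code."""
    feats = backbone(x)
    return torch.cat([head(feats).softmax(dim=-1) for head in heads], dim=-1)


unlabeled = torch.randn(1000, 512)    # dummy unlabeled batch
with torch.no_grad():
    codes = combinatorial_embedding(unlabeled)
# Cluster the composed codes to surface candidate novel classes.
novel_assignments = KMeans(n_clusters=10, n_init=10).fit_predict(codes.numpy())
```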
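The sketch below illustrates one plausible form of the class-relation regularizer described in "Class-relation Knowledge Distillation for Novel Class Discovery": for unlabeled samples, the new model's distribution over the known classes is kept close to that of the frozen known-class model via a KL term. The temperature, weighting, and plain KL form are assumptions for illustration, not the paper's exact loss.

```python
# Hedged sketch of a class-relation regularizer: preserve each sample's
# relation to the known classes as predicted by the frozen known-class model.
import torch
import torch.nn.functional as F


def class_relation_regularizer(
    student_logits_known: torch.Tensor,   # new model's logits over known classes
    teacher_logits_known: torch.Tensor,   # frozen known-class model's logits
    temperature: float = 2.0,             # assumed distillation temperature
) -> torch.Tensor:
    """KL(teacher || student) over the known-class distribution."""
    t = F.softmax(teacher_logits_known / temperature, dim=-1)
    log_s = F.log_softmax(student_logits_known / temperature, dim=-1)
    return F.kl_div(log_s, t, reduction="batchmean") * temperature ** 2


# Example usage with dummy logits for a batch of unlabeled samples.
student_logits = torch.randn(32, 80, requires_grad=True)  # 80 known classes (assumed)
teacher_logits = torch.randn(32, 80)
lambda_rel = 0.5                                           # assumed weighting
loss = lambda_rel * class_relation_regularizer(student_logits, teacher_logits)
loss.backward()
```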
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.