PromptCAL: Contrastive Affinity Learning via Auxiliary Prompts for
Generalized Novel Category Discovery
- URL: http://arxiv.org/abs/2212.05590v2
- Date: Sun, 26 Mar 2023 10:30:22 GMT
- Title: PromptCAL: Contrastive Affinity Learning via Auxiliary Prompts for
Generalized Novel Category Discovery
- Authors: Sheng Zhang, Salman Khan, Zhiqiang Shen, Muzammal Naseer, Guangyi
Chen, Fahad Khan
- Abstract summary: The Generalized Novel Category Discovery (GNCD) setting aims to categorize unlabeled training data coming from known and novel classes.
We propose a Contrastive Affinity Learning method with auxiliary visual Prompts, dubbed PromptCAL, to address this challenging problem.
Our approach discovers reliable pairwise sample affinities to learn better semantic clustering of both known and novel classes for the class token and visual prompts.
- Score: 39.03732147384566
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Although existing semi-supervised learning models achieve remarkable success
in learning with unannotated in-distribution data, they mostly fail to learn on
unlabeled data sampled from novel semantic classes due to their closed-set
assumption. In this work, we target a pragmatic but under-explored Generalized
Novel Category Discovery (GNCD) setting. The GNCD setting aims to categorize
unlabeled training data coming from known and novel classes by leveraging the
information of partially labeled known classes. We propose a two-stage
Contrastive Affinity Learning method with auxiliary visual Prompts, dubbed
PromptCAL, to address this challenging problem. Our approach discovers reliable
pairwise sample affinities to learn better semantic clustering of both known
and novel classes for the class token and visual prompts. First, we propose a
discriminative prompt regularization loss to reinforce the semantic
discriminativeness of the prompt-adapted pre-trained vision transformer for
refined affinity relationships. Second, we propose contrastive affinity
learning to calibrate semantic representations based on our iterative
semi-supervised affinity graph generation method for semantically-enhanced
supervision.
Extensive experimental evaluation demonstrates that our PromptCAL method is
more effective in discovering novel classes even with limited annotations and
surpasses the current state-of-the-art on generic and fine-grained benchmarks
(e.g., nearly 11% gain on CUB-200 and 9% on ImageNet-100 in overall accuracy).
Our code is available at https://github.com/sheng-eatamath/PromptCAL.
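To make the method description above concrete, here is a minimal, self-contained sketch of the contrastive-affinity idea. It is not the authors' implementation (see the repository above for that): it builds a semi-supervised affinity graph over a batch of L2-normalized embeddings using k-nearest-neighbour edges plus must-link edges between samples that share a known label, and treats affine pairs as extra positives in an InfoNCE-style loss. All function names, shapes, and hyper-parameters (k, temperature) are illustrative assumptions.

import torch
import torch.nn.functional as F

def affinity_graph(z, labels, k=5):
    # z: (N, D) L2-normalized embeddings; labels: (N,) class ids, -1 for unlabeled.
    # Returns an (N, N) {0,1} affinity matrix: symmetrized kNN edges plus
    # must-link edges between samples that share a known label.
    sim = z @ z.t()
    sim.fill_diagonal_(-float("inf"))                # never match a sample to itself
    topk = sim.topk(k, dim=1).indices                # k nearest neighbours per sample
    affinity = torch.zeros_like(sim)
    affinity.scatter_(1, topk, 1.0)
    affinity = torch.maximum(affinity, affinity.t()) # symmetrize the kNN graph
    labeled = labels.unsqueeze(0) >= 0
    same_label = (labels.unsqueeze(0) == labels.unsqueeze(1)) & labeled & labeled.t()
    affinity[same_label] = 1.0
    affinity.fill_diagonal_(0.0)
    return affinity

def affinity_contrastive_loss(z, affinity, tau=0.1):
    # InfoNCE-style objective: every affine pair counts as a positive.
    logits = (z @ z.t()) / tau
    logits.fill_diagonal_(-1e9)                      # large negative (not -inf) keeps 0 * log_prob finite
    log_prob = F.log_softmax(logits, dim=1)
    pos_count = affinity.sum(dim=1).clamp(min=1.0)
    return -((affinity * log_prob).sum(dim=1) / pos_count).mean()

if __name__ == "__main__":
    torch.manual_seed(0)
    z = F.normalize(torch.randn(16, 32), dim=1)      # stand-in for [CLS]/prompt features from a ViT
    labels = torch.tensor([0, 0, 1, 1] + [-1] * 12)  # 4 labeled, 12 unlabeled samples
    graph = affinity_graph(z, labels, k=3)
    print("affinity contrastive loss:", affinity_contrastive_loss(z, graph).item())

In the paper's two-stage pipeline the affinity graph is regenerated iteratively from current embeddings during semi-supervised training and works together with the discriminative prompt regularization on the class token and visual prompts; the static batch-level graph above only illustrates the shape of the supervision signal.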
Related papers
- Memory Consistency Guided Divide-and-Conquer Learning for Generalized Category Discovery [56.172872410834664]
Generalized category discovery (GCD) aims at addressing a more realistic and challenging setting of semi-supervised learning.
We propose a Memory Consistency guided Divide-and-conquer Learning framework (MCDL).
Our method outperforms state-of-the-art models by a large margin on both seen and unseen classes of generic image recognition benchmarks.
arXiv Detail & Related papers (2024-01-24T09:39:45Z)
- MetaGCD: Learning to Continually Learn in Generalized Category Discovery [26.732455383707798]
We consider a real-world scenario where a model that is trained on pre-defined classes continually encounters unlabeled data.
The goal is to continually discover novel classes while maintaining the performance in known classes.
We propose an approach, called MetaGCD, to learn how to incrementally discover novel classes with less forgetting.
arXiv Detail & Related papers (2023-08-21T22:16:49Z)
- Dynamic Conceptional Contrastive Learning for Generalized Category Discovery [76.82327473338734]
Generalized category discovery (GCD) aims to automatically cluster partially labeled data.
Unlabeled data contain instances that are not only from known categories of the labeled data but also from novel categories.
One effective way for GCD is applying self-supervised learning to learn discriminative representations for unlabeled data.
We propose a Dynamic Conceptional Contrastive Learning framework, which can effectively improve clustering accuracy.
arXiv Detail & Related papers (2023-03-30T14:04:39Z)
- Novel Class Discovery without Forgetting [72.52222295216062]
We identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting.
We propose a machine learning model to incrementally discover novel categories of instances from unlabeled data.
We introduce experimental protocols based on CIFAR-10, CIFAR-100 and ImageNet-1000 to measure the trade-off between knowledge retention and novel class discovery.
arXiv Detail & Related papers (2022-07-21T17:54:36Z)
- New Intent Discovery with Pre-training and Contrastive Learning [21.25371293641141]
New intent discovery aims to uncover novel intent categories from user utterances to expand the set of supported intent classes.
Existing approaches typically rely on a large amount of labeled utterances.
We propose a new contrastive loss to exploit self-supervisory signals in unlabeled data for clustering.
arXiv Detail & Related papers (2022-05-25T17:07:25Z)
- Neighborhood Contrastive Learning for Novel Class Discovery [79.14767688903028]
We build a new framework, named Neighborhood Contrastive Learning, to learn discriminative representations that are important to clustering performance.
We experimentally demonstrate that these two ingredients significantly contribute to clustering performance and lead our model to outperform state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2021-06-20T17:34:55Z)
- Dynamic Semantic Matching and Aggregation Network for Few-shot Intent Detection [69.2370349274216]
Few-shot Intent Detection is challenging due to the scarcity of available annotated utterances.
Semantic components are distilled from utterances via multi-head self-attention.
Our method provides a comprehensive matching measure to enhance representations of both labeled and unlabeled instances.
arXiv Detail & Related papers (2020-10-06T05:16:38Z)
- Towards Cross-Granularity Few-Shot Learning: Coarse-to-Fine Pseudo-Labeling with Visual-Semantic Meta-Embedding [13.063136901934865]
Few-shot learning aims at rapidly adapting to novel categories with only a handful of samples at test time.
In this paper, we advance the few-shot classification paradigm towards a more challenging scenario, i.e., cross-granularity few-shot classification.
We approximate the fine-grained data distribution by greedily clustering each coarse class into pseudo-fine-classes according to the similarity of image embeddings; a minimal clustering sketch follows the list below.
arXiv Detail & Related papers (2020-07-11T03:44:21Z)
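As a rough illustration of the coarse-to-fine pseudo-labeling step described in the last entry above, the following sketch clusters the image embeddings of each coarse class into pseudo-fine-classes. It uses k-means as a simple stand-in for the greedy clustering mentioned in that abstract; the embedding source and the number of pseudo-fine-classes per coarse class are illustrative assumptions rather than details taken from the paper.

import numpy as np
from sklearn.cluster import KMeans

def pseudo_fine_labels(embeddings, coarse_labels, fine_per_coarse=3, seed=0):
    # embeddings: (N, D) image features; coarse_labels: (N,) coarse class ids.
    # Returns (N,) globally unique pseudo-fine-class ids.
    pseudo = np.full(len(coarse_labels), -1, dtype=int)
    next_id = 0
    for c in np.unique(coarse_labels):
        idx = np.where(coarse_labels == c)[0]
        n_clusters = min(fine_per_coarse, len(idx))
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(embeddings[idx])
        pseudo[idx] = km.labels_ + next_id           # offset so ids never collide across coarse classes
        next_id += n_clusters
    return pseudo

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(60, 16))                # stand-in for image embeddings
    coarse = np.repeat(np.arange(3), 20)             # 3 coarse classes, 20 samples each
    print(pseudo_fine_labels(feats, coarse)[:20])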