Mutual Information-guided Knowledge Transfer for Novel Class Discovery
- URL: http://arxiv.org/abs/2206.12063v1
- Date: Fri, 24 Jun 2022 03:52:25 GMT
- Title: Mutual Information-guided Knowledge Transfer for Novel Class Discovery
- Authors: Chuyu Zhang, Chuanyang Hu, Ruijie Xu, Zhitong Gao, Qian He, Xuming He
- Abstract summary: We propose a principled and general method to transfer semantic knowledge between seen and unseen classes.
Our results show that the proposed method outperforms previous SOTA by a significant margin on several benchmarks.
- Score: 23.772336970389834
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We tackle the novel class discovery problem, aiming to discover novel classes
in unlabeled data based on labeled data from seen classes. The main challenge
is to transfer knowledge contained in the seen classes to unseen ones. Previous
methods mostly transfer knowledge through sharing representation space or joint
label space. However, they tend to neglect the class relation between seen and
unseen categories, and thus the learned representations are less effective for
clustering unseen classes. In this paper, we propose a principled and general
method to transfer semantic knowledge between seen and unseen classes. Our
insight is to use mutual information to measure the relation between seen
and unseen classes in a restricted label space; maximizing this mutual
information promotes the transfer of semantic knowledge. To validate the
effectiveness and generalization of our method, we conduct extensive
experiments both on novel class discovery and general novel class discovery
settings. Our results show that the proposed method outperforms previous SOTA
by a significant margin on several benchmarks.
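The abstract's core idea, measuring the relation between seen-class predictions and unseen-cluster assignments via mutual information and then maximizing it, can be illustrated with a small numerical sketch. The joint distributions below are toy values for illustration, not taken from the paper, and the function is a generic empirical-MI computation rather than the authors' training objective:

```python
import numpy as np

def mutual_information(joint):
    """Empirical mutual information I(S; U) from a joint distribution over
    seen-class predictions (rows) and unseen-cluster assignments (columns).
    `joint` must be non-negative and sum to 1."""
    p_s = joint.sum(axis=1, keepdims=True)   # marginal over seen classes
    p_u = joint.sum(axis=0, keepdims=True)   # marginal over unseen clusters
    nz = joint > 0                           # skip zero cells to avoid log(0)
    return float(np.sum(joint[nz] * np.log(joint[nz] / (p_s @ p_u)[nz])))

# Toy joint where each unseen cluster aligns with one seen class (high MI) ...
aligned = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
# ... versus a joint where clusters are independent of seen classes (zero MI).
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])
print(mutual_information(aligned))       # log 2 ≈ 0.693
print(mutual_information(independent))   # 0.0
```

Under this measure, a clustering of the unseen data that carries information about the seen-class structure scores higher, which is the intuition behind using MI maximization to drive knowledge transfer.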
Related papers
- Self-Cooperation Knowledge Distillation for Novel Class Discovery [8.984031974257274]
Novel Class Discovery (NCD) aims to discover unknown and novel classes in an unlabeled set by leveraging knowledge already learned about known classes.
We propose a Self-Cooperation Knowledge Distillation (SCKD) method to utilize each training sample (whether known or novel, labeled or unlabeled) for both review and discovery.
arXiv Detail & Related papers (2024-07-02T03:49:48Z)
- Enhancing Visual Continual Learning with Language-Guided Supervision [76.38481740848434]
Continual learning aims to empower models to learn new tasks without forgetting previously acquired knowledge.
We argue that the scarce semantic information conveyed by the one-hot labels hampers the effective knowledge transfer across tasks.
Specifically, we use PLMs to generate semantic targets for each class, which are frozen and serve as supervision signals.
arXiv Detail & Related papers (2024-03-24T12:41:58Z)
- Class-relation Knowledge Distillation for Novel Class Discovery [16.461242381109276]
The key challenge lies in transferring knowledge from the known-class data to the learning of novel classes.
We introduce a class relation representation for the novel classes based on the predicted class distribution of a model trained on known classes.
We propose a novel knowledge distillation framework, which utilizes our class-relation representation to regularize the learning of novel classes.
arXiv Detail & Related papers (2023-07-18T11:35:57Z)
- Novel Class Discovery without Forgetting [72.52222295216062]
We identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting.
We propose a machine learning model to incrementally discover novel categories of instances from unlabeled data.
We introduce experimental protocols based on CIFAR-10, CIFAR-100 and ImageNet-1000 to measure the trade-off between knowledge retention and novel class discovery.
arXiv Detail & Related papers (2022-07-21T17:54:36Z)
- New Intent Discovery with Pre-training and Contrastive Learning [21.25371293641141]
New intent discovery aims to uncover novel intent categories from user utterances to expand the set of supported intent classes.
Existing approaches typically rely on a large amount of labeled utterances.
We propose a new contrastive loss to exploit self-supervisory signals in unlabeled data for clustering.
arXiv Detail & Related papers (2022-05-25T17:07:25Z)
- Long-tail Recognition via Compositional Knowledge Transfer [60.03764547406601]
We introduce a novel strategy for long-tail recognition that addresses the tail classes' few-shot problem.
Our objective is to transfer knowledge acquired from information-rich common classes to semantically similar, and yet data-hungry, rare classes.
Experiments show that our approach can achieve significant performance boosts on rare classes while maintaining robust common class performance.
arXiv Detail & Related papers (2021-12-13T15:48:59Z)
- Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation [16.357091285395285]
We tackle the problem of grouping unlabelled images from new classes into different semantic partitions.
This is a more realistic and challenging setting than conventional semi-supervised learning.
We propose a two-branch learning framework for this problem, with one branch focusing on local part-level information and the other branch focusing on overall characteristics.
arXiv Detail & Related papers (2021-07-07T17:14:40Z)
- Open-Set Representation Learning through Combinatorial Embedding [62.05670732352456]
We are interested in identifying novel concepts in a dataset through representation learning based on the examples in both labeled and unlabeled classes.
We propose a learning approach, which naturally clusters examples in unseen classes using the compositional knowledge given by multiple supervised meta-classifiers on heterogeneous label spaces.
The proposed algorithm discovers novel concepts via a joint optimization that enhances the discriminativeness of unseen classes while learning representations of known classes that generalize to novel ones.
arXiv Detail & Related papers (2021-06-29T11:51:57Z)
- Efficient Conditional GAN Transfer with Knowledge Propagation across Classes [85.38369543858516]
CGANs provide new opportunities for knowledge transfer compared to the unconditional setup.
New classes may borrow knowledge from the related old classes, or share knowledge among themselves to improve the training.
New GAN transfer method explicitly propagates the knowledge from the old classes to the new classes.
arXiv Detail & Related papers (2021-02-12T18:55:34Z)
- Automatically Discovering and Learning New Visual Categories with Ranking Statistics [145.89790963544314]
We tackle the problem of discovering novel classes in an image collection given labelled examples of other classes.
We learn a general-purpose clustering model and use the latter to identify the new classes in the unlabelled data.
We evaluate our approach on standard classification benchmarks and outperform current methods for novel category discovery by a significant margin.
arXiv Detail & Related papers (2020-02-13T18:53:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.