Fine-grained Category Discovery under Coarse-grained supervision with
Hierarchical Weighted Self-contrastive Learning
- URL: http://arxiv.org/abs/2210.07733v1
- Date: Fri, 14 Oct 2022 12:06:23 GMT
- Title: Fine-grained Category Discovery under Coarse-grained supervision with
Hierarchical Weighted Self-contrastive Learning
- Authors: Wenbin An, Feng Tian, Ping Chen, Siliang Tang, Qinghua Zheng, QianYing
Wang
- Abstract summary: We investigate a new practical scenario called Fine-grained Category Discovery under Coarse-grained supervision (FCDC).
FCDC aims to discover fine-grained categories using only coarse-grained labeled data, which can adapt models to categories of a different granularity from the known ones and substantially reduce labeling costs.
We propose a hierarchical weighted self-contrastive network by building a novel weighted self-contrastive module and combining it with supervised learning in a hierarchical manner.
- Score: 37.6512548064269
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Novel category discovery aims at adapting models trained on known categories
to novel categories. Previous works focus only on the scenario where known and
novel categories are of the same granularity. In this paper, we investigate a
new practical scenario called Fine-grained Category Discovery under
Coarse-grained supervision (FCDC). FCDC aims to discover fine-grained
categories using only coarse-grained labeled data, which allows models to adapt
to categories of a different granularity from the known ones and substantially
reduces labeling costs. The task is also challenging because supervised
training on coarse-grained categories tends to focus on the inter-class
distance (the distance between coarse-grained classes) while ignoring the
intra-class distance (the distance between fine-grained sub-classes), which is
essential for separating fine-grained categories. Since most current methods
cannot transfer knowledge from the coarse-grained level to the fine-grained
level, we propose a hierarchical weighted self-contrastive network that builds
a novel weighted self-contrastive module and combines it with supervised
learning in a hierarchical manner. Extensive experiments on public datasets
show both the effectiveness and the efficiency of our model compared with
existing methods. Code and data
are available at https://github.com/Lackel/Hierarchical_Weighted_SCL.
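For intuition, here is a minimal PyTorch sketch of how coarse-grained supervision could be combined with a weighted self-contrastive loss in the spirit described above. The specific weighting rule (emphasizing pairs that share a coarse label so that intra-class structure is not collapsed), the function names, and the hyperparameters are illustrative assumptions rather than the paper's exact formulation; the repository linked above contains the actual method.
```python
# Minimal sketch (PyTorch) of coarse-grained supervision combined with a
# weighted self-contrastive loss. The weighting rule below (emphasizing
# negative pairs that share a coarse label) is an illustrative assumption,
# not the authors' exact formulation.
import torch
import torch.nn.functional as F

def weighted_self_contrastive_loss(z1, z2, coarse_labels, temperature=0.1,
                                   same_coarse_weight=2.0, diff_coarse_weight=1.0):
    """z1, z2: L2-normalized embeddings of two views of a batch, shape (B, D)."""
    B = z1.size(0)
    z = torch.cat([z1, z2], dim=0)                       # (2B, D)
    labels = torch.cat([coarse_labels, coarse_labels])   # (2B,)
    sim = z @ z.T / temperature                          # scaled cosine similarities

    # Each anchor's positive is its other augmented view.
    pos_idx = torch.cat([torch.arange(B, 2 * B), torch.arange(0, B)])

    # Weight pairs: instances sharing a coarse label are emphasized so that
    # training does not collapse intra-(coarse)-class distances.
    same_coarse = labels.unsqueeze(0) == labels.unsqueeze(1)
    weights = torch.full_like(sim, diff_coarse_weight)
    weights[same_coarse] = same_coarse_weight
    weights.fill_diagonal_(0.0)                          # exclude self-pairs

    exp_sim = torch.exp(sim) * weights
    pos = exp_sim[torch.arange(2 * B), pos_idx]
    return -torch.log(pos / exp_sim.sum(dim=1)).mean()

def total_loss(coarse_logits, coarse_labels, z1, z2, alpha=0.5):
    """Hierarchical combination: coarse cross-entropy + weighted self-contrastive term."""
    ce = F.cross_entropy(coarse_logits, coarse_labels)
    scl = weighted_self_contrastive_loss(z1, z2, coarse_labels)
    return ce + alpha * scl
```
In this reading, the cross-entropy term preserves the inter-class (coarse) separation while the weighted contrastive term maintains enough intra-class structure to separate fine-grained sub-classes; whether this matches the paper's exact hierarchy should be checked against the released code.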
Related papers
- Learn to Categorize or Categorize to Learn? Self-Coding for Generalized
Category Discovery [49.1865089933055]
We propose a novel, efficient and self-supervised method capable of discovering previously unknown categories at test time.
A salient feature of our approach is the assignment of minimum length category codes to individual data instances.
Experimental evaluations, bolstered by state-of-the-art benchmark comparisons, testify to the efficacy of our solution.
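As a purely conceptual illustration of the "minimum length category code" idea, the snippet below computes the optimal total code length (in bits) of a category assignment from its empirical frequencies. This is a generic MDL-style illustration; the function name and examples are not taken from the paper and do not reproduce its actual objective.
```python
# Generic MDL-style illustration of code length for a category assignment
# (not the paper's actual objective or implementation).
import numpy as np

def category_code_length_bits(assignments):
    """Total bits needed to encode a sequence of category ids under an optimal
    code derived from the empirical category frequencies."""
    assignments = np.asarray(assignments)
    _, counts = np.unique(assignments, return_counts=True)
    probs = counts / counts.sum()
    # Optimal per-symbol code length is -log2 p(c); sum over all instances.
    return float(-(counts * np.log2(probs)).sum())

print(category_code_length_bits([0, 0, 1, 1, 2, 2]))  # ~9.51 bits (three balanced categories)
print(category_code_length_bits([0, 0, 0, 0, 0, 1]))  # ~3.90 bits (one dominant category)
```
An MDL-style discovery method would trade this assignment cost against how well the discovered categories fit the data.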
arXiv Detail & Related papers (2023-10-30T17:45:32Z)
- Dynamic Conceptional Contrastive Learning for Generalized Category Discovery [76.82327473338734]
Generalized category discovery (GCD) aims to automatically cluster partially labeled data.
Unlabeled data contain instances that are not only from known categories of the labeled data but also from novel categories.
One effective way to approach GCD is to apply self-supervised learning to learn discriminative representations for the unlabeled data.
We propose a Dynamic Conceptional Contrastive Learning framework, which can effectively improve clustering accuracy.
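For reference, clustering accuracy in (generalized) category discovery is commonly computed by Hungarian matching between predicted clusters and ground-truth labels. The sketch below shows that standard metric; treat it as a generic evaluation aid rather than this paper's exact protocol.
```python
# Standard clustering-accuracy metric via Hungarian matching (SciPy). This is a
# generic evaluation sketch, not any specific paper's exact protocol.
import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_accuracy(y_true, y_pred):
    """Best accuracy over all one-to-one mappings from predicted clusters to labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n_classes = max(y_true.max(), y_pred.max()) + 1
    # Contingency matrix: contingency[i, j] = #samples with predicted cluster i and true label j.
    contingency = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        contingency[p, t] += 1
    # Hungarian algorithm maximizes the matched counts (minimize negated counts).
    row_ind, col_ind = linear_sum_assignment(-contingency)
    return contingency[row_ind, col_ind].sum() / len(y_true)

# Example: clusters {0, 1} map onto labels {1, 0}, so accuracy is 1.0.
print(clustering_accuracy([1, 1, 0, 0], [0, 0, 1, 1]))  # 1.0
```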
arXiv Detail & Related papers (2023-03-30T14:04:39Z)
- XCon: Learning with Experts for Fine-grained Category Discovery [4.787507865427207]
We present a novel method called Expert-Contrastive Learning (XCon) to help the model mine useful information from the images.
Experiments on fine-grained datasets show a clear improved performance over the previous best methods, indicating the effectiveness of our method.
arXiv Detail & Related papers (2022-08-03T08:03:12Z)
- Novel Class Discovery without Forgetting [72.52222295216062]
We identify and formulate a new, pragmatic problem setting of NCDwF: Novel Class Discovery without Forgetting.
We propose a machine learning model to incrementally discover novel categories of instances from unlabeled data.
We introduce experimental protocols based on CIFAR-10, CIFAR-100 and ImageNet-1000 to measure the trade-off between knowledge retention and novel class discovery.
arXiv Detail & Related papers (2022-07-21T17:54:36Z)
- Class-incremental Novel Class Discovery [76.35226130521758]
We study the new task of class-incremental Novel Class Discovery (class-iNCD).
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-07-18T13:49:27Z)
- Bridging Non Co-occurrence with Unlabeled In-the-wild Data for Incremental Object Detection [56.22467011292147]
Several incremental learning methods have been proposed to mitigate catastrophic forgetting in object detection.
Despite their effectiveness, these methods require the unlabeled base classes to co-occur in the training data of the novel classes.
We propose using unlabeled in-the-wild data to bridge the non-co-occurrence caused by the missing base classes during the training of additional novel classes.
arXiv Detail & Related papers (2021-10-28T10:57:25Z)
- CaT: Weakly Supervised Object Detection with Category Transfer [41.34509685442456]
A large gap exists between fully-supervised object detection and weakly-supervised object detection.
We propose a novel category transfer framework for weakly supervised object detection.
Our framework achieves 63.5% mAP and 80.3% CorLoc with 5 categories overlapping between two datasets.
arXiv Detail & Related papers (2021-08-17T07:59:34Z)
- Towards Cross-Granularity Few-Shot Learning: Coarse-to-Fine Pseudo-Labeling with Visual-Semantic Meta-Embedding [13.063136901934865]
Few-shot learning aims at rapidly adapting to novel categories with only a handful of samples at test time.
In this paper, we advance the few-shot classification paradigm towards a more challenging scenario, i.e., cross-granularity few-shot classification.
We approximate the fine-grained data distribution by greedily clustering each coarse class into pseudo-fine classes according to the similarity of image embeddings (a simple form of this per-coarse-class clustering is sketched below).
arXiv Detail & Related papers (2020-07-11T03:44:21Z)
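A simple form of the per-coarse-class clustering mentioned in the last entry might look like the sketch below; the greedy threshold rule, parameter names, and default values are assumptions made for illustration and may differ from the paper's actual procedure.
```python
# Illustrative greedy clustering of each coarse class into pseudo-fine classes
# by embedding similarity (NumPy). The threshold rule and parameters are
# illustrative assumptions, not the paper's exact procedure.
import numpy as np

def greedy_pseudo_fine_labels(embeddings, coarse_labels, sim_threshold=0.8):
    """Assign a pseudo-fine-class id to every sample.

    embeddings: (N, D) array of L2-normalized image embeddings.
    coarse_labels: (N,) array of integer coarse-class ids.
    """
    pseudo = np.full(len(embeddings), -1, dtype=int)
    next_id = 0
    for c in np.unique(coarse_labels):
        idx = np.where(coarse_labels == c)[0]
        centroid_sums = []   # running (unnormalized) centroids of pseudo-fine clusters
        cluster_ids = []
        for i in idx:
            e = embeddings[i]
            if centroid_sums:
                # Greedy rule: join the most similar existing cluster if it is
                # similar enough, otherwise open a new pseudo-fine class.
                sims = np.array([e @ (m / np.linalg.norm(m)) for m in centroid_sums])
                best = int(sims.argmax())
                if sims[best] >= sim_threshold:
                    centroid_sums[best] = centroid_sums[best] + e
                    pseudo[i] = cluster_ids[best]
                    continue
            centroid_sums.append(e.copy())
            cluster_ids.append(next_id)
            pseudo[i] = next_id
            next_id += 1
    return pseudo
```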