Modeling Inter-Class and Intra-Class Constraints in Novel Class
Discovery
- URL: http://arxiv.org/abs/2210.03591v3
- Date: Thu, 23 Mar 2023 13:15:27 GMT
- Title: Modeling Inter-Class and Intra-Class Constraints in Novel Class
Discovery
- Authors: Wenbin Li, Zhichen Fan, Jing Huo, Yang Gao
- Abstract summary: Novel class discovery (NCD) aims at learning a model that transfers the common knowledge from a class-disjoint labelled dataset to another unlabelled dataset.
We propose to model both inter-class and intra-class constraints in NCD based on the symmetric Kullback-Leibler divergence (sKLD).
- Score: 20.67503042774617
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Novel class discovery (NCD) aims at learning a model that transfers the
common knowledge from a class-disjoint labelled dataset to another unlabelled
dataset and discovers new classes (clusters) within it. Many methods, as well
as elaborate training pipelines and appropriate objectives, have been proposed
and considerably boosted performance on NCD tasks. Despite all this, we find
that the existing methods do not sufficiently take advantage of the essence of
the NCD setting. To this end, in this paper, we propose to model both
inter-class and intra-class constraints in NCD based on the symmetric
Kullback-Leibler divergence (sKLD). Specifically, we propose an inter-class
sKLD constraint to effectively exploit the disjoint relationship between
labelled and unlabelled classes, enforcing the separability for different
classes in the embedding space. In addition, we present an intra-class sKLD
constraint to explicitly constrain the intra-relationship between a sample and
its augmentations and ensure the stability of the training process at the same
time. We conduct extensive experiments on the popular CIFAR10, CIFAR100 and
ImageNet benchmarks and successfully demonstrate that our method can establish
a new state of the art and can achieve significant performance improvements,
e.g., 3.5%/3.7% clustering accuracy improvements on the CIFAR100-50 dataset split
under the task-aware/-agnostic evaluation protocol, over previous
state-of-the-art methods. Code is available at
https://github.com/FanZhichen/NCD-IIC.
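Both constraints are built on the symmetric Kullback-Leibler divergence between predicted class distributions. A minimal NumPy sketch of that core quantity follows; the loss weighting and batch construction from the paper are not reproduced here, and all function and variable names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def symmetric_kld(p_logits, q_logits):
    """sKLD(p, q) = (KL(p || q) + KL(q || p)) / 2, averaged over the batch.

    Per the abstract, this quantity underlies both the inter-class
    constraint (between labelled and unlabelled samples) and the
    intra-class constraint (between a sample and its augmentations).
    """
    p = softmax(np.asarray(p_logits, dtype=float))
    q = softmax(np.asarray(q_logits, dtype=float))
    kl_pq = np.sum(p * (np.log(p) - np.log(q)), axis=-1)  # KL(p || q)
    kl_qp = np.sum(q * (np.log(q) - np.log(p)), axis=-1)  # KL(q || p)
    return float(0.5 * np.mean(kl_pq + kl_qp))
```

Intuitively, the inter-class constraint would push this divergence up between labelled and unlabelled predictions (separability of disjoint classes), while the intra-class constraint would push it down between a sample and its augmentations (consistency); see the paper and repository for the exact formulation.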
Related papers
- SMILe: Leveraging Submodular Mutual Information For Robust Few-Shot Object Detection [2.0755366440393743]
Confusion and forgetting of object classes have been challenges of prime interest in Few-Shot Object Detection (FSOD).
We introduce a novel Submodular Mutual Information Learning framework which adopts mutual information functions.
Our proposed approach generalizes to several existing approaches in FSOD, agnostic of the backbone architecture.
arXiv Detail & Related papers (2024-07-02T20:53:43Z)
- Exclusive Style Removal for Cross Domain Novel Class Discovery [15.868889486516306]
Novel Class Discovery (NCD) is a promising field in open-world learning.
We introduce an exclusive style removal module for extracting style information that is distinctive from the baseline features.
This module is easy to integrate with other NCD methods, acting as a plug-in to improve performance on novel classes with different distributions.
arXiv Detail & Related papers (2024-06-26T07:44:27Z)
- Few-shot Tuning of Foundation Models for Class-incremental Learning [19.165004570789755]
We propose CoACT, a new approach to continually tune foundation models for new classes in few-shot settings.
CoACT shows up to 13.5% improvement in standard FSCIL over the current SOTA on benchmark evaluations.
arXiv Detail & Related papers (2024-05-26T16:41:03Z)
- Towards Continual Learning Desiderata via HSIC-Bottleneck Orthogonalization and Equiangular Embedding [55.107555305760954]
We propose a conceptually simple yet effective method that attributes forgetting to layer-wise parameter overwriting and the resulting decision boundary distortion.
Our method achieves competitive accuracy while requiring zero exemplar buffer and only 1.02x the size of the base model.
arXiv Detail & Related papers (2024-01-17T09:01:29Z)
- NEV-NCD: Negative Learning, Entropy, and Variance regularization based novel action categories discovery [23.17093125627668]
Novel Categories Discovery (NCD) facilitates learning from a partially annotated label space.
We propose a novel single-stage joint optimization-based NCD method, Negative learning, Entropy, and Variance regularization NCD.
We demonstrate the efficacy of NEV-NCD in previously unexplored NCD applications of video action recognition.
arXiv Detail & Related papers (2023-04-14T19:20:26Z)
- Large-scale Pre-trained Models are Surprisingly Strong in Incremental Novel Class Discovery [76.63807209414789]
We challenge the status quo in class-iNCD and propose a learning paradigm where class discovery occurs continually and in a truly unsupervised manner.
We propose simple baselines, composed of a frozen PTM backbone and a learnable linear classifier, that are not only simple to implement but also resilient under longer learning scenarios.
arXiv Detail & Related papers (2023-03-28T13:47:16Z)
- Rethinking Clustering-Based Pseudo-Labeling for Unsupervised Meta-Learning [146.11600461034746]
CACTUs, a method for unsupervised meta-learning, is a clustering-based approach with pseudo-labeling.
This approach is model-agnostic and can be combined with supervised algorithms to learn from unlabeled data.
We prove that the core reason for this is the lack of a clustering-friendly property in the embedding space.
arXiv Detail & Related papers (2022-09-27T19:04:36Z)
- Spacing Loss for Discovering Novel Categories [72.52222295216062]
Novel Class Discovery (NCD) is a learning paradigm, where a machine learning model is tasked to semantically group instances from unlabeled data.
We first characterize existing NCD approaches into single-stage and two-stage methods based on whether they require access to labeled and unlabeled data together.
We devise a simple yet powerful loss function that enforces separability in the latent space using cues from multi-dimensional scaling.
arXiv Detail & Related papers (2022-04-22T09:37:11Z)
- Novel Class Discovery in Semantic Segmentation [104.30729847367104]
We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS).
It aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes.
In NCDSS, we need to distinguish the objects and background, and to handle the existence of multiple classes within an image.
We propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels.
arXiv Detail & Related papers (2021-12-03T13:31:59Z)
- Boosting the Generalization Capability in Cross-Domain Few-shot Learning via Noise-enhanced Supervised Autoencoder [23.860842627883187]
We teach the model to capture broader variations of the feature distributions with a novel noise-enhanced supervised autoencoder (NSAE).
NSAE trains the model by jointly reconstructing inputs and predicting the labels of inputs as well as their reconstructed pairs.
We also take advantage of NSAE structure and propose a two-step fine-tuning procedure that achieves better adaption and improves classification performance in the target domain.
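The NSAE joint objective above (reconstruct the input, and classify both the input and its reconstruction) can be sketched as follows. The linear layers, shapes, and names here are assumptions made for illustration, not the authors' architecture:

```python
import numpy as np

def log_softmax(z):
    # Numerically stable log-softmax over the class dimension.
    z = z - z.max(axis=1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=1, keepdims=True))

def forward(x, W_enc, W_dec, W_cls):
    z = np.tanh(x @ W_enc)                        # encoder
    x_rec = z @ W_dec                             # decoder: reconstruction
    logits = z @ W_cls                            # classify the input's code
    logits_rec = np.tanh(x_rec @ W_enc) @ W_cls   # classify the reconstruction
    return x_rec, logits, logits_rec

def joint_loss(x, y_onehot, W_enc, W_dec, W_cls, lam=1.0):
    # Jointly reconstruct inputs and predict labels of both the inputs
    # and their reconstructed pairs, as the NSAE summary describes.
    x_rec, logits, logits_rec = forward(x, W_enc, W_dec, W_cls)
    rec = np.mean((x - x_rec) ** 2)
    ce = -np.mean(np.sum(y_onehot * log_softmax(logits), axis=1))
    ce_rec = -np.mean(np.sum(y_onehot * log_softmax(logits_rec), axis=1))
    return rec + lam * (ce + ce_rec)
```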
arXiv Detail & Related papers (2021-08-11T04:45:56Z)
- Generalized Zero-Shot Learning Via Over-Complete Distribution [79.5140590952889]
We propose to generate an Over-Complete Distribution (OCD) using a Conditional Variational Autoencoder (CVAE) of both seen and unseen classes.
The effectiveness of the framework is evaluated using both Zero-Shot Learning and Generalized Zero-Shot Learning protocols.
arXiv Detail & Related papers (2020-04-01T19:05:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.