Exclusive Style Removal for Cross Domain Novel Class Discovery
- URL: http://arxiv.org/abs/2406.18140v4
- Date: Tue, 24 Jun 2025 09:03:27 GMT
- Title: Exclusive Style Removal for Cross Domain Novel Class Discovery
- Authors: Yicheng Wang, Feng Liu, Junmin Liu, Kai Sun
- Abstract summary: Novel Class Discovery (NCD) is usually a task to cluster unseen novel classes in an unlabeled set. We introduce an exclusive style removal module for extracting style information that is distinctive from the baseline features. This module is easy to integrate with other NCD methods, acting as a plug-in to improve performance on novel classes with different distributions.
- Score: 15.868889486516306
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As a promising field in open-world learning, \textit{Novel Class Discovery} (NCD) is typically the task of clustering unseen novel classes in an unlabeled set based on prior knowledge from labeled data within the same domain. However, the performance of existing NCD methods can be severely compromised when the novel classes are sampled from a different distribution than the labeled ones. In this paper, we explore and establish the solvability of NCD in the cross-domain setting, under the necessary condition that the style information must be removed. Building on this theoretical analysis, we introduce an exclusive style removal module that extracts style information distinct from the baseline features, thereby facilitating inference. Moreover, this module is easy to integrate with other NCD methods, acting as a plug-in that improves performance on novel classes whose distribution differs from the labeled set. Additionally, recognizing the non-negligible influence of different backbones and pre-training strategies on the performance of NCD methods, we build a fair benchmark for future NCD research. Extensive experiments on three common datasets demonstrate the effectiveness of the proposed style removal strategy.
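To make the plug-in idea concrete, the sketch below is a minimal, hypothetical PyTorch rendering (module names, dimensions, and the decorrelation loss are assumptions, not the authors' released implementation): a small head splits backbone features into a content branch, which feeds the usual NCD clustering objective, and a style branch, with a penalty that keeps the two branches decorrelated so that style information stays exclusive to the style branch.

```python
# Minimal sketch only: module names, dimensions, and the decorrelation loss are
# illustrative assumptions, not the authors' released implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExclusiveStyleRemoval(nn.Module):
    """Hypothetical plug-in that splits backbone features into a content branch
    (used by the NCD clustering head) and a style branch, then penalizes the
    cross-correlation between the two so style stays exclusive."""

    def __init__(self, feat_dim: int = 512, style_dim: int = 128):
        super().__init__()
        self.content_head = nn.Linear(feat_dim, feat_dim)
        self.style_head = nn.Linear(feat_dim, style_dim)

    def forward(self, features: torch.Tensor):
        content = self.content_head(features)  # forwarded to the NCD objective
        style = self.style_head(features)      # exclusive style information
        return content, style

    @staticmethod
    def decorrelation_loss(content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        # Center and normalize each feature dimension over the batch, then
        # minimize the squared cross-correlation between content and style.
        c = F.normalize(content - content.mean(dim=0), dim=0)
        s = F.normalize(style - style.mean(dim=0), dim=0)
        cross_corr = c.T @ s
        return (cross_corr ** 2).mean()


if __name__ == "__main__":
    backbone_feats = torch.randn(32, 512)      # stand-in for backbone output
    module = ExclusiveStyleRemoval()
    content, style = module(backbone_feats)
    loss_style = module.decorrelation_loss(content, style)
    # total_loss = ncd_loss(content, ...) + lambda_style * loss_style
    print(loss_style.item())
```

Under this reading, the decorrelation term stands in for the "exclusive" constraint on the style features; the paper's actual formulation and training schedule may differ.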
Related papers
- Generate, Refine, and Encode: Leveraging Synthesized Novel Samples for On-the-Fly Fine-Grained Category Discovery [64.83837781610907]
We investigate the online identification of newly arriving stream data that may belong to both known and unknown categories. Existing OCD methods are devoted to fully mining transferable knowledge from only labeled data. We propose a diffusion-based OCD framework, dubbed DiffGRE, which integrates attribute-composition generation, refinement, and supervised recognition.
arXiv Detail & Related papers (2025-07-05T14:20:49Z) - NeurNCD: Novel Class Discovery via Implicit Neural Representation [4.498082064000176]
NeurNCD is a versatile and data-efficient framework for novel class discovery. Our framework achieves superior segmentation performance in both open- and closed-world settings. Our method significantly outperforms state-of-the-art approaches on the NYUv2 and Replica datasets.
arXiv Detail & Related papers (2025-06-06T16:43:34Z) - Generalized Semantic Contrastive Learning via Embedding Side Information for Few-Shot Object Detection [52.490375806093745]
The objective of few-shot object detection (FSOD) is to detect novel objects with few training samples.
We introduce the side information to alleviate the negative influences derived from the feature space and sample viewpoints.
Our model outperforms the previous state-of-the-art methods, significantly improving the ability of FSOD in most shots/splits.
arXiv Detail & Related papers (2025-04-09T17:24:05Z) - Uncertainty-guided Open-Set Source-Free Unsupervised Domain Adaptation with Target-private Class Segregation [22.474866164542302]
UDA approaches commonly assume that source and target domains share the same label space.
This paper considers the more challenging Source-Free Open-set Domain Adaptation (SF-OSDA) setting.
We propose a novel approach for SF-OSDA that exploits the granularity of target-private categories by segregating their samples into multiple unknown classes.
arXiv Detail & Related papers (2024-04-16T13:52:00Z) - NEV-NCD: Negative Learning, Entropy, and Variance regularization based novel action categories discovery [23.17093125627668]
Novel Categories Discovery (NCD) facilitates learning from a partially annotated label space.
We propose NEV-NCD, a novel single-stage joint optimization-based NCD method combining negative learning, entropy, and variance regularization.
We demonstrate the efficacy of NEV-NCD in previously unexplored NCD applications of video action recognition.
arXiv Detail & Related papers (2023-04-14T19:20:26Z) - Large-scale Pre-trained Models are Surprisingly Strong in Incremental Novel Class Discovery [76.63807209414789]
We challenge the status quo in class-iNCD and propose a learning paradigm where class discovery occurs continuously and in a truly unsupervised manner.
We propose simple baselines, composed of a frozen PTM backbone and a learnable linear classifier, that are not only simple to implement but also resilient under longer learning scenarios.
arXiv Detail & Related papers (2023-03-28T13:47:16Z) - Modeling Inter-Class and Intra-Class Constraints in Novel Class Discovery [20.67503042774617]
Novel class discovery (NCD) aims at learning a model that transfers the common knowledge from a class-disjoint labelled dataset to another unlabelled dataset.
We propose to model both inter-class and intra-class constraints in NCD based on the symmetric Kullback-Leibler divergence (sKLD).
arXiv Detail & Related papers (2022-10-07T14:46:32Z) - A Method for Discovering Novel Classes in Tabular Data [54.11148718494725]
In Novel Class Discovery (NCD), the goal is to find new classes in an unlabeled set given a labeled set of known but different classes.
We show a way to extract knowledge from already known classes to guide the discovery process of novel classes in heterogeneous data.
arXiv Detail & Related papers (2022-09-02T11:45:24Z) - Class-incremental Novel Class Discovery [76.35226130521758]
We study the new task of class-incremental Novel Class Discovery (class-iNCD).
We propose a novel approach for class-iNCD which prevents forgetting of past information about the base classes.
Our experiments, conducted on three common benchmarks, demonstrate that our method significantly outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2022-07-18T13:49:27Z) - Domain Adaptive Nuclei Instance Segmentation and Classification via Category-aware Feature Alignment and Pseudo-labelling [65.40672505658213]
We propose a novel deep neural network, namely Category-Aware feature alignment and Pseudo-Labelling Network (CAPL-Net) for UDA nuclei instance segmentation and classification.
Our approach outperforms state-of-the-art UDA methods by a remarkable margin.
arXiv Detail & Related papers (2022-07-04T07:05:06Z) - Spacing Loss for Discovering Novel Categories [72.52222295216062]
Novel Class Discovery (NCD) is a learning paradigm in which a machine learning model is tasked with semantically grouping instances from unlabeled data.
We first categorize existing NCD approaches into single-stage and two-stage methods based on whether they require access to labeled and unlabeled data together.
We devise a simple yet powerful loss function that enforces separability in the latent space using cues from multi-dimensional scaling.
arXiv Detail & Related papers (2022-04-22T09:37:11Z) - Novel Class Discovery in Semantic Segmentation [104.30729847367104]
We introduce a new setting of Novel Class Discovery in Semantic Segmentation (NCDSS).
It aims at segmenting unlabeled images containing new classes given prior knowledge from a labeled set of disjoint classes.
In NCDSS, we need to distinguish objects from the background and handle the existence of multiple classes within an image.
We propose the Entropy-based Uncertainty Modeling and Self-training (EUMS) framework to overcome noisy pseudo-labels.
arXiv Detail & Related papers (2021-12-03T13:31:59Z) - Meta-FDMixup: Cross-Domain Few-Shot Learning Guided by Labeled Target
Data [95.47859525676246]
A recent study finds that existing few-shot learning methods, trained on the source domain, fail to generalize to the novel target domain when a domain gap is observed.
In this paper, we realize that the labeled target data in Cross-Domain Few-Shot Learning has not been leveraged in any way to help the learning process.
arXiv Detail & Related papers (2021-07-26T06:15:45Z)