NEV-NCD: Negative Learning, Entropy, and Variance regularization based
novel action categories discovery
- URL: http://arxiv.org/abs/2304.07354v1
- Date: Fri, 14 Apr 2023 19:20:26 GMT
- Title: NEV-NCD: Negative Learning, Entropy, and Variance regularization based
novel action categories discovery
- Authors: Zahid Hasan, Masud Ahmed, Abu Zaher Md Faridee, Sanjay Purushotham,
Heesung Kwon, Hyungtae Lee, Nirmalya Roy
- Abstract summary: Novel Categories Discovery (NCD) facilitates learning from a partially annotated label space.
We propose a novel single-stage joint optimization-based NCD method: Negative learning, Entropy, and Variance regularization NCD (NEV-NCD).
We demonstrate the efficacy of NEV-NCD in previously unexplored NCD applications of video action recognition.
- Score: 23.17093125627668
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Novel Categories Discovery (NCD) facilitates learning from a partially
annotated label space and enables deep learning (DL) models to operate in an
open-world setting by identifying and differentiating instances of novel
classes based on notions learned from the labeled data. One of the primary
assumptions of NCD is that the novel label space is perfectly disjoint from the
known one and can be equipartitioned, an assumption that most NCD approaches
rarely realize in practice.
To better align with this assumption, we propose a novel single-stage joint
optimization-based NCD method, Negative learning, Entropy, and Variance
regularization NCD (NEV-NCD). We demonstrate the efficacy of NEV-NCD in
previously unexplored NCD applications of video action recognition (VAR) with
the public UCF101 dataset and a curated in-house partial action-space annotated
multi-view video dataset. We perform a thorough ablation study by varying the
composition of the final joint loss and the associated hyper-parameters. In our
experiments with UCF101 and the multi-view action dataset, NEV-NCD achieves
~83% classification accuracy on test instances of labeled data and ~70%
clustering accuracy over unlabeled data, outperforming both naive baselines
(by ~40%) and state-of-the-art pseudo-labeling-based approaches (by ~3.5%) on
both datasets. Further, we propose to incorporate optional view-invariant
feature learning with the multiview dataset to identify novel categories from
novel viewpoints. Our additional view-invariance constraint improves the
discriminative accuracy for both known and unknown categories by ~10% for
novel viewpoints.
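The three regularizers named in the title can be sketched as follows. This is a hypothetical illustration only: the function names, the exact formulations, and the complementary-label scheme are assumptions, not the paper's actual losses.

```python
import numpy as np

def softmax(logits):
    """Row-wise softmax over class logits."""
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def negative_learning_loss(logits, complementary_labels):
    """Negative learning: push down the probability of a class each
    sample is known NOT to belong to (a complementary label)."""
    probs = softmax(logits)
    p_neg = probs[np.arange(len(logits)), complementary_labels]
    return float(-np.log(1.0 - p_neg + 1e-8).mean())

def entropy_regularization(logits):
    """Entropy term: low values mean confident per-sample predictions."""
    probs = softmax(logits)
    return float(-(probs * np.log(probs + 1e-8)).sum(axis=1).mean())

def variance_regularization(logits):
    """Variance term: penalize deviation of the batch-mean prediction
    from a uniform marginal, encouraging equipartitioned clusters."""
    mean_probs = softmax(logits).mean(axis=0)
    return float(mean_probs.var())
```

A joint objective would combine a supervised loss on labeled data with a weighted sum of terms like these; the weights would correspond to the hyper-parameters ablated in the paper.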
Related papers
- Exclusive Style Removal for Cross Domain Novel Class Discovery [15.868889486516306]
Novel Class Discovery (NCD) is a promising field in open-world learning.
We introduce an exclusive style removal module for extracting style information that is distinctive from the baseline features.
This module is easy to integrate with other NCD methods, acting as a plug-in to improve performance on novel classes with different distributions.
arXiv Detail & Related papers (2024-06-26T07:44:27Z) - Federated Continual Novel Class Learning [68.05835753892907]
We propose a Global Alignment Learning (GAL) framework that can accurately estimate the global novel class number.
GAL achieves significant improvements in novel-class performance, increasing accuracy by 5.1% to 10.6%.
GAL is shown to be effective in equipping a variety of mainstream Federated Learning algorithms with novel class discovery and learning capability.
arXiv Detail & Related papers (2023-12-21T00:31:54Z) - A Practical Approach to Novel Class Discovery in Tabular Data [38.41548083078336]
Novel Class Discovery (NCD) is a problem of extracting knowledge from a labeled set of known classes to accurately partition an unlabeled set of novel classes.
In this work, we propose to tune the hyperparameters of NCD methods by adapting the $k$-fold cross-validation process and hiding some of the known classes in each fold.
We find that the latent space of this method can be used to reliably estimate the number of novel classes.
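The fold-construction idea described above can be sketched by partitioning the known class set itself. The names and the slicing scheme here are illustrative assumptions, not the paper's code:

```python
import numpy as np

def hidden_class_folds(known_classes, k, seed=0):
    """Split the known classes into k folds; in fold i, the classes in
    that fold are hidden and treated as 'novel' when validating NCD
    hyperparameters, while the remaining classes stay labeled."""
    classes = np.array(sorted(known_classes))
    rng = np.random.default_rng(seed)
    rng.shuffle(classes)
    return [set(classes[i::k].tolist()) for i in range(k)]
```

Each fold then yields a labeled set (the remaining classes) and a pseudo-novel set with ground-truth labels, so clustering accuracy on "novel" classes can be measured directly.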
arXiv Detail & Related papers (2023-11-09T15:24:44Z) - Dynamic Conceptional Contrastive Learning for Generalized Category
Discovery [76.82327473338734]
Generalized category discovery (GCD) aims to automatically cluster partially labeled data.
Unlabeled data contain instances that are not only from known categories of the labeled data but also from novel categories.
One effective approach to GCD is applying self-supervised learning to learn discriminative representations for unlabeled data.
We propose a Dynamic Conceptional Contrastive Learning framework, which can effectively improve clustering accuracy.
arXiv Detail & Related papers (2023-03-30T14:04:39Z) - Large-scale Pre-trained Models are Surprisingly Strong in Incremental Novel Class Discovery [76.63807209414789]
We challenge the status quo in class-iNCD and propose a learning paradigm where class discovery occurs continuously and in a truly unsupervised manner.
We propose simple baselines, composed of a frozen PTM backbone and a learnable linear classifier, that are not only simple to implement but also resilient under longer learning scenarios.
arXiv Detail & Related papers (2023-03-28T13:47:16Z) - PromptCAL: Contrastive Affinity Learning via Auxiliary Prompts for
Generalized Novel Category Discovery [39.03732147384566]
Generalized Novel Category Discovery (GNCD) setting aims to categorize unlabeled training data coming from known and novel classes.
We propose Contrastive Affinity Learning method with auxiliary visual Prompts, dubbed PromptCAL, to address this challenging problem.
Our approach discovers reliable pairwise sample affinities to learn better semantic clustering of both known and novel classes for the class token and visual prompts.
arXiv Detail & Related papers (2022-12-11T20:06:14Z) - Modeling Inter-Class and Intra-Class Constraints in Novel Class
Discovery [20.67503042774617]
Novel class discovery (NCD) aims at learning a model that transfers the common knowledge from a class-disjoint labelled dataset to another unlabelled dataset.
We propose to model both inter-class and intra-class constraints in NCD based on the symmetric Kullback-Leibler divergence (sKLD)
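The symmetric Kullback-Leibler divergence itself is standard; a minimal sketch follows (how the paper applies it to inter-class and intra-class pairs is not reproduced here):

```python
import numpy as np

def skld(p, q, eps=1e-8):
    """Symmetric Kullback-Leibler divergence between two discrete
    distributions: sKLD(p, q) = KL(p || q) + KL(q || p)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    return float((p * np.log(p / q)).sum() + (q * np.log(q / p)).sum())
```

Unlike plain KL divergence, sKLD is symmetric in its arguments, which makes it a natural choice for constraints between pairs of class distributions.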
arXiv Detail & Related papers (2022-10-07T14:46:32Z) - A Method for Discovering Novel Classes in Tabular Data [54.11148718494725]
In Novel Class Discovery (NCD), the goal is to find new classes in an unlabeled set given a labeled set of known but different classes.
We show a way to extract knowledge from already known classes to guide the discovery process of novel classes in heterogeneous data.
arXiv Detail & Related papers (2022-09-02T11:45:24Z) - Spacing Loss for Discovering Novel Categories [72.52222295216062]
Novel Class Discovery (NCD) is a learning paradigm, where a machine learning model is tasked to semantically group instances from unlabeled data.
We first characterize existing NCD approaches into single-stage and two-stage methods based on whether they require access to labeled and unlabeled data together.
We devise a simple yet powerful loss function that enforces separability in the latent space using cues from multi-dimensional scaling.
arXiv Detail & Related papers (2022-04-22T09:37:11Z) - Neighborhood Contrastive Learning for Novel Class Discovery [79.14767688903028]
We build a new framework, named Neighborhood Contrastive Learning, to learn discriminative representations that are important to clustering performance.
We experimentally demonstrate that these two ingredients significantly contribute to clustering performance and lead our model to outperform state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2021-06-20T17:34:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.