Unseen Classes at a Later Time? No Problem
- URL: http://arxiv.org/abs/2203.16517v1
- Date: Wed, 30 Mar 2022 17:52:16 GMT
- Title: Unseen Classes at a Later Time? No Problem
- Authors: Hari Chandana Kuchibhotla, Sumitra S Malagi, Shivam Chandhok, Vineeth N Balasubramanian
- Abstract summary: We propose a new Online-CGZSL setting which is more practical and flexible.
We introduce a unified feature-generative framework for CGZSL that leverages bi-directional incremental alignment to dynamically adapt to the addition of new classes, with or without labeled data, that arrive over time in any of these CGZSL settings.
- Score: 17.254973125515402
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent progress towards learning from limited supervision has encouraged
efforts towards designing models that can recognize novel classes at test time
(generalized zero-shot learning or GZSL). GZSL approaches assume knowledge of
all classes, with or without labeled data, beforehand. However, practical
scenarios demand models that are adaptable and can handle dynamic addition of
new seen and unseen classes on the fly (that is continual generalized zero-shot
learning or CGZSL). One solution is to sequentially retrain and reuse
conventional GZSL methods; however, such an approach suffers from catastrophic
forgetting, leading to suboptimal generalization performance. A few recent
efforts towards tackling CGZSL have been limited by differences in settings,
practicality, data splits, and protocols followed, inhibiting fair comparison
and a clear direction forward. Motivated by these observations, in this work,
we first consolidate the different CGZSL setting variants and propose a new
Online-CGZSL setting which is more practical and flexible. Second, we
introduce a unified feature-generative framework for CGZSL that leverages
bi-directional incremental alignment to dynamically adapt to the addition of new
classes, with or without labeled data, that arrive over time in any of these
CGZSL settings. Our comprehensive experiments and analysis on five benchmark
datasets and comparison with baselines show that our approach consistently
outperforms existing methods, especially on the more practical Online setting.
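The continual mechanism the abstract describes, synthesizing features for classes seen earlier so that the model keeps recognizing them as new classes arrive, can be illustrated with a deliberately minimal sketch. The generator here is just a least-squares map from class attributes to feature means; the paper's actual framework, including bi-directional incremental alignment, is considerably more involved, and all attribute vectors and features below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_generator(attrs, feats):
    """Least-squares map W: attribute vector -> mean visual feature."""
    W, *_ = np.linalg.lstsq(attrs, feats, rcond=None)
    return W

def synthesize(W, attrs, n_per_class, noise=0.1):
    """Replay: sample pseudo-features for (possibly old) classes."""
    means = attrs @ W
    return np.vstack([m + noise * rng.standard_normal((n_per_class, means.shape[1]))
                      for m in means])

# --- task 1: two seen classes with real features ---
A1 = np.array([[1.0, 0.0], [0.0, 1.0]])          # attributes of classes 0, 1
X1 = np.vstack([A1[0] + 0.05 * rng.standard_normal((20, 2)),
                A1[1] + 0.05 * rng.standard_normal((20, 2))])
W = fit_generator(np.repeat(A1, 20, axis=0), X1)

# --- task 2: a new class arrives; the old real data is gone ---
A2 = np.array([[1.0, 1.0]])                      # attributes of class 2
X2 = A2[0] + 0.05 * rng.standard_normal((20, 2))

# replay synthesized features for classes 0 and 1 in place of their real data
X_replay = synthesize(W, A1, n_per_class=20)
attrs_all = np.vstack([np.repeat(A1, 20, axis=0), np.repeat(A2, 20, axis=0)])
W = fit_generator(attrs_all, np.vstack([X_replay, X2]))

# nearest-class-mean classification over all three classes
means = np.vstack([A1, A2]) @ W

def predict(x):
    return int(np.argmin(np.linalg.norm(means - x, axis=1)))

print(predict(np.array([1.0, 1.0])))  # a feature near class 2's prototype -> 2
```

Without the replay step, refitting on task 2 alone would collapse the generator toward the new class, which is the catastrophic-forgetting failure mode the abstract points to.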
Related papers
- Erasing the Bias: Fine-Tuning Foundation Models for Semi-Supervised Learning [4.137391543972184]
Semi-supervised learning (SSL) has witnessed remarkable progress, resulting in numerous method variations.
In this paper, we present a novel SSL approach named FineSSL that significantly addresses this limitation by adapting pre-trained foundation models.
We demonstrate that FineSSL sets a new state of the art for SSL on multiple benchmark datasets, reduces the training cost by over six times, and can seamlessly integrate various fine-tuning and modern SSL algorithms.
arXiv Detail & Related papers (2024-05-20T03:33:12Z)
- Zero-Shot Logit Adjustment [89.68803484284408]
Generalized Zero-Shot Learning (GZSL) is a semantic-descriptor-based learning technique.
In this paper, we propose a new generation-based technique to enhance the generator's effect while neglecting the improvement of the classifier.
Our experiments demonstrate that the proposed technique achieves state-of-the-art when combined with the basic generator, and it can improve various generative zero-shot learning frameworks.
arXiv Detail & Related papers (2022-04-25T17:54:55Z)
- Learning What Not to Segment: A New Perspective on Few-Shot Segmentation [63.910211095033596]
Recently few-shot segmentation (FSS) has been extensively developed.
This paper proposes a fresh and straightforward insight to alleviate the problem.
In light of the unique nature of the proposed approach, we also extend it to a more realistic but challenging setting.
arXiv Detail & Related papers (2022-03-15T03:08:27Z)
- A Strong Baseline for Semi-Supervised Incremental Few-Shot Learning [54.617688468341704]
Few-shot learning aims to learn models that generalize to novel classes with limited training samples.
We propose a novel paradigm containing two parts: (1) a well-designed meta-training algorithm for mitigating ambiguity between base and novel classes caused by unreliable pseudo labels and (2) a model adaptation mechanism to learn discriminative features for novel classes while preserving base knowledge using few labeled and all the unlabeled data.
arXiv Detail & Related papers (2021-10-21T13:25:52Z)
- FREE: Feature Refinement for Generalized Zero-Shot Learning [86.41074134041394]
Generalized zero-shot learning (GZSL) has achieved significant progress, with many efforts dedicated to overcoming the problems of visual-semantic domain gap and seen-unseen bias.
Most existing methods directly use feature extraction models trained on ImageNet alone, ignoring the cross-dataset bias between ImageNet and GZSL benchmarks.
We propose a simple yet effective GZSL method, termed feature refinement for generalized zero-shot learning (FREE) to tackle the above problem.
arXiv Detail & Related papers (2021-07-29T08:11:01Z)
- Contrastive Embedding for Generalized Zero-Shot Learning [22.050109158293402]
Generalized zero-shot learning (GZSL) aims to recognize objects from both seen and unseen classes.
Recent feature generation methods learn a generative model that can synthesize the missing visual features of unseen classes.
We propose to integrate the generation model with the embedding model, yielding a hybrid GZSL framework.
arXiv Detail & Related papers (2021-03-30T08:54:03Z)
- Meta-Learned Attribute Self-Gating for Continual Generalized Zero-Shot Learning [82.07273754143547]
We propose a meta-continual zero-shot learning (MCZSL) approach to generalizing a model to categories unseen during training.
By pairing self-gating of attributes and scaled class normalization with meta-learning based training, we are able to outperform state-of-the-art results.
arXiv Detail & Related papers (2021-02-23T18:36:14Z)
- Generative Replay-based Continual Zero-Shot Learning [7.909034037183046]
We develop a generative replay-based continual ZSL (GRCZSL) method.
The proposed method enables traditional ZSL to learn from streaming data and acquire new knowledge without forgetting the previous tasks' experience.
The proposed GRCZSL method is developed for a single-head setting of continual learning, simulating a real-world problem setting.
arXiv Detail & Related papers (2021-01-22T00:03:34Z)
- On Data-Augmentation and Consistency-Based Semi-Supervised Learning [77.57285768500225]
Recently proposed consistency-based Semi-Supervised Learning (SSL) methods have advanced the state of the art in several SSL tasks.
Despite these advances, the understanding of these methods is still relatively limited.
arXiv Detail & Related papers (2021-01-18T10:12:31Z)
- Generalized Continual Zero-Shot Learning [7.097782028036196]
Zero-shot learning (ZSL) aims to classify unseen classes by transferring knowledge from seen to unseen classes based on the class description.
We propose a more general and practical setup for ZSL, where classes arrive sequentially in the form of a task.
We use knowledge distillation and store and replay a few samples from previous tasks using a small episodic memory.
arXiv Detail & Related papers (2020-11-17T08:47:54Z)
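The episodic-memory idea in the last entry, keeping a few samples per past task and replaying them alongside the current task's data, can be sketched minimally. The class, buffer sizes, and sample names below are illustrative assumptions, not details from the paper.

```python
import random

class EpisodicMemory:
    """Tiny per-task buffer: keep at most `per_task` samples from each past task."""
    def __init__(self, per_task=5, seed=0):
        self.per_task = per_task
        self.buffers = {}            # task_id -> list of (x, y) pairs
        self.rng = random.Random(seed)

    def store(self, task_id, samples):
        # keep a random subset of at most `per_task` samples from this task
        k = min(self.per_task, len(samples))
        self.buffers[task_id] = self.rng.sample(samples, k)

    def replay_batch(self):
        # flatten all stored samples; these get mixed into each new task's batches
        return [s for buf in self.buffers.values() for s in buf]

mem = EpisodicMemory(per_task=2)
mem.store(0, [("img_a", 0), ("img_b", 0), ("img_c", 1)])   # task 0
mem.store(1, [("img_d", 2), ("img_e", 3)])                 # task 1
batch = mem.replay_batch()
print(len(batch))  # 4: two kept from task 0, two from task 1
```

In the paper's setup such replayed samples are combined with knowledge distillation from the previous model; the buffer alone only shows the storage-and-replay half of that recipe.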
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.