Generalized Continual Zero-Shot Learning
- URL: http://arxiv.org/abs/2011.08508v3
- Date: Mon, 1 Feb 2021 02:28:33 GMT
- Title: Generalized Continual Zero-Shot Learning
- Authors: Chandan Gautam, Sethupathy Parameswaran, Ashish Mishra, Suresh
Sundaram
- Abstract summary: Zero-shot learning (ZSL) aims to classify unseen classes by transferring knowledge from seen classes to unseen classes based on the class description.
We propose a more general and practical setup for ZSL, where classes arrive sequentially in the form of a task.
We use knowledge distillation, and store and replay a few samples from previous tasks using a small episodic memory.
- Score: 7.097782028036196
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, zero-shot learning (ZSL) emerged as an exciting topic and attracted
a lot of attention. ZSL aims to classify unseen classes by transferring the
knowledge from seen classes to unseen classes based on the class description.
Despite showing promising performance, ZSL approaches assume that the training
samples from all seen classes are available during training, which is often
infeasible in practice. To address this issue, we propose a more generalized
and practical setup for ZSL, i.e., continual ZSL (CZSL), where classes arrive
sequentially in the form of a task and the model actively learns from the
changing environment by leveraging past experience. Further, to enhance
reliability, we develop CZSL for a single-head continual learning setting,
where the task identity is revealed during training but not during testing. To
avoid catastrophic forgetting and intransigence, we use knowledge distillation
and store and replay a few samples from previous tasks using a small episodic
memory. We develop baselines and evaluate generalized CZSL on five ZSL
benchmark datasets for two different settings of continual learning: with and
without class-incremental learning. Moreover, CZSL is developed for two types
of variational autoencoders, which generate two types of features for
classification: (i) features generated at the output space and (ii)
discriminative features generated at the latent space. The experimental
results clearly indicate that single-head CZSL is more generalizable and
suitable for practical applications.
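The replay-and-distill recipe described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the class and function names (`EpisodicMemory`, `distillation_loss`) and the per-task buffer size are assumptions for the sketch.

```python
import math
import random

class EpisodicMemory:
    """Small buffer that stores a few (feature, label) samples per past task."""

    def __init__(self, per_task=5):
        self.per_task = per_task
        self.buffer = {}  # task_id -> list of (feature, label) pairs

    def store(self, task_id, samples):
        # Keep only a small random subset of the task's samples.
        k = min(self.per_task, len(samples))
        self.buffer[task_id] = random.sample(samples, k)

    def replay(self):
        # Mix stored samples from all previous tasks into the current batch.
        return [s for task_samples in self.buffer.values() for s in task_samples]

def distillation_loss(student_probs, teacher_probs, eps=1e-12):
    """Cross-entropy between the old model's (teacher) and the new model's
    (student) class distributions, averaged over samples. A low value means
    the new model has not drifted on previously learned classes."""
    total = 0.0
    for s, t in zip(student_probs, teacher_probs):
        total += -sum(ti * math.log(si + eps) for si, ti in zip(s, t))
    return total / len(student_probs)
```

During training on task *t*, the replayed samples would be appended to the current batch, and the distillation term would be added to the classification loss so the single-head model stays consistent with its pre-task-*t* predictions.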
Related papers
- A Survey on Self-supervised Learning: Algorithms, Applications, and Future Trends [82.64268080902742]
Self-supervised learning (SSL) aims to learn discriminative features from unlabeled data without relying on human-annotated labels.
SSL has garnered significant attention recently, leading to the development of numerous related algorithms.
This paper presents a review of diverse SSL methods, encompassing algorithmic aspects, application domains, three key trends, and open research questions.
arXiv Detail & Related papers (2023-01-13T14:41:05Z) - DiRaC-I: Identifying Diverse and Rare Training Classes for Zero-Shot
Learning [6.75714270653184]
It is intuitive that intelligently selecting the training classes from a dataset for Zero-Shot Learning (ZSL) can improve the performance of existing ZSL methods.
We propose a framework called Diverse and Rare Class Identifier (DiRaC-I) which, given an attribute-based dataset, can intelligently yield the most suitable "seen classes" for training ZSL models.
Our results demonstrate DiRaC-I helps ZSL models to achieve significant classification accuracy improvements.
arXiv Detail & Related papers (2022-12-31T16:05:09Z) - Unseen Classes at a Later Time? No Problem [17.254973125515402]
We propose a new Online-CGZSL setting which is more practical and flexible.
We introduce a unified feature-generative framework for CGZSL that leverages bi-directional incremental alignment to dynamically adapt to addition of new classes, with or without labeled data, that arrive over time in any of these CGZSL settings.
arXiv Detail & Related papers (2022-03-30T17:52:16Z) - A Strong Baseline for Semi-Supervised Incremental Few-Shot Learning [54.617688468341704]
Few-shot learning aims to learn models that generalize to novel classes with limited training samples.
We propose a novel paradigm containing two parts: (1) a well-designed meta-training algorithm for mitigating ambiguity between base and novel classes caused by unreliable pseudo labels and (2) a model adaptation mechanism to learn discriminative features for novel classes while preserving base knowledge using few labeled and all the unlabeled data.
arXiv Detail & Related papers (2021-10-21T13:25:52Z) - Generative Zero-Shot Learning for Semantic Segmentation of 3D Point
Cloud [79.99653758293277]
We present the first generative approach for both Zero-Shot Learning (ZSL) and Generalized ZSL (GZSL) on 3D data.
We show that it reaches or outperforms the state of the art on ModelNet40 classification for both inductive ZSL and inductive GZSL.
Our experiments show that our method outperforms strong baselines, which we additionally propose for this task.
arXiv Detail & Related papers (2021-08-13T13:29:27Z) - Online Lifelong Generalized Zero-Shot Learning [7.909034037183046]
Methods proposed in the literature for zero-shot learning (ZSL) are typically suitable for offline learning and cannot continually learn from sequential streaming data.
This paper proposes a task-free (i.e., task-agnostic) CZSL method, which does not require any task information during continual learning.
arXiv Detail & Related papers (2021-03-19T11:24:05Z) - Meta-Learned Attribute Self-Gating for Continual Generalized Zero-Shot
Learning [82.07273754143547]
We propose a meta-continual zero-shot learning (MCZSL) approach to generalizing a model to categories unseen during training.
By pairing self-gating of attributes and scaled class normalization with meta-learning based training, we are able to outperform state-of-the-art results.
arXiv Detail & Related papers (2021-02-23T18:36:14Z) - OntoZSL: Ontology-enhanced Zero-shot Learning [19.87808305218359]
The key to implementing Zero-shot Learning (ZSL) is to leverage prior knowledge of classes, which builds the semantic relationship between classes.
In this paper, we explore richer and more competitive prior knowledge to model the inter-class relationship for ZSL.
To address the data imbalance between seen classes and unseen classes, we develop a generative ZSL framework with Generative Adversarial Networks (GANs).
arXiv Detail & Related papers (2021-02-15T04:39:58Z) - End-to-end Generative Zero-shot Learning via Few-shot Learning [76.9964261884635]
State-of-the-art approaches to Zero-Shot Learning (ZSL) train generative nets to synthesize examples conditioned on the provided metadata.
We introduce an end-to-end generative ZSL framework that uses such an approach as a backbone and feeds its synthesized output to a Few-Shot Learning algorithm.
arXiv Detail & Related papers (2021-02-08T17:35:37Z) - Generative Replay-based Continual Zero-Shot Learning [7.909034037183046]
We develop a generative replay-based continual ZSL (GRCZSL) method.
The proposed method endows traditional ZSL to learn from streaming data and acquire new knowledge without forgetting the previous tasks' experience.
The proposed GRCZSL method is developed for a single-head setting of continual learning, simulating a real-world problem setting.
arXiv Detail & Related papers (2021-01-22T00:03:34Z) - Information Bottleneck Constrained Latent Bidirectional Embedding for
Zero-Shot Learning [59.58381904522967]
We propose a novel embedding based generative model with a tight visual-semantic coupling constraint.
We learn a unified latent space that calibrates the embedded parametric distributions of both visual and semantic spaces.
Our method can be easily extended to transductive ZSL setting by generating labels for unseen images.
arXiv Detail & Related papers (2020-09-16T03:54:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.