DiRaC-I: Identifying Diverse and Rare Training Classes for Zero-Shot
Learning
- URL: http://arxiv.org/abs/2301.00236v1
- Date: Sat, 31 Dec 2022 16:05:09 GMT
- Title: DiRaC-I: Identifying Diverse and Rare Training Classes for Zero-Shot
Learning
- Authors: Sandipan Sarma, Arijit Sur
- Abstract summary: It is intuitive that intelligently selecting the training classes from a dataset for Zero-Shot Learning (ZSL) can improve the performance of existing ZSL methods.
We propose a framework called Diverse and Rare Class Identifier (DiRaC-I) which, given an attribute-based dataset, can intelligently yield the most suitable "seen classes" for training ZSL models.
Our results demonstrate that DiRaC-I helps ZSL models achieve significant classification accuracy improvements.
- Score: 6.75714270653184
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inspired by strategies like Active Learning, it is intuitive that
intelligently selecting the training classes from a dataset for Zero-Shot
Learning (ZSL) can improve the performance of existing ZSL methods. In this
work, we propose a framework called Diverse and Rare Class Identifier (DiRaC-I)
which, given an attribute-based dataset, can intelligently yield the most
suitable "seen classes" for training ZSL models. DiRaC-I has two main goals -
constructing a diversified set of seed classes, followed by a visual-semantic
mining algorithm initialized by these seed classes that acquires the classes
capturing both diversity and rarity in the object domain adequately. These
classes can then be used as "seen classes" to train ZSL models for image
classification. We adopt a real-world scenario where novel object classes are
available to neither DiRaC-I nor the ZSL models during training and conducted
extensive experiments on two benchmark data sets for zero-shot image
classification - CUB and SUN. Our results demonstrate DiRaC-I helps ZSL models
to achieve significant classification accuracy improvements.
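The abstract does not spell out the selection procedure, but a minimal sketch of the two-stage idea might look like the following, assuming class-level attribute vectors are available, that diversity is approximated by spread in attribute space (k-means seeds plus distance to already-selected classes), and that rarity is approximated by distance from the dataset's mean attribute profile. The function name `select_seen_classes` and all hyperparameters are illustrative, not taken from the paper, and the sketch works purely in attribute space, whereas the paper's mining stage is visual-semantic.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_seen_classes(attributes, num_seeds=10, num_seen=150, rarity_weight=0.5):
    """Illustrative two-stage class selection (not the paper's exact algorithm).

    attributes : (num_classes, attr_dim) array of class-attribute vectors.
    Stage 1 builds a diversified seed set via k-means medoids; Stage 2 greedily
    adds classes that score high on a mix of diversity and rarity.
    """
    num_classes = attributes.shape[0]

    # Stage 1: seeds = classes closest to the k-means centroids in attribute space.
    km = KMeans(n_clusters=num_seeds, n_init=10, random_state=0).fit(attributes)
    selected = {int(np.argmin(np.linalg.norm(attributes - c, axis=1)))
                for c in km.cluster_centers_}

    # Rarity proxy: distance of a class from the dataset's mean attribute profile.
    rarity = np.linalg.norm(attributes - attributes.mean(axis=0), axis=1)
    rarity = rarity / (rarity.max() + 1e-12)

    # Stage 2: greedy mining of classes that are far from what is already selected
    # (diversity) and far from the "average" class (rarity).
    while len(selected) < min(num_seen, num_classes):
        sel_attrs = attributes[list(selected)]
        best, best_score = None, -np.inf
        for c in range(num_classes):
            if c in selected:
                continue
            diversity = np.min(np.linalg.norm(sel_attrs - attributes[c], axis=1))
            score = (1 - rarity_weight) * diversity + rarity_weight * rarity[c]
            if score > best_score:
                best, best_score = c, score
        selected.add(best)

    return sorted(selected)  # class indices to treat as "seen classes"
```

The returned indices would then play the role of "seen classes" when training any off-the-shelf ZSL model; the real DiRaC-I mining stage additionally exploits visual information, which this sketch omits.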
Related papers
- Targeted Attention for Generalized- and Zero-Shot Learning [0.0]
The Zero-Shot Learning (ZSL) task attempts to learn concepts without any labeled data.
We show state-of-the-art results in the Generalized Zero-Shot Learning (GZSL) setting, with Harmonic Mean R-1 of 66.14% on the CUB200 dataset.
arXiv Detail & Related papers (2022-11-17T03:55:18Z)
- Discriminative Region-based Multi-Label Zero-Shot Learning [145.0952336375342]
Multi-label zero-shot learning (ZSL) is a more realistic counterpart of standard single-label ZSL.
We propose an alternative approach to region-based discriminability-preserving ZSL.
arXiv Detail & Related papers (2021-08-20T17:56:47Z)
- Zero-shot Learning with Class Description Regularization [10.739164530098755]
We introduce a novel form of regularization that encourages generative ZSL models to pay more attention to the description of each category.
Our empirical results demonstrate improvements over the performance of multiple state-of-the-art models on the task of generalized zero-shot recognition and classification.
arXiv Detail & Related papers (2021-06-30T14:56:15Z)
- Attribute-Modulated Generative Meta Learning for Zero-Shot Classification [52.64680991682722]
We present the Attribute-Modulated generAtive meta-model for Zero-shot learning (AMAZ).
Our model consists of an attribute-aware modulation network and an attribute-augmented generative network.
Our empirical evaluations show that AMAZ improves state-of-the-art methods by 3.8% and 5.1% in ZSL and generalized ZSL settings, respectively.
arXiv Detail & Related papers (2021-04-22T04:16:43Z)
- Task Aligned Generative Meta-learning for Zero-shot Learning [64.16125851588437]
We propose a Task-aligned Generative Meta-learning model for Zero-shot learning (TGMZ).
TGMZ mitigates the potentially biased training and enables meta-ZSL to accommodate real-world datasets containing diverse distributions.
Our comparisons with state-of-the-art algorithms show improvements of 2.1%, 3.0%, 2.5%, and 7.6% achieved by TGMZ on the AWA1, AWA2, CUB, and aPY datasets, respectively.
arXiv Detail & Related papers (2021-03-03T05:18:36Z)
- Meta-Learned Attribute Self-Gating for Continual Generalized Zero-Shot Learning [82.07273754143547]
We propose a meta-continual zero-shot learning (MCZSL) approach to generalizing a model to categories unseen during training.
By pairing self-gating of attributes and scaled class normalization with meta-learning based training, we are able to outperform state-of-the-art methods.
arXiv Detail & Related papers (2021-02-23T18:36:14Z)
- OntoZSL: Ontology-enhanced Zero-shot Learning [19.87808305218359]
The key to implementing Zero-shot Learning (ZSL) is leveraging prior knowledge of classes, which builds the semantic relationships between classes.
In this paper, we explore richer and more competitive prior knowledge to model the inter-class relationship for ZSL.
To address the data imbalance between seen and unseen classes, we develop a generative ZSL framework based on Generative Adversarial Networks (GANs). (A rough sketch of this generative recipe appears after this list.)
arXiv Detail & Related papers (2021-02-15T04:39:58Z)
- Isometric Propagation Network for Generalized Zero-shot Learning [72.02404519815663]
A popular strategy is to learn a mapping between the semantic space of class attributes and the visual space of images based on the seen classes and their data. (A minimal sketch of this standard recipe appears after this list.)
We propose the Isometric Propagation Network (IPN), which learns to strengthen the relation between classes within each space and align the class dependency in the two spaces.
IPN achieves state-of-the-art performance on three popular Zero-shot learning benchmarks.
arXiv Detail & Related papers (2021-02-03T12:45:38Z)
- A Multi-class Approach -- Building a Visual Classifier based on Textual Descriptions using Zero-Shot Learning [0.34265828682659694]
We overcome two main hurdles of machine learning, i.e., scarcity of data and the constrained predictions of the classification model.
We train a classifier by mapping labelled images to their textual descriptions instead of training it for specific classes.
arXiv Detail & Related papers (2020-11-18T12:06:55Z)
- Generalized Continual Zero-Shot Learning [7.097782028036196]
Zero-shot learning (ZSL) aims to classify unseen classes by transferring knowledge from seen classes to unseen classes based on the class descriptions.
We propose a more general and practical setup for ZSL, where classes arrive sequentially in the form of a task.
We use knowledge distillation, and store and replay a few samples from previous tasks using a small episodic memory.
arXiv Detail & Related papers (2020-11-17T08:47:54Z)
- Attribute Propagation Network for Graph Zero-shot Learning [57.68486382473194]
We introduce the attribute propagation network (APNet), which is composed of 1) a graph propagation model generating an attribute vector for each class and 2) a parameterized nearest neighbor (NN) classifier.
APNet achieves either compelling performance or new state-of-the-art results in experiments with two zero-shot learning settings and five benchmark datasets.
arXiv Detail & Related papers (2020-09-24T16:53:40Z)
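Several entries above (e.g., the Isometric Propagation Network and the "Multi-class Approach" papers) build on the standard ZSL recipe of learning a mapping between class semantics (attributes or text embeddings) and visual features on the seen classes, then labelling images of unseen classes by the nearest mapped class prototype. Below is a minimal sketch of that recipe, assuming precomputed visual features and class-attribute vectors; the ridge-regression mapping and all names are illustrative choices, not taken from any of the listed papers.

```python
import numpy as np

def fit_attribute_to_visual_map(seen_attrs, seen_prototypes, reg=1.0):
    """Learn a ridge-regression map W from class attributes to visual prototypes.

    seen_attrs      : (num_seen, attr_dim) class-attribute vectors.
    seen_prototypes : (num_seen, feat_dim) mean visual feature of each seen class.
    """
    a = seen_attrs
    # Closed-form ridge solution: (A^T A + reg*I)^-1 A^T P
    return np.linalg.solve(a.T @ a + reg * np.eye(a.shape[1]), a.T @ seen_prototypes)

def classify_unseen(image_feats, unseen_attrs, w):
    """Assign each image to the unseen class whose mapped prototype is nearest."""
    prototypes = unseen_attrs @ w                             # (num_unseen, feat_dim)
    dists = np.linalg.norm(image_feats[:, None, :] - prototypes[None, :, :], axis=2)
    return np.argmin(dists, axis=1)                           # indices into unseen classes
```

Other entries (e.g., OntoZSL and AMAZ) instead follow a generative recipe: synthesize visual features for unseen classes conditioned on their semantics, then train an ordinary classifier on the synthetic features. The following is a rough sketch of the two networks involved and of the synthesis step, with the adversarial training loop omitted and all class and argument names assumed rather than taken from those papers.

```python
import torch
import torch.nn as nn

class FeatureGenerator(nn.Module):
    """Maps (noise, class attributes) to a synthetic visual feature."""
    def __init__(self, attr_dim, noise_dim, feat_dim, hidden=1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(attr_dim + noise_dim, hidden), nn.LeakyReLU(0.2),
            nn.Linear(hidden, feat_dim), nn.ReLU(),
        )

    def forward(self, noise, attrs):
        return self.net(torch.cat([noise, attrs], dim=1))

class FeatureCritic(nn.Module):
    """Scores how realistic a (visual feature, class attributes) pair looks."""
    def __init__(self, attr_dim, feat_dim, hidden=1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim + attr_dim, hidden), nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1),
        )

    def forward(self, feats, attrs):
        return self.net(torch.cat([feats, attrs], dim=1))

def synthesize_unseen_features(generator, unseen_attrs, per_class=100, noise_dim=128):
    """Generate synthetic features per unseen class to train a final classifier."""
    feats, labels = [], []
    with torch.no_grad():
        for idx, attr in enumerate(unseen_attrs):
            noise = torch.randn(per_class, noise_dim)
            attrs = attr.unsqueeze(0).expand(per_class, -1)
            feats.append(generator(noise, attrs))
            labels.append(torch.full((per_class,), idx, dtype=torch.long))
    return torch.cat(feats), torch.cat(labels)
```

In both sketches the "seen classes" are whatever subset the training protocol provides, which is exactly the choice DiRaC-I aims to make intelligently.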