Improving Zero Shot Learning Baselines with Commonsense Knowledge
- URL: http://arxiv.org/abs/2012.06236v1
- Date: Fri, 11 Dec 2020 10:52:04 GMT
- Title: Improving Zero Shot Learning Baselines with Commonsense Knowledge
- Authors: Abhinaba Roy, Deepanway Ghosal, Erik Cambria, Navonil Majumder, Rada
Mihalcea, Soujanya Poria
- Abstract summary: We take advantage of explicit relations between nodes defined in ConceptNet, a commonsense knowledge graph, to generate commonsense embeddings of the class labels.
Our experiments performed on three standard benchmark datasets surpass the strong baselines when we fuse our commonsense embeddings with existing semantic embeddings.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Zero-shot learning -- the problem of training and testing on completely
disjoint sets of classes -- relies greatly on its ability to transfer knowledge
from train classes to test classes. Traditionally, semantic embeddings
consisting of human-defined attributes (HA) or distributed word embeddings
(DWE) are used to facilitate this transfer by improving the association between
visual and semantic embeddings. In this paper, we take advantage of explicit
relations between nodes defined in ConceptNet, a commonsense knowledge graph,
to generate commonsense embeddings of the class labels by using a graph
convolution network-based autoencoder. Our experiments on three
standard benchmark datasets surpass the strong baselines when we fuse our
commonsense embeddings with existing semantic embeddings, i.e., HA and DWE.
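The pipeline the abstract describes can be sketched end to end. Assuming a two-layer GCN encoder with an inner-product decoder as a stand-in for the paper's graph convolution network-based autoencoder, and simple concatenation as the fusion step (both are illustrative guesses, not the paper's exact method), a minimal forward pass looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^-1/2 (A + I) D^-1/2
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(A_norm, H, W):
    # One graph convolution: aggregate neighbors, project, ReLU
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy ConceptNet-style subgraph over 4 class labels (hypothetical edges)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = rng.standard_normal((4, 8))          # initial node features
W1 = rng.standard_normal((8, 16)) * 0.1  # untrained weights, for shape only
W2 = rng.standard_normal((16, 5)) * 0.1

A_norm = normalize_adj(A)
Z = gcn_layer(A_norm, gcn_layer(A_norm, X, W1), W2)  # commonsense embeddings

# Decoder: reconstruct adjacency from inner products (trained in practice,
# so that Z preserves the graph's relational structure)
A_rec = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))

# Fuse with existing semantic embeddings (HA or DWE) by concatenation
dwe = rng.standard_normal((4, 6))        # stand-in for word embeddings
fused = np.concatenate([Z, dwe], axis=1)
print(fused.shape)  # (4, 11)
```

In practice the encoder/decoder weights would be trained to minimize the reconstruction error between `A_rec` and `A`, and the fused vectors would replace the plain HA/DWE embeddings in the zero-shot classifier.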
Related papers
- Description-Enhanced Label Embedding Contrastive Learning for Text
Classification [65.01077813330559]
The authors incorporate Self-Supervised Learning (SSL) into the model learning process and design a novel self-supervised Relation of Relation (R2) classification task.
They propose a Relation of Relation Learning Network (R2-Net) for text classification, in which text classification and R2 classification are treated as optimization targets.
External knowledge from WordNet is used to obtain multi-aspect descriptions for label semantic learning.
arXiv Detail & Related papers (2023-06-15T02:19:34Z)
- Supervised Knowledge May Hurt Novel Class Discovery Performance [13.31397670697559]
Novel class discovery (NCD) aims to infer novel categories in an unlabeled dataset by leveraging prior knowledge of a labeled set comprising disjoint but related classes.
This paper considers the question: Is supervised knowledge always helpful at different levels of semantic relevance?
arXiv Detail & Related papers (2023-06-06T13:04:05Z)
- CODER: Coupled Diversity-Sensitive Momentum Contrastive Learning for
Image-Text Retrieval [108.48540976175457]
We propose Coupled Diversity-Sensitive Momentum Contrastive Learning (CODER) for improving cross-modal representation.
We introduce dynamic dictionaries for both modalities to enlarge the scale of image-text pairs, and diversity-sensitiveness is achieved by adaptive negative pair weighting.
Experiments conducted on two popular benchmarks, i.e., MSCOCO and Flickr30K, show that CODER remarkably outperforms the state-of-the-art approaches.
arXiv Detail & Related papers (2022-08-21T08:37:50Z)
- An Adversarial Transfer Network for Knowledge Representation Learning [11.013390624382257]
We propose an adversarial embedding transfer network ATransN, which transfers knowledge from one or more teacher knowledge graphs to a target one.
Specifically, we add soft constraints on aligned entity pairs and neighbours to the existing knowledge representation learning methods.
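The "soft constraints on aligned entity pairs" can be illustrated as a confidence-weighted penalty on the distance between teacher and student embeddings of aligned entities. The function name, the squared-distance form, and the numbers below are hypothetical, not taken from the ATransN paper:

```python
import numpy as np

def soft_alignment_loss(teacher_emb, student_emb, weights):
    # Penalize distance between embeddings of aligned entity pairs,
    # weighted by alignment confidence (the "soft" constraint)
    diffs = teacher_emb - student_emb
    return float(np.sum(weights * np.sum(diffs ** 2, axis=1)))

# Two aligned entity pairs with different alignment confidences (toy data)
teacher = np.array([[1.0, 0.0], [0.0, 1.0]])
student = np.array([[0.9, 0.1], [0.2, 0.8]])
conf = np.array([1.0, 0.5])

loss = soft_alignment_loss(teacher, student, conf)
print(round(loss, 4))  # 0.06
```

A low-confidence alignment (second pair) contributes less to the loss, so noisy cross-graph alignments do not dominate training.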
arXiv Detail & Related papers (2021-04-30T05:07:25Z)
- Improved Biomedical Word Embeddings in the Transformer Era [2.978663539080876]
We learn word and concept embeddings by first using the skip-gram method and further fine-tuning them with correlational information.
We conduct evaluations of these tuned static embeddings using multiple datasets for word relatedness developed by previous efforts.
arXiv Detail & Related papers (2020-12-22T03:03:50Z)
- ALICE: Active Learning with Contrastive Natural Language Explanations [69.03658685761538]
We propose Active Learning with Contrastive Explanations (ALICE) to improve data efficiency in learning.
ALICE learns to first use active learning to select the most informative pairs of label classes to elicit contrastive natural language explanations.
It then extracts knowledge from these explanations using a semantic parser.
arXiv Detail & Related papers (2020-09-22T01:02:07Z)
- Knowledge-Guided Multi-Label Few-Shot Learning for General Image
Recognition [75.44233392355711]
KGGR framework exploits prior knowledge of statistical label correlations with deep neural networks.
It first builds a structured knowledge graph to correlate different labels based on statistical label co-occurrence.
Then, it introduces the label semantics to guide learning semantic-specific features.
It exploits a graph propagation network to explore graph node interactions.
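The two KGGR steps summarized above, building a label graph from statistical co-occurrence and propagating label semantics over it, can be sketched as follows. The conditional-probability adjacency, the mixing coefficient `alpha`, and the toy annotation matrix are assumptions for illustration, not details from the paper:

```python
import numpy as np

# Hypothetical multi-label annotations: rows = images, cols = labels
Y = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]], dtype=float)

# Statistical co-occurrence: estimate P(label j | label i) from counts
counts = Y.T @ Y                          # pairwise co-occurrence counts
A = counts / np.diag(counts)[:, None]     # row-normalize by label frequency
np.fill_diagonal(A, 0.0)                  # drop self-edges

# One graph-propagation step: each label's features are mixed with those
# of its co-occurring neighbors
H = np.eye(3)                             # stand-in label semantic features
alpha = 0.5
H_prop = (1 - alpha) * H + alpha * (A @ H)
print(H_prop.shape)  # (3, 3)
```

Stacking several such propagation steps (with learned weights) is what lets rare labels borrow evidence from frequently co-occurring common labels in the few-shot setting.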
arXiv Detail & Related papers (2020-09-20T15:05:29Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge
Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to free this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.