Semantic Enhanced Knowledge Graph for Large-Scale Zero-Shot Learning
- URL: http://arxiv.org/abs/2212.13151v1
- Date: Mon, 26 Dec 2022 13:18:36 GMT
- Title: Semantic Enhanced Knowledge Graph for Large-Scale Zero-Shot Learning
- Authors: Jiwei Wei, Yang Yang, Zeyu Ma, Jingjing Li, Xing Xu, Heng Tao Shen
- Abstract summary: We provide a new semantic enhanced knowledge graph that contains both expert knowledge and category-level semantic correlations.
To propagate information on the knowledge graph, we propose a novel Residual Graph Convolutional Network (ResGCN)
Experiments conducted on the widely used large-scale ImageNet-21K dataset and AWA2 dataset show the effectiveness of our method.
- Score: 74.6485604326913
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Zero-Shot Learning has been a highlighted research topic in both the vision and
language areas. Recently, most existing methods adopt structured knowledge
information to model explicit correlations among categories and use deep graph
convolutional networks to propagate information between different categories.
However, it is difficult to add new categories to an existing structured knowledge
graph, and deep graph convolutional networks suffer from the over-smoothing
problem. In this paper, we provide a new semantic enhanced knowledge graph that
contains both expert knowledge and category-level semantic correlations. Our
semantic enhanced knowledge graph further strengthens the correlations among
categories and makes it easy to absorb new categories. To propagate information
on the knowledge graph, we propose a novel Residual Graph Convolutional Network
(ResGCN), which can effectively alleviate the over-smoothing problem.
Experiments conducted on the widely used large-scale ImageNet-21K dataset and
the AWA2 dataset show the effectiveness of our method and establish a new
state of the art on zero-shot learning. Moreover, our results on the
large-scale ImageNet-21K dataset with various feature extraction networks show that our
method has better generalization and robustness.
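To illustrate the residual propagation idea behind ResGCN, here is a minimal NumPy sketch of a residual graph convolutional layer. The symmetric adjacency normalization, the shared weight matrix, and the ReLU nonlinearity are illustrative assumptions, not the paper's exact architecture; the key point shown is the skip connection H + σ(ÂHW), which lets deep stacks of layers retain per-node features instead of collapsing them toward a common value (over-smoothing).

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def res_gcn_layer(H, A_norm, W):
    # Residual graph convolution: propagated features are added back to the input
    return H + np.maximum(A_norm @ H @ W, 0.0)  # ReLU nonlinearity

# Toy 3-node path graph with 4-dimensional node features
np.random.seed(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.random.randn(3, 4)
W = np.random.randn(4, 4)

A_norm = normalize_adj(A)
H_out = H
for _ in range(8):  # deep stacking; the residual term keeps node features distinct
    H_out = res_gcn_layer(H_out, A_norm, W)
print(H_out.shape)  # node features keep their shape through every layer
```

Without the residual term, repeatedly applying `A_norm @ H` drives all rows of `H` toward the dominant eigenvector of the normalized adjacency, which is the over-smoothing effect the abstract describes.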
Related papers
- Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D$^2$PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure, but also on a global graph that encodes global semantic similarities.
arXiv Detail & Related papers (2023-05-29T04:51:09Z)
- A Comprehensive Survey on Deep Graph Representation Learning [26.24869157855632]
Graph representation learning aims to encode high-dimensional sparse graph-structured data into low-dimensional dense vectors.
Traditional methods have limited model capacity which limits the learning performance.
Deep graph representation learning has shown great potential and advantages over shallow (traditional) methods.
arXiv Detail & Related papers (2023-04-11T08:23:52Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- InfoGCL: Information-Aware Graph Contrastive Learning [26.683911257080304]
We study how graph information is transformed and transferred during the contrastive learning process.
We propose an information-aware graph contrastive learning framework called InfoGCL.
We show for the first time that all recent graph contrastive learning methods can be unified by our framework.
arXiv Detail & Related papers (2021-10-28T21:10:39Z)
- Graph Representation Learning by Ensemble Aggregating Subgraphs via Mutual Information Maximization [5.419711903307341]
We introduce a self-supervised learning method to enhance the graph-level representations learned by Graph Neural Networks.
To get a comprehensive understanding of the graph structure, we propose an ensemble-learning-like subgraph method.
To achieve efficient and effective contrastive learning, a Head-Tail contrastive sample construction method is proposed.
arXiv Detail & Related papers (2021-03-24T12:06:12Z)
- Model-Agnostic Graph Regularization for Few-Shot Learning [60.64531995451357]
We present a comprehensive study on graph embedded few-shot learning.
We introduce a graph regularization approach that allows a deeper understanding of the impact of incorporating graph information between labels.
Our approach improves the performance of strong base learners by up to 2% on Mini-ImageNet and 6.7% on ImageNet-FS.
arXiv Detail & Related papers (2021-02-14T05:28:13Z)
- Multi-Level Graph Convolutional Network with Automatic Graph Learning for Hyperspectral Image Classification [63.56018768401328]
We propose a Multi-level Graph Convolutional Network (GCN) with Automatic Graph Learning method (MGCN-AGL) for HSI classification.
By employing an attention mechanism to characterize the importance among spatially neighboring regions, the most relevant information can be adaptively incorporated to make decisions.
Our MGCN-AGL encodes the long range dependencies among image regions based on the expressive representations that have been produced at local level.
arXiv Detail & Related papers (2020-09-19T09:26:20Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.