Understanding Graph Convolutional Networks for Text Classification
- URL: http://arxiv.org/abs/2203.16060v1
- Date: Wed, 30 Mar 2022 05:14:31 GMT
- Title: Understanding Graph Convolutional Networks for Text Classification
- Authors: Soyeon Caren Han, Zihan Yuan, Kunze Wang, Siqu Long, Josiah Poon
- Abstract summary: We conduct a comprehensive analysis of the role of node and edge embeddings in a graph and its GCN learning techniques in text classification.
Our analysis is the first of its kind and provides useful insights into the importance of each graph node/edge construction mechanism.
- Score: 9.495731689143827
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Convolutional Networks (GCN) have been effective at tasks that have
rich relational structure and can preserve global structure information of a
dataset in graph embeddings. Recently, many researchers focused on examining
whether GCNs could handle different Natural Language Processing tasks,
especially text classification. While applying GCNs to text classification is
well-studied, its graph construction techniques, such as node/edge selection
and their feature representation, and the optimal GCN learning mechanism in
text classification is rather neglected. In this paper, we conduct a
comprehensive analysis of the role of node and edge embeddings in a graph and
its GCN learning techniques in text classification. Our analysis is the first
of its kind and provides useful insights into the importance of each graph
node/edge construction mechanism when applied to GCN training/testing on
different text classification benchmarks, as well as in semi-supervised
settings.
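The graph construction choices analysed in the paper are easiest to see in code. The following is a minimal sketch, not the authors' implementation: assuming a Text GCN-style corpus graph with document and word nodes, document-word edges weighted by TF-IDF, and word-word edges weighted by positive PMI, it runs a two-layer GCN forward pass; the toy corpus, hidden size, and whole-document co-occurrence window are illustrative assumptions.

    import math
    from collections import Counter
    from itertools import combinations

    import numpy as np

    # Toy corpus; a real run would use a benchmark such as R8, MR, or Ohsumed.
    docs = ["graph networks classify text",
            "text classification with graphs",
            "convolutional networks learn features"]
    tokens = [d.split() for d in docs]
    vocab = sorted({w for t in tokens for w in t})
    w2i = {w: i for i, w in enumerate(vocab)}

    n_docs, n_words = len(docs), len(vocab)
    n = n_docs + n_words                       # document nodes first, then word nodes
    A = np.eye(n)                              # adjacency with self-loops

    # Document-word edges weighted by TF-IDF.
    df = Counter(w for t in tokens for w in set(t))
    for d, t in enumerate(tokens):
        for w, c in Counter(t).items():
            weight = (c / len(t)) * math.log(n_docs / df[w])
            A[d, n_docs + w2i[w]] = A[n_docs + w2i[w], d] = weight

    # Word-word edges weighted by positive PMI (here the "window" is the whole document).
    windows = [set(t) for t in tokens]
    p_w = Counter(w for win in windows for w in win)
    p_ww = Counter(frozenset(p) for win in windows for p in combinations(sorted(win), 2))
    for pair, c in p_ww.items():
        w1, w2 = tuple(pair)
        pmi = math.log(c * len(windows) / (p_w[w1] * p_w[w2]))
        if pmi > 0:
            i, j = n_docs + w2i[w1], n_docs + w2i[w2]
            A[i, j] = A[j, i] = pmi

    # Symmetric normalisation D^{-1/2} A D^{-1/2}, then a two-layer GCN forward pass
    # with identity (one-hot) node features.
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    A_hat = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    rng = np.random.default_rng(0)
    W0, W1 = rng.normal(0, 0.1, (n, 16)), rng.normal(0, 0.1, (16, 2))  # 2 illustrative classes
    H = np.maximum(A_hat @ np.eye(n) @ W0, 0)   # layer 1 + ReLU
    logits = A_hat @ H @ W1                     # rows 0..n_docs-1 are document predictions

In the transductive, semi-supervised setting mentioned in the abstract, training would minimise cross-entropy over the labelled document rows only, with the remaining document and word nodes influencing the result purely through the normalised adjacency.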
Related papers
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Learnable Structural Semantic Readout for Graph Classification [23.78861906423389]
We propose structural semantic readout (SSRead) to summarize the node representations at the position-level.
SSRead aims to identify structurally-meaningful positions by using the semantic alignment between its nodes and structural prototypes.
Our experimental results demonstrate that SSRead significantly improves the classification performance and interpretability of GNN classifiers.
arXiv Detail & Related papers (2021-11-22T20:44:27Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- Hierarchical Graph Capsule Network [78.4325268572233]
We propose hierarchical graph capsule network (HGCN) that can jointly learn node embeddings and extract graph hierarchies.
To learn the hierarchical representation, HGCN characterizes the part-whole relationship between lower-level capsules (part) and higher-level capsules (whole).
arXiv Detail & Related papers (2020-12-16T04:13:26Z)
- Multi-Level Graph Convolutional Network with Automatic Graph Learning for Hyperspectral Image Classification [63.56018768401328]
We propose a Multi-level Graph Convolutional Network (GCN) with Automatic Graph Learning method (MGCN-AGL) for HSI classification.
By employing an attention mechanism to characterize the importance of spatially neighboring regions, the most relevant information can be adaptively incorporated into decision making.
Our MGCN-AGL encodes long-range dependencies among image regions based on the expressive representations produced at the local level.
arXiv Detail & Related papers (2020-09-19T09:26:20Z)
- AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose adaptive multi-channel graph convolutional networks for semi-supervised classification (AM-GCN).
Our experiments show that AM-GCN extracts the most correlated information from both node features and topological structures, substantially improving classification accuracy.
arXiv Detail & Related papers (2020-07-05T08:16:03Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Knowledge Embedding Based Graph Convolutional Network [35.35776808660919]
This paper proposes a novel framework, namely the Knowledge Embedding based Graph Convolutional Network (KE-GCN).
KE-GCN combines the power of Graph Convolutional Network (GCN) in graph-based belief propagation and the strengths of advanced knowledge embedding methods.
Our theoretical analysis shows that KE-GCN offers an elegant unification of several well-known GCN methods as specific cases.
arXiv Detail & Related papers (2020-06-12T17:12:51Z)
- Every Document Owns Its Structure: Inductive Text Classification via Graph Neural Networks [22.91359631452695]
We propose TextING for inductive text classification via Graph Neural Networks (GNN).
We first build individual graphs for each document and then use GNN to learn the fine-grained word representations based on their local structures.
Our method outperforms state-of-the-art text classification methods; a per-document graph construction in this spirit is sketched after this list.
arXiv Detail & Related papers (2020-04-22T07:23:47Z)
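For contrast with the corpus-level graph sketched under the abstract above, the inductive construction described in the TextING entry builds one small word graph per document, so documents unseen during training can still be classified. The sketch below is an illustrative simplification rather than the published architecture: the sliding-window size, random stand-in word vectors, single mean-aggregation step, and mean/max readout are all assumptions.

    import numpy as np

    def doc_graph(tokens, window=3):
        """Word co-occurrence graph for a single document (one node per unique word)."""
        vocab = sorted(set(tokens))
        w2i = {w: i for i, w in enumerate(vocab)}
        A = np.eye(len(vocab))
        for i, w in enumerate(tokens):
            for j in range(i + 1, min(i + window, len(tokens))):
                A[w2i[w], w2i[tokens[j]]] = A[w2i[tokens[j]], w2i[w]] = 1.0
        return vocab, A / A.sum(axis=1, keepdims=True)       # row-normalised adjacency

    def classify(tokens, emb, W, window=3):
        """One mean-aggregation message-passing step, mean+max readout, linear head."""
        vocab, A_norm = doc_graph(tokens, window)
        X = np.stack([emb[w] for w in vocab])                # word vectors (random stand-ins here)
        H = np.maximum(A_norm @ X, 0)                        # aggregate neighbours + ReLU
        g = np.concatenate([H.mean(axis=0), H.max(axis=0)])  # graph-level readout
        return g @ W                                         # class logits

    rng = np.random.default_rng(0)
    words = "graph networks classify text even for documents unseen during training".split()
    emb = {w: rng.normal(size=8) for w in words}
    W = rng.normal(size=(16, 2))                             # 2 illustrative classes
    print(classify("documents unseen during training".split(), emb, W))

Because the graph and readout depend only on the document itself, no corpus-level adjacency needs to be rebuilt when new documents arrive, which is what makes this family of methods inductive.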
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.