ME-GCN: Multi-dimensional Edge-Embedded Graph Convolutional Networks for
Semi-supervised Text Classification
- URL: http://arxiv.org/abs/2204.04618v1
- Date: Sun, 10 Apr 2022 07:05:12 GMT
- Authors: Kunze Wang, Soyeon Caren Han, Siqu Long, Josiah Poon
- Abstract summary: This paper introduces the ME-GCN (Multi-dimensional Edge-enhanced Graph Convolutional Networks) for semi-supervised text classification.
Our proposed model has significantly outperformed the state-of-the-art methods across eight benchmark datasets.
- Score: 6.196387205547024
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Compared to sequential learning models, graph-based neural networks exhibit
excellent ability in capturing global information and have been used for
semi-supervised learning tasks. Most Graph Convolutional Networks are designed
with single-dimensional edge features and fail to utilise the rich edge
information in graphs. This paper introduces the ME-GCN (Multi-dimensional
Edge-enhanced Graph Convolutional Networks) for semi-supervised text
classification. A text graph for an entire corpus is first constructed to
describe the undirected, multi-dimensional word-to-word, document-to-document,
and word-to-document relationships. The graph is initialised with
corpus-trained multi-dimensional word and document node representations, and the
relations are represented according to the distance between those word/document
nodes. Then, the generated graph is trained with ME-GCN, which considers the
edge features as multi-stream signals, and each stream performs a separate
graph convolutional operation. Our ME-GCN can integrate a rich source of graph
edge information of the entire text corpus. The results have demonstrated that
our proposed model has significantly outperformed the state-of-the-art methods
across eight benchmark datasets.
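The multi-stream convolution described in the abstract can be sketched as follows. This is an illustrative assumption of the idea, not the paper's exact formulation: the function name, the ReLU non-linearity, the mean pooling across streams, and all shapes are hypothetical.

```python
import numpy as np

def multi_stream_gcn_layer(adj_stack, h, weights):
    """Hypothetical sketch of one multi-stream GCN layer.

    adj_stack: (T, N, N) array, one normalised adjacency per edge dimension.
    h:         (N, F_in) node feature matrix.
    weights:   (T, F_in, F_out), one weight matrix per stream.
    Each stream t performs its own graph convolution relu(A_t @ H @ W_t);
    the T stream outputs are then pooled (here: element-wise mean).
    """
    streams = [np.maximum(adj_stack[t] @ h @ weights[t], 0.0)
               for t in range(adj_stack.shape[0])]
    return np.mean(streams, axis=0)  # pool streams into one representation

# toy example: 4 nodes, 3 edge dimensions, 2 -> 2 features
rng = np.random.default_rng(0)
A = rng.random((3, 4, 4))
A = A / A.sum(axis=-1, keepdims=True)  # row-normalise each adjacency
H = rng.random((4, 2))
W = rng.random((3, 2, 2))
out = multi_stream_gcn_layer(A, H, W)
print(out.shape)  # (4, 2)
```

In this reading, stacking such layers lets each edge dimension propagate information independently before the streams are merged, which is how the multi-dimensional edge features avoid being collapsed into a single scalar weight.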
Related papers
- Graph Neural Networks on Discriminative Graphs of Words [19.817473565906777]
In this work, we explore a new Discriminative Graph of Words Graph Neural Network (DGoW-GNN) approach to classify text.
We propose a new model for the graph-based classification of text, which combines a GNN and a sequence model.
We evaluate our approach on seven benchmark datasets and find that it is outperformed by several state-of-the-art baseline models.
arXiv Detail & Related papers (2024-10-27T15:14:06Z) - SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of finetuned LM.
arXiv Detail & Related papers (2023-08-03T07:00:04Z) - Word Grounded Graph Convolutional Network [24.6338889954789]
Graph Convolutional Networks (GCNs) have shown strong performance in learning text representations for various tasks such as text classification.
We propose to transform the document graph into a word graph, to decouple data samples and a GCN model by using a document-independent graph.
The proposed Word-level Graph (WGraph) can not only implicitly learn word representations from commonly-used word co-occurrences in corpora, but also incorporate extra global semantic dependencies.
arXiv Detail & Related papers (2023-05-10T19:56:55Z) - A Robust Stacking Framework for Training Deep Graph Models with
Multifaceted Node Features [61.92791503017341]
Graph Neural Networks (GNNs) with numerical node features and graph structure as inputs have demonstrated superior performance on various supervised learning tasks with graph data.
The best models for such data types in most standard supervised learning settings with IID (non-graph) data are not easily incorporated into a GNN.
Here we propose a robust stacking framework that fuses graph-aware propagation with arbitrary models intended for IID data.
arXiv Detail & Related papers (2022-06-16T22:46:33Z) - SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z) - GraphFormers: GNN-nested Transformers for Representation Learning on
Textual Graph [53.70520466556453]
We propose GraphFormers, where layerwise GNN components are nested alongside the transformer blocks of language models.
With the proposed architecture, the text encoding and the graph aggregation are fused into an iterative workflow.
In addition, a progressive learning strategy is introduced, where the model is successively trained on manipulated data and original data to reinforce its capability of integrating information on the graph.
arXiv Detail & Related papers (2021-05-06T12:20:41Z) - Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode a graph's data structure in a high-dimensional, non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
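The node/edge co-embedding idea summarised above might be sketched as an alternating update in which nodes aggregate incident-edge features and edges aggregate endpoint-node features. Everything here is an illustrative assumption (function name, incidence-matrix formulation, ReLU, weight shapes), not CensNet's actual propagation rule.

```python
import numpy as np

def co_embedding_step(inc, h_nodes, h_edges, w_node, w_edge):
    """Hypothetical node/edge co-embedding step.

    inc: (N, E) node-edge incidence matrix of an undirected graph.
    Nodes aggregate the features of their incident edges, and edges
    aggregate the features of their endpoint nodes; each side is then
    passed through its own linear map with a ReLU non-linearity.
    """
    new_nodes = np.maximum(inc @ h_edges @ w_node, 0.0)    # edges -> nodes
    new_edges = np.maximum(inc.T @ h_nodes @ w_edge, 0.0)  # nodes -> edges
    return new_nodes, new_edges

# toy triangle graph: 3 nodes, 3 edges, 2-dimensional features on both
inc = np.array([[1., 0., 1.],
                [1., 1., 0.],
                [0., 1., 1.]])
rng = np.random.default_rng(1)
h_n, h_e = rng.random((3, 2)), rng.random((3, 2))
w_n, w_e = rng.random((2, 2)), rng.random((2, 2))
h_n, h_e = co_embedding_step(inc, h_n, h_e, w_n, w_e)
print(h_n.shape, h_e.shape)  # (3, 2) (3, 2)
```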
arXiv Detail & Related papers (2020-10-25T22:39:31Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z) - Heterogeneous Graph Neural Networks for Extractive Document
Summarization [101.17980994606836]
Cross-sentence relations are a crucial step in extractive document summarization.
We present a graph-based neural network for extractive summarization (HeterSumGraph).
We introduce different types of nodes into graph-based neural networks for extractive document summarization.
arXiv Detail & Related papers (2020-04-26T14:38:11Z) - Tensor Graph Convolutional Networks for Text Classification [17.21683037822181]
Graph-based neural networks exhibit some excellent properties, such as the ability to capture global information.
In this paper, we investigate graph-based neural networks for text classification problem.
arXiv Detail & Related papers (2020-01-12T14:28:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.