Graph Neural Networks for Text Classification: A Survey
- URL: http://arxiv.org/abs/2304.11534v3
- Date: Fri, 5 Jul 2024 12:06:26 GMT
- Title: Graph Neural Networks for Text Classification: A Survey
- Authors: Kunze Wang, Yihao Ding, Soyeon Caren Han
- Abstract summary: Graph neural network-based models can deal with complex structured text data and exploit global information.
We bring the coverage of methods up to 2023, including corpus-level and document-level graph neural networks.
Beyond the technological survey, we examine the underlying issues and future directions of text classification using graph neural networks.
- Score: 8.414181339242706
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Text classification is a fundamental problem in Natural Language Processing. While numerous recent text classification models apply sequential deep learning techniques, graph neural network-based models can directly deal with complex structured text data and exploit global information. Many real-world text classification applications can be naturally cast as a graph that captures word, document, and corpus-level features. In this survey, we bring the coverage of methods up to 2023, including corpus-level and document-level graph neural networks. We discuss each of these methods in detail, covering the graph construction mechanisms and the graph-based learning process. Beyond the technological survey, we examine the underlying issues and future directions of text classification using graph neural networks. We also cover datasets, evaluation metrics, and experiment design, and present a summary of published performance on the publicly available benchmarks. Finally, we present a comprehensive comparison between different techniques and identify the pros and cons of various evaluation metrics.
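The abstract distinguishes corpus-level from document-level graph construction without spelling out a concrete recipe, so the following is a minimal, hypothetical sketch of a corpus-level construction in the TextGCN style: document and word nodes share one graph, document-word edges are weighted by TF-IDF, and word-word edges by positive PMI over sliding windows. The function name `build_corpus_graph`, the window size, and the weighting choices are illustrative assumptions, not the survey's own method.

```python
import math
from collections import Counter

import numpy as np

def build_corpus_graph(docs, window=10):
    """Hypothetical corpus-level text graph (TextGCN-style):
    nodes = all documents followed by all vocabulary words,
    doc-word edges weighted by TF-IDF,
    word-word edges weighted by positive PMI over sliding windows."""
    vocab = sorted({w for d in docs for w in d.split()})
    w2i = {w: i for i, w in enumerate(vocab)}
    n_docs = len(docs)
    n = n_docs + len(vocab)
    adj = np.eye(n)                                   # self-loops

    # doc-word edges: TF-IDF
    df = Counter(w for d in docs for w in set(d.split()))
    for di, d in enumerate(docs):
        tokens = d.split()
        for w, c in Counter(tokens).items():
            idf = math.log(n_docs / df[w])
            wi = n_docs + w2i[w]
            adj[di, wi] = adj[wi, di] = (c / len(tokens)) * idf

    # word-word edges: positive PMI over sliding windows
    win_count, pair_count, total = Counter(), Counter(), 0
    for d in docs:
        tokens = d.split()
        for s in range(max(1, len(tokens) - window + 1)):
            win = set(tokens[s:s + window])
            total += 1
            win_count.update(win)
            pair_count.update({(a, b) for a in win for b in win if a < b})
    for (a, b), c in pair_count.items():
        pmi = math.log(c * total / (win_count[a] * win_count[b]))
        if pmi > 0:
            i, j = n_docs + w2i[a], n_docs + w2i[b]
            adj[i, j] = adj[j, i] = pmi
    return adj, w2i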
Related papers
- Graph Neural Networks on Discriminative Graphs of Words [19.817473565906777]
In this work, we explore a new Discriminative Graph of Words Graph Neural Network (DGoW-GNN) approach to classify text.
We propose a new model for the graph-based classification of text, which combines a GNN and a sequence model.
We evaluate our approach on seven benchmark datasets and find that it is outperformed by several state-of-the-art baseline models.
arXiv Detail & Related papers (2024-10-27T15:14:06Z) - Pre-Training and Prompting for Few-Shot Node Classification on Text-Attributed Graphs [35.44563283531432]
Text-attributed graph (TAG) is one kind of important real-world graph-structured data with each node associated with raw texts.
For TAGs, traditional few-shot node classification methods directly conduct training on the pre-processed node features.
We propose P2TAG, a framework designed for few-shot node classification on TAGs with graph pre-training and prompting.
arXiv Detail & Related papers (2024-07-22T07:24:21Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - A semantic hierarchical graph neural network for text classification [1.439766998338892]
We propose a new hierarchical graph neural network (HieGNN) which extracts information at the word, sentence, and document levels respectively.
Experiments on several benchmark datasets show better or comparable results relative to several baseline methods.
arXiv Detail & Related papers (2022-09-15T03:59:31Z) - Hierarchical Heterogeneous Graph Representation Learning for Short Text Classification [60.233529926965836]
We propose a new method called SHINE, based on graph neural networks (GNNs), for short text classification.
First, we model the short text dataset as a hierarchical heterogeneous graph consisting of word-level component graphs.
Then, we dynamically learn a short document graph that facilitates effective label propagation among similar short texts.
arXiv Detail & Related papers (2021-10-30T05:33:05Z) - GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph [53.70520466556453]
We propose GraphFormers, where layerwise GNN components are nested alongside the transformer blocks of language models.
With the proposed architecture, text encoding and graph aggregation are fused into an iterative workflow; a minimal layer sketch in this spirit appears after the related-papers list.
In addition, a progressive learning strategy is introduced, where the model is successively trained on manipulated data and original data to reinforce its capability of integrating information on the graph.
arXiv Detail & Related papers (2021-05-06T12:20:41Z) - Be More with Less: Hypergraph Attention Networks for Inductive Text Classification [56.98218530073927]
Graph neural networks (GNNs) have received increasing attention in the research community and demonstrated promising results on this canonical task of text classification.
Despite this success, their performance can be largely jeopardized in practice because they are unable to capture high-order interactions between words.
We propose hypergraph attention networks (HyperGAT), a principled model that obtains more expressive power with less computational cost for text representation learning.
arXiv Detail & Related papers (2020-11-01T00:21:59Z) - A Survey on Text Classification: From Shallow to Deep Learning [83.47804123133719]
The last decade has seen a surge of research in this area due to the unprecedented success of deep learning.
This paper fills the gap by reviewing the state-of-the-art approaches from 1961 to 2021.
We create a taxonomy for text classification according to the text involved and the models used for feature extraction and classification.
arXiv Detail & Related papers (2020-08-02T00:09:03Z) - Heterogeneous Graph Neural Networks for Extractive Document Summarization [101.17980994606836]
Modeling cross-sentence relations is a crucial step in extractive document summarization.
We present HeterSumGraph, a graph-based neural network for extractive summarization.
We introduce different types of nodes into graph-based neural networks for extractive document summarization.
arXiv Detail & Related papers (2020-04-26T14:38:11Z) - Tensor Graph Convolutional Networks for Text Classification [17.21683037822181]
Graph-based neural networks exhibit some excellent properties, such as the ability to capture global information.
In this paper, we investigate graph-based neural networks for the text classification problem.
arXiv Detail & Related papers (2020-01-12T14:28:33Z)
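To make the GNN-nested idea in the GraphFormers entry above more concrete, here is a minimal, hypothetical layer sketch: each node (a text) is encoded by a standard transformer block, its summary token is mixed with a mean over graph neighbours, and the graph-aware summary is injected back into the token sequence. The mean-pooling aggregation, the `GraphFormerLayer` name, and the use of PyTorch's built-in `nn.TransformerEncoderLayer` are assumptions for illustration, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class GraphFormerLayer(nn.Module):
    """Hypothetical GNN-nested transformer layer: graph aggregation over
    per-node summary tokens is interleaved with a standard transformer block."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.transformer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True)
        self.gnn_proj = nn.Linear(2 * dim, dim)  # fuse self and neighbour mean

    def forward(self, tokens: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # tokens: (num_nodes, seq_len, dim); adj: (num_nodes, num_nodes), 0/1
        cls = tokens[:, 0]                             # per-node summary token
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        neigh = (adj @ cls) / deg                      # mean over graph neighbours
        graph_cls = self.gnn_proj(torch.cat([cls, neigh], dim=-1))
        # inject the graph-aware summary back, then run the transformer block
        tokens = torch.cat([graph_cls.unsqueeze(1), tokens[:, 1:]], dim=1)
        return self.transformer(tokens)

# illustrative usage: 5 linked texts, 16 tokens each, 64-dim embeddings
layer = GraphFormerLayer(dim=64)
tokens = torch.randn(5, 16, 64)
adj = (torch.rand(5, 5) < 0.4).float()
out = layer(tokens, adj)                               # (5, 16, 64)
```

Stacking several such layers yields the iterative text-encoding and graph-aggregation workflow that the entry describes.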
This list is automatically generated from the titles and abstracts of the papers on this site.