Graph Convolutional Network for Swahili News Classification
- URL: http://arxiv.org/abs/2103.09325v1
- Date: Tue, 16 Mar 2021 21:03:47 GMT
- Title: Graph Convolutional Network for Swahili News Classification
- Authors: Alexandros Kastanos and Tyler Martin
- Abstract summary: This work empirically demonstrates the ability of Text Graph Convolutional Network (Text GCN) to outperform traditional natural language processing benchmarks for the task of semi-supervised Swahili news classification.
- Score: 78.6363825307044
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work empirically demonstrates the ability of Text Graph Convolutional
Network (Text GCN) to outperform traditional natural language processing
benchmarks for the task of semi-supervised Swahili news classification. In
particular, we focus our experimentation on the sparsely-labelled
semi-supervised context which is representative of the practical constraints
facing low-resourced African languages. We follow up on this result by
introducing a variant of the Text GCN model which utilises a bag of words
embedding rather than a naive one-hot encoding to reduce the memory footprint
of Text GCN whilst demonstrating similar predictive performance.
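The abstract contrasts one-hot node features (an identity matrix of size equal to the node count) with a cheaper bag-of-words feature matrix fed into GCN propagation. A minimal numpy sketch of one such propagation step is below; the toy graph, feature matrix, and weights are illustrative assumptions, not taken from the paper, and a real Text GCN would weight document-word edges by TF-IDF and word-word edges by PMI.

```python
import numpy as np

def normalise_adjacency(A):
    """Symmetrically normalise A with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(A_norm, X, W):
    """One GCN propagation step: ReLU(A_norm @ X @ W)."""
    return np.maximum(0.0, A_norm @ X @ W)

# Toy heterogeneous text graph: 3 document nodes followed by 2 word nodes.
A = np.array([
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 1, 1, 0, 0],
], dtype=float)

# Bag-of-words features (n_nodes x vocab_size): a dense, much smaller matrix
# than the n_nodes x n_nodes one-hot identity the vanilla model uses.
X = np.array([
    [1, 0],
    [1, 1],
    [0, 1],
    [1, 0],
    [0, 1],
], dtype=float)

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 4))          # learnable projection, here random
H = gcn_layer(normalise_adjacency(A), X, W)
print(H.shape)  # (5, 4)
```

The memory saving claimed in the abstract comes from the feature matrix alone: with vocabulary size V and N total nodes, bag-of-words features cost O(N x V) instead of O(N x N), and V is typically far smaller than N when documents dominate the node set.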
Related papers
- Chinese Financial Text Emotion Mining: GCGTS -- A Character Relationship-based Approach for Simultaneous Aspect-Opinion Pair Extraction [7.484918031250864]
Aspect-Opinion Pair Extraction (AOPE) from Chinese financial texts is a specialized task in fine-grained text sentiment analysis.
Previous studies have mainly focused on developing grid annotation schemes within grid-based models to facilitate this extraction process.
We propose a novel method called the Graph-based Character-level Grid Tagging Scheme (GCGTS).
The GCGTS method explicitly incorporates syntactic structure using Graph Convolutional Networks (GCN) and unifies the encoding of characters within the same semantic unit (the Chinese word level).
arXiv Detail & Related papers (2023-08-04T02:20:56Z)
- Scalable Learning of Latent Language Structure With Logical Offline Cycle Consistency [71.42261918225773]
Conceptually, LOCCO can be viewed as a form of self-learning in which the semantic parser being trained is used to generate annotations for unlabeled text.
As an added bonus, the annotations produced by LOCCO can be trivially repurposed to train a neural text generation model.
arXiv Detail & Related papers (2023-05-31T16:47:20Z)
- ChatGPT Informed Graph Neural Network for Stock Movement Prediction [8.889701868315717]
We introduce a novel framework that leverages ChatGPT's graph inference capabilities to enhance Graph Neural Networks (GNNs).
Our framework adeptly extracts evolving network structures from textual data, and incorporates these networks into graph neural networks for subsequent predictive tasks.
arXiv Detail & Related papers (2023-05-28T21:11:59Z)
- ChatGraph: Interpretable Text Classification by Converting ChatGPT Knowledge to Graphs [54.48467003509595]
ChatGPT has shown superior performance in various natural language processing (NLP) tasks.
We propose a novel framework that leverages the power of ChatGPT for specific tasks, such as text classification.
Our method provides a more transparent decision-making process compared with previous text classification methods.
arXiv Detail & Related papers (2023-05-03T19:57:43Z)
- Multi-Task Text Classification using Graph Convolutional Networks for Large-Scale Low Resource Language [5.197307534263253]
Graph Convolutional Networks (GCN) have achieved state-of-the-art results on single text classification tasks.
Applying GCN for multi-task text classification is an unexplored area.
We study the use of GCN for the Telugu language in single and multi-task settings for four natural language processing (NLP) tasks.
arXiv Detail & Related papers (2022-05-02T20:44:12Z)
- TextRGNN: Residual Graph Neural Networks for Text Classification [13.912147013558846]
TextRGNN is an improved GNN architecture that introduces residual connections so the convolutional network can be made deeper.
The structure obtains a wider node receptive field while effectively suppressing the over-smoothing of node features.
It significantly improves classification accuracy at both the corpus level and the text level, achieving SOTA performance on a wide range of text classification datasets.
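The residual idea summarised above can be sketched with a skip connection added to a plain GCN layer; the toy adjacency, sizes, and weights below are illustrative assumptions, not the TextRGNN architecture itself. Without the `+ H` term, stacking many averaging layers would drive all node representations toward the same value (over-smoothing).

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_gcn_layer(A_norm, H, W):
    """One GCN layer with a residual (skip) connection.

    H_out = ReLU(A_norm @ H @ W) + H; the skip term carries each node's
    own features forward, so stacking layers does not erase node identity.
    W must be square here for the shapes to match.
    """
    return relu(A_norm @ H @ W) + H

# Worst case for smoothing: a fully-connected, row-normalised adjacency,
# where one plain GCN step already makes every node representation identical.
n, d = 4, 3
A_norm = np.full((n, n), 1.0 / n)
rng = np.random.default_rng(1)
H = rng.normal(size=(n, d))
for _ in range(4):
    W = rng.normal(size=(d, d)) * 0.1
    H = residual_gcn_layer(A_norm, H, W)
print(H.shape)  # (4, 3)
```

After four layers the node representations still differ from one another, because the smoothed term is the same for every node while the residual term preserves the initial per-node differences.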
arXiv Detail & Related papers (2021-12-30T13:48:58Z)
- GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer [119.70961704127157]
Non-parallel text style transfer has attracted increasing research interests in recent years.
Current approaches still lack the ability to preserve the content and even logic of original sentences.
We propose the Graph Transformer based Auto-Encoder (GTAE), which models a sentence as a linguistic graph and performs feature extraction and style transfer at the graph level.
arXiv Detail & Related papers (2021-02-01T11:08:45Z)
- Be More with Less: Hypergraph Attention Networks for Inductive Text Classification [56.98218530073927]
Graph neural networks (GNNs) have received increasing attention in the research community and demonstrated promising results on text classification.
Despite the success, their performance could be largely jeopardized in practice since they are unable to capture high-order interaction between words.
We propose a principled model -- hypergraph attention networks (HyperGAT) which can obtain more expressive power with less computational consumption for text representation learning.
arXiv Detail & Related papers (2020-09-14T02:52:45Z)
- GINet: Graph Interaction Network for Scene Parsing [58.394591509215005]
We propose a Graph Interaction unit (GI unit) and a Semantic Context Loss (SC-loss) to promote context reasoning over image regions.
The proposed GINet outperforms the state-of-the-art approaches on the popular benchmarks, including Pascal-Context and COCO Stuff.
arXiv Detail & Related papers (2020-11-01T00:21:59Z)
- VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification [21.96079052962283]
The VGCN-BERT model combines the capability of BERT with a Vocabulary Graph Convolutional Network (VGCN).
In our experiments on several text classification datasets, our approach outperforms BERT and GCN alone.
arXiv Detail & Related papers (2020-04-12T22:02:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.