Challenges in Pre-Training Graph Neural Networks for Context-Based Fake
News Detection: An Evaluation of Current Strategies and Resource Limitations
- URL: http://arxiv.org/abs/2402.18179v1
- Date: Wed, 28 Feb 2024 09:10:25 GMT
- Authors: Gregor Donabauer and Udo Kruschwitz
- Abstract summary: We propose to apply pre-training of Graph Neural Networks (GNNs) in the domain of context-based fake news detection.
Our experiments provide an evaluation of different pre-training strategies for graph-based misinformation detection.
We argue that a major current issue is the lack of suitable large-scale resources that can be used for pre-training.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Pre-training of neural networks has recently revolutionized the field of
Natural Language Processing (NLP), having previously demonstrated its
effectiveness in computer vision. At the same time, advances in the detection of
fake news have mainly been driven by the context-based paradigm, in which
different types of signals (e.g. from social media) form graph-like structures
that hold contextual information beyond the news article to be classified. We
propose to merge these two developments by applying pre-training of Graph Neural
Networks (GNNs) in the domain of context-based fake news detection. Our
experiments evaluate different pre-training strategies for graph-based
misinformation detection and demonstrate that transfer learning does not
currently lead to significant improvements over training a model from scratch in
this domain. We argue that a major current issue is the lack of suitable
large-scale resources that can be used for pre-training.
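To make the context-based setup concrete, the following is a minimal sketch of fake news detection as graph classification, assuming PyTorch Geometric; the model, layer sizes, and feature layout are illustrative assumptions, not the authors' released code. Each news item becomes one graph whose nodes are the article plus its social-context signals (e.g. users and posts), and the model predicts a label for the whole graph.

```python
# Minimal sketch (illustrative, not the paper's code): context-based fake news
# detection as graph classification with PyTorch Geometric.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool

class ContextGNN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int = 64, num_classes: int = 2):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.classifier = torch.nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, batch):
        # Message passing over the news item's social-context graph.
        h = F.relu(self.conv1(x, edge_index))
        h = F.relu(self.conv2(h, edge_index))
        # Pool node embeddings into one vector per graph, then classify.
        return self.classifier(global_mean_pool(h, batch))
```

In a pre-training setup, the convolution layers would be initialized from a model pre-trained on a large graph corpus rather than randomly; the paper's finding is that this transfer currently brings no significant gain over training from scratch.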
Related papers
- Learning How to Propagate Messages in Graph Neural Networks (2023-10-01)
  This paper studies the problem of learning message propagation strategies for graph neural networks (GNNs).
  It introduces the optimal propagation steps as latent variables to help find the maximum-likelihood estimate of the GNN parameters.
  The proposed framework can effectively learn personalized and interpretable message propagation strategies.
- Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective (2023-06-24)
  Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
  This study investigates the role of graph convolution within the context of feature learning theory.
- Story Point Effort Estimation by Text Level Graph Neural Network (2022-03-06)
  Graph neural networks are a new approach that has been applied in Natural Language Processing for text classification.
  The paper shows the potential and possible challenges of graph neural network text classification for story point estimation.
- Embedding Graph Convolutional Networks in Recurrent Neural Networks for Predictive Monitoring (2021-12-17)
  This paper proposes an approach based on graph convolutional networks and recurrent neural networks.
  An experimental evaluation on real-life event logs shows that the approach is more consistent than, and outperforms, current state-of-the-art approaches.
- Towards Open-World Feature Extrapolation: An Inductive Graph Learning Approach (2021-10-09)
  The authors propose a new learning paradigm based on graph representation and learning.
  The framework contains two modules: 1) a backbone network (e.g., a feedforward neural net) as a lower model that takes features as input and outputs predicted labels; 2) a graph neural network as an upper model that learns to extrapolate embeddings for new features via message passing over a feature-data graph built from observed data.
- How Neural Processes Improve Graph Link Prediction (2021-09-30)
  The authors propose a meta-learning approach with graph neural networks for link prediction: Neural Processes for Graph Neural Networks (NPGNN).
  NPGNN can perform both transductive and inductive learning tasks and adapt to patterns in a large new graph after training on a small subgraph.
- Overcoming Catastrophic Forgetting in Graph Neural Networks (2020-12-10)
  Catastrophic forgetting refers to the tendency of a neural network to "forget" previously learned knowledge upon learning new tasks.
  The paper proposes a novel scheme dedicated to overcoming this problem and hence strengthening continual learning in graph neural networks (GNNs).
  At the heart of the approach is a generic module termed topology-aware weight preserving (TWP).
- Adversarially-Trained Deep Nets Transfer Better: Illustration on Image Classification (2020-07-11)
  Transfer learning is a powerful methodology for adapting pre-trained deep neural networks on image recognition tasks to new domains.
  This work demonstrates that adversarially trained models transfer better than non-adversarially trained models.
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training (2020-06-17)
  Graph representation learning has emerged as a powerful technique for addressing real-world problems.
  GCC is a self-supervised graph neural network pre-training framework, evaluated on three graph learning tasks and ten graph datasets (see the contrastive-loss sketch after this list).
- Minimax Lower Bounds for Transfer Learning with Linear and One-hidden Layer Neural Networks (2020-06-16)
  The authors develop a statistical minimax framework to characterize the limits of transfer learning.
  They derive a lower bound on the target generalization error achievable by any algorithm as a function of the number of labeled source and target samples.
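As a pointer to what graph pre-training looks like in practice, here is a generic sketch of self-supervised contrastive pre-training in the spirit of GCC; the InfoNCE loss below is a standard formulation over assumed (batch, dim) view embeddings, not GCC's exact implementation.

```python
# Generic contrastive pre-training sketch (in the spirit of GCC, not its exact
# code): two augmented views of the same graph are positives, and all other
# graphs in the batch serve as negatives (InfoNCE objective).
import torch
import torch.nn.functional as F

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.07):
    """z1, z2: (batch, dim) embeddings of two views of the same graphs."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = (z1 @ z2.t()) / tau                           # pairwise similarities
    targets = torch.arange(z1.size(0), device=z1.device)   # positives on diagonal
    return F.cross_entropy(logits, targets)
```

An encoder pre-trained with such an objective can then be fine-tuned on the labelled fake-news graphs; the paper's experiments suggest that, with current resources, this does not significantly outperform training from scratch.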