Knowledge-Preserving Incremental Social Event Detection via
Heterogeneous GNNs
- URL: http://arxiv.org/abs/2101.08747v2
- Date: Sat, 13 Feb 2021 17:49:54 GMT
- Title: Knowledge-Preserving Incremental Social Event Detection via
Heterogeneous GNNs
- Authors: Yuwei Cao, Hao Peng, Jia Wu, Yingtong Dou, Jianxin Li, Philip S. Yu
- Abstract summary: We propose a novel Knowledge-Preserving Incremental Heterogeneous Graph Neural Network (KPGNN) for incremental social event detection.
KPGNN models complex social messages into unified social graphs to facilitate data utilization and explores the expressive power of GNNs for knowledge extraction.
It also leverages the inductive learning ability of GNNs to efficiently detect events and extends its knowledge from previously unseen data.
- Score: 72.09532817958932
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Social events provide valuable insights into group social behaviors and
public concerns and therefore have many applications in fields such as product
recommendation and crisis management. The complexity and streaming nature of
social messages make it appealing to address social event detection in an
incremental learning setting, where acquiring, preserving, and extending
knowledge are major concerns. Most existing methods, including those based on
incremental clustering and community detection, learn limited amounts of
knowledge as they ignore the rich semantics and structural information
contained in social data. Moreover, they cannot memorize previously acquired
knowledge. In this paper, we propose a novel Knowledge-Preserving Incremental
Heterogeneous Graph Neural Network (KPGNN) for incremental social event
detection. To acquire more knowledge, KPGNN models complex social messages into
unified social graphs to facilitate data utilization and explores the
expressive power of GNNs for knowledge extraction. To continuously adapt to the
incoming data, KPGNN adopts contrastive loss terms that cope with a changing
number of event classes. It also leverages the inductive learning ability of
GNNs to efficiently detect events and extends its knowledge from previously
unseen data. To deal with large social streams, KPGNN adopts a mini-batch
subgraph sampling strategy for scalable training, and periodically removes
obsolete data to maintain a dynamic embedding space. KPGNN requires no feature
engineering and has few hyperparameters to tune. Extensive experiment results
demonstrate the superiority of KPGNN over various baselines.
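The abstract highlights two mechanisms that make the incremental setting workable: an inductive GNN encoder that can embed previously unseen messages, and contrastive loss terms that remain well-defined as the number of event classes changes between message blocks. Below is a minimal, hedged sketch of how such a setup can look in plain PyTorch; the layer, the triplet-style loss, and the training step are illustrative assumptions, not KPGNN's actual implementation (which also involves heterogeneous graph construction, mini-batch subgraph sampling, and periodic removal of obsolete data, as described in the paper).

```python
# Illustrative sketch only (assumed components, not the authors' code):
# an inductive GraphSAGE-style layer plus a triplet contrastive loss that
# stays valid for any number of event classes in the current message block.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class MeanSAGELayer(nn.Module):
    """Combine each node with the mean of its sampled neighbors (inductive)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(2 * in_dim, out_dim)

    def forward(self, x, neigh_x):
        # x: [B, in_dim] node features; neigh_x: [B, K, in_dim] sampled neighbors
        h = torch.cat([x, neigh_x.mean(dim=1)], dim=-1)
        return F.relu(self.lin(h))

def triplet_contrastive_loss(z, labels, margin=3.0):
    """Pull embeddings of the same event together and push different events
    apart. Because it only compares pairs, it needs no fixed class count."""
    loss, count = z.new_zeros(()), 0
    for i in range(len(z)):
        pos = [j for j in range(len(z)) if j != i and labels[j] == labels[i]]
        neg = [j for j in range(len(z)) if labels[j] != labels[i]]
        if not pos or not neg:
            continue
        p, n = random.choice(pos), random.choice(neg)
        loss = loss + F.relu((z[i] - z[p]).norm() - (z[i] - z[n]).norm() + margin)
        count += 1
    return loss / count if count else z.sum() * 0.0

def train_step(encoder, optimizer, feats, neigh_feats, labels):
    """One mini-batch over a sampled subgraph of the current message block."""
    encoder.train()
    optimizer.zero_grad()
    z = encoder(feats, neigh_feats)
    loss = triplet_contrastive_loss(z, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the encoder is inductive, newly arriving messages can be embedded and grouped into events without retraining from scratch; maintenance of the embedding space (dropping obsolete messages from the graph) can then be scheduled periodically, as the abstract describes.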
Related papers
- When Graph Neural Network Meets Causality: Opportunities, Methodologies and An Outlook [23.45046265345568]
Graph Neural Networks (GNNs) have emerged as powerful representation learning tools for capturing complex dependencies within diverse graph-structured data.
However, serious concerns have been raised regarding the trustworthiness of GNNs, including susceptibility to distribution shift, biases towards certain populations, and a lack of explainability.
Integrating causal learning techniques into GNNs has sparked numerous ground-breaking studies, since many of these trustworthiness issues can thereby be alleviated.
arXiv Detail & Related papers (2023-12-19T13:26:14Z)
- Unsupervised Social Event Detection via Hybrid Graph Contrastive Learning and Reinforced Incremental Clustering [17.148519270314313]
We propose HCRC, a novel unsupervised social media event detection method based on hybrid graph contrastive learning and reinforced incremental clustering.
We conduct comprehensive experiments to evaluate HCRC on the Twitter and Maven datasets.
arXiv Detail & Related papers (2023-12-08T08:56:59Z)
- Exploring Causal Learning through Graph Neural Networks: An In-depth Review [12.936700685252145]
We introduce a novel taxonomy that encompasses various state-of-the-art GNN methods employed in studying causality.
GNNs are further categorized based on their applications in the causality domain.
This review also touches upon the application of causal learning across diverse sectors.
arXiv Detail & Related papers (2023-11-25T10:46:06Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Knowledge Enhanced Neural Networks for relational domains [83.9217787335878]
We focus on a specific method, KENN, a Neural-Symbolic architecture that injects prior logical knowledge into a neural network.
In this paper, we propose an extension of KENN for relational data.
arXiv Detail & Related papers (2022-05-31T13:00:34Z)
- Evidential Temporal-aware Graph-based Social Event Detection via Dempster-Shafer Theory [76.4580340399321]
We propose ETGNN, a novel Evidential Temporal-aware Graph Neural Network.
We construct view-specific graphs whose nodes are the texts and whose edges are determined, for each view, by a different type of shared element.
Considering the view-specific uncertainty, the representations of all views are converted into mass functions through evidential deep learning (EDL) neural networks.
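For context on the mass functions mentioned above, the following is a minimal sketch of the standard evidential deep learning conversion from a view's class outputs to Dirichlet evidence, belief masses, and an explicit uncertainty mass; the exact network heads and the Dempster-Shafer combination step used by ETGNN are not given in this summary, so this is an assumed, generic formulation rather than the paper's design.

```python
# Generic EDL step (assumed formulation, not necessarily ETGNN's exact design):
# turn a view's K-class outputs into belief masses b_k and an uncertainty mass u.
import torch
import torch.nn.functional as F

def outputs_to_mass(logits: torch.Tensor):
    """logits: [B, K] per-view outputs -> (belief [B, K], uncertainty [B, 1])."""
    evidence = F.relu(logits)                 # non-negative evidence e_k
    alpha = evidence + 1.0                    # Dirichlet parameters alpha_k = e_k + 1
    strength = alpha.sum(-1, keepdim=True)    # Dirichlet strength S
    belief = evidence / strength              # b_k = e_k / S
    uncertainty = logits.size(-1) / strength  # u = K / S, so sum_k b_k + u = 1
    return belief, uncertainty
```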
arXiv Detail & Related papers (2022-05-24T16:22:40Z)
- Neural Networks Enhancement with Logical Knowledge [83.9217787335878]
We propose an extension of KENN for relational data.
The results show that KENN is capable of increasing the performance of the underlying neural network even in the presence of relational data.
arXiv Detail & Related papers (2020-09-13T21:12:20Z)