INFLECT-DGNN: Influencer Prediction with Dynamic Graph Neural Networks
- URL: http://arxiv.org/abs/2307.08131v3
- Date: Tue, 12 Dec 2023 13:36:28 GMT
- Title: INFLECT-DGNN: Influencer Prediction with Dynamic Graph Neural Networks
- Authors: Elena Tiukhova, Emiliano Penaloza, María Óskarsdóttir, Bart
Baesens, Monique Snoeck, Cristián Bravo
- Abstract summary: We introduce INFLECT-DGNN, a new framework for INFLuencer prEdiCTion that combines Dynamic Graph Neural Networks (GNNs) and Recurrent Neural Networks (RNNs).
Our results show how using RNNs to encode temporal attributes alongside GNNs significantly improves predictive performance.
- Score: 2.8497910326197586
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Leveraging network information for predictive modeling has become widespread
in many domains. Within the realm of referral and targeted marketing,
influencer detection stands out as an area that could greatly benefit from the
incorporation of dynamic network representation due to the ongoing development
of customer-brand relationships. To elaborate on this idea, we introduce
INFLECT-DGNN, a new framework for INFLuencer prEdiCTion with Dynamic Graph
Neural Networks that combines Graph Neural Networks (GNN) and Recurrent Neural
Networks (RNN) with weighted loss functions, the Synthetic Minority
Oversampling TEchnique (SMOTE) adapted for graph data, and a carefully crafted
rolling-window strategy. To evaluate predictive performance, we utilize a
unique corporate data set with networks of three cities and derive a
profit-driven evaluation methodology for influencer prediction. Our results
show how using RNNs to encode temporal attributes alongside GNNs significantly
improves predictive performance. We compare the results of various models to
demonstrate the importance of capturing graph representation, temporal
dependencies, and using a profit-driven methodology for evaluation.
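Two of the abstract's ingredients lend themselves to a short illustration: the rolling-window evaluation strategy over network snapshots, and SMOTE adapted to graph data (oversampling the minority class by interpolating node feature vectors). The sketch below is a minimal, hypothetical rendering of both ideas in plain Python; the function names, window parameters, and the feature-space interpolation are assumptions for illustration, not the paper's actual implementation.

```python
import random

def rolling_windows(num_periods, train_size, test_size, step=1):
    """Sketch of a rolling-window strategy: yield (train, test) period
    indices, sliding the window forward over a sequence of snapshots."""
    start = 0
    while start + train_size + test_size <= num_periods:
        train = list(range(start, start + train_size))
        test = list(range(start + train_size, start + train_size + test_size))
        yield train, test
        start += step

def smote_node_features(minority_feats, n_synthetic, k=3, seed=0):
    """Sketch of SMOTE adapted to graph data: synthesize minority-class
    node feature vectors by interpolating between a minority node and one
    of its k nearest minority neighbours in feature space (hypothetical
    simplification; the paper's graph-aware variant may differ)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_synthetic):
        x = rng.choice(minority_feats)
        # k nearest minority neighbours of x by squared Euclidean distance
        neighbours = sorted(
            (f for f in minority_feats if f is not x),
            key=lambda f: sum((a - b) ** 2 for a, b in zip(x, f)),
        )[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append([a + gap * (b - a) for a, b in zip(x, nb)])
    return synthetic
```

With six monthly snapshots, `rolling_windows(6, 3, 1)` produces three train/test splits ([0,1,2]→[3], [1,2,3]→[4], [2,3,4]→[5]), so each model is always evaluated on the period immediately following its training window.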
Related papers
- Kolmogorov-Arnold Graph Neural Networks [2.4005219869876453]
Graph neural networks (GNNs) excel in learning from network-like data but often lack interpretability.
We propose the Graph Kolmogorov-Arnold Network (GKAN) to enhance both accuracy and interpretability.
arXiv Detail & Related papers (2024-06-26T13:54:59Z)
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Uncertainty in Graph Neural Networks: A Survey [50.63474656037679]
Graph Neural Networks (GNNs) have been extensively used in various real-world applications.
However, the predictive uncertainty of GNNs stemming from diverse sources can lead to unstable and erroneous predictions.
This survey aims to provide a comprehensive overview of GNNs from the perspective of uncertainty.
arXiv Detail & Related papers (2024-03-11T21:54:52Z)
- Exploring Time Granularity on Temporal Graphs for Dynamic Link Prediction in Real-world Networks [0.48346848229502226]
Dynamic Graph Neural Networks (DGNNs) have emerged as the predominant approach for processing dynamic graph-structured data.
In this paper, we explore the impact of time granularity when training DGNNs on dynamic graphs through extensive experiments.
arXiv Detail & Related papers (2023-11-21T00:34:53Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Causality-based CTR Prediction using Graph Neural Networks [14.93804796744474]
This paper develops a causality-based CTR prediction model in the graph neural network framework (Causal-GNN).
It integrates representations of the feature graph, user graph, and ad graph in the context of online advertising.
Experiments conducted on three public datasets demonstrate the superiority of Causal-GNN in AUC and Logloss.
arXiv Detail & Related papers (2023-01-30T10:16:40Z)
- Influencer Detection with Dynamic Graph Neural Networks [56.1837101824783]
We investigate different dynamic Graph Neural Network (GNN) configurations for influencer detection.
We show that using deep multi-head attention in the GNN and encoding temporal attributes significantly improves performance.
arXiv Detail & Related papers (2022-11-15T13:00:25Z)
- Spiking Variational Graph Auto-Encoders for Efficient Graph Representation Learning [10.65760757021534]
We propose an SNN-based deep generative method, namely the Spiking Variational Graph Auto-Encoders (S-VGAE) for efficient graph representation learning.
We conduct link prediction experiments on multiple benchmark graph datasets, and the results demonstrate that our model consumes significantly less energy while performing better than or comparably to other ANN- and SNN-based methods for graph representation learning.
arXiv Detail & Related papers (2022-10-24T12:54:41Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can significantly improve the performance of GNNs, and that the performance gain grows larger for noisier datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.