Influencer Detection with Dynamic Graph Neural Networks
- URL: http://arxiv.org/abs/2211.09664v1
- Date: Tue, 15 Nov 2022 13:00:25 GMT
- Title: Influencer Detection with Dynamic Graph Neural Networks
- Authors: Elena Tiukhova, Emiliano Penaloza, María Óskarsdóttir, Hernan Garcia, Alejandro Correa Bahnsen, Bart Baesens, Monique Snoeck, Cristián Bravo
- Abstract summary: We investigate different dynamic Graph Neural Networks (GNNs) configurations for influencer detection.
We show that using deep multi-head attention in GNN and encoding temporal attributes significantly improves performance.
- Score: 56.1837101824783
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Leveraging network information for prediction tasks has become a common
practice in many domains. Being an important part of targeted marketing,
influencer detection can potentially benefit from incorporating dynamic network
representation. In this work, we investigate different dynamic Graph Neural
Networks (GNNs) configurations for influencer detection and evaluate their
prediction performance using a unique corporate data set. We show that using
deep multi-head attention in GNN and encoding temporal attributes significantly
improves performance. Furthermore, our empirical evaluation illustrates that
capturing neighborhood representation is more beneficial than using network
centrality measures.
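The abstract's key finding concerns deep multi-head attention for neighborhood aggregation in a GNN. As a minimal, self-contained illustration of that mechanism (not the authors' actual model, whose architecture and parameters are not given here), the following NumPy sketch implements one GAT-style multi-head graph-attention layer; all dimensions and weights are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_graph_attention(H, A, W_list, a_list):
    """One multi-head graph-attention layer (GAT-style sketch).

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W_list / a_list: per-head projection matrices and attention vectors.
    """
    heads = []
    for W, a in zip(W_list, a_list):
        Z = H @ W                                    # project: (N, F')
        Fp = Z.shape[1]
        # pairwise logits e_ij = a^T [z_i || z_j], computed by splitting a
        e = (Z @ a[:Fp])[:, None] + (Z @ a[Fp:])[None, :]
        e = np.maximum(0.2 * e, e)                   # LeakyReLU(0.2)
        e = np.where(A > 0, e, -1e9)                 # mask non-edges
        alpha = softmax(e, axis=1)                   # attention over neighbors
        heads.append(alpha @ Z)                      # weighted aggregation
    return np.concatenate(heads, axis=1)             # concat heads: (N, K*F')

# toy graph: 5 nodes, 8 input features, 2 heads of width 4
N, F, Fp, K = 5, 8, 4, 2
H = rng.normal(size=(N, F))
A = (rng.random((N, N)) > 0.5).astype(float)
np.fill_diagonal(A, 1.0)                             # self-loops
W_list = [rng.normal(size=(F, Fp)) for _ in range(K)]
a_list = [rng.normal(size=(2 * Fp,)) for _ in range(K)]
out = multi_head_graph_attention(H, A, W_list, a_list)
print(out.shape)  # (5, 8)
```

Stacking several such layers ("deep" attention) and appending encoded temporal attributes to the node features are the two design choices the abstract reports as most beneficial.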
Related papers
- Influence Maximization via Graph Neural Bandits [54.45552721334886]
We set the IM problem in a multi-round diffusion campaign, aiming to maximize the number of distinct users that are influenced.
We propose the framework IM-GNB (Influence Maximization with Graph Neural Bandits), where we provide an estimate of the users' probabilities of being influenced.
arXiv Detail & Related papers (2024-06-18T17:54:33Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- INFLECT-DGNN: Influencer Prediction with Dynamic Graph Neural Networks [4.677411878315618]
We present INFLuencer prEdiCTion with Dynamic Graph Neural Networks (GNNs) and Recurrent Neural Networks (RNNs).
We introduce a novel profit-driven framework that supports decision-making based on model predictions.
Our research has significant implications for the fields of referral and targeted marketing.
arXiv Detail & Related papers (2023-07-16T19:04:48Z)
- Anomal-E: A Self-Supervised Network Intrusion Detection System based on Graph Neural Networks [0.0]
This paper investigates Graph Neural Networks (GNNs) application for self-supervised network intrusion and anomaly detection.
GNNs are a deep learning approach for graph-based data that incorporate graph structures into learning.
We present Anomal-E, a GNN approach to intrusion and anomaly detection that leverages edge features and graph topological structure in a self-supervised process.
arXiv Detail & Related papers (2022-07-14T10:59:39Z)
- On the role of feedback in visual processing: a predictive coding perspective [0.6193838300896449]
We consider deep convolutional networks (CNNs) as models of feed-forward visual processing and implement Predictive Coding (PC) dynamics.
We find that the network increasingly relies on top-down predictions as the noise level increases.
In addition, the accuracy of the network implementing PC dynamics significantly increases over time-steps, compared to its equivalent forward network.
arXiv Detail & Related papers (2021-06-08T10:07:23Z)
- Scene Understanding for Autonomous Driving [0.0]
We study the behaviour of different configurations of RetinaNet, Faster R-CNN and Mask R-CNN presented in Detectron2.
We observe a significant improvement in performance after fine-tuning these models on the datasets of interest.
We run inference in unusual situations using out of context datasets, and present interesting results.
arXiv Detail & Related papers (2021-05-11T09:50:05Z)
- Topological Uncertainty: Monitoring trained neural networks through persistence of activation graphs [0.9786690381850356]
In industrial applications, data coming from an open-world setting might widely differ from the benchmark datasets on which a network was trained.
We develop a method to monitor trained neural networks based on the topological properties of their activation graphs.
arXiv Detail & Related papers (2021-05-07T14:16:03Z)
- Variational Structured Attention Networks for Deep Visual Representation Learning [49.80498066480928]
We propose a unified deep framework to jointly learn both spatial attention maps and channel attention in a principled manner.
Specifically, we integrate the estimation and the interaction of the attentions within a probabilistic representation learning framework.
We implement the inference rules within the neural network, thus allowing for end-to-end learning of the probabilistic and the CNN front-end parameters.
arXiv Detail & Related papers (2021-03-05T07:37:24Z)
- Information Obfuscation of Graph Neural Networks [96.8421624921384]
We study the problem of protecting sensitive attributes by information obfuscation when learning with graph structured data.
We propose a framework to locally filter out pre-determined sensitive attributes via adversarial training with the total variation and the Wasserstein distance.
arXiv Detail & Related papers (2020-09-28T17:55:04Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper GNNs to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
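The DAGNN summary above describes adaptively combining information from large receptive fields. A minimal sketch of that idea, assuming a symmetrically normalized adjacency and random stand-in parameters (not the paper's trained model): features are propagated over several hops, and per-node sigmoid retention scores weight each hop's contribution.

```python
import numpy as np

rng = np.random.default_rng(1)

def dagnn_propagate(H, A_hat, s, K):
    """DAGNN-style adaptive aggregation sketch.

    Propagates node features H over K hops with normalized adjacency
    A_hat, then combines the hop representations with retention scores
    derived from s, so each node adaptively weights its receptive field.
    """
    hops = [H]
    for _ in range(K):
        hops.append(A_hat @ hops[-1])            # k-hop propagation
    S = np.stack(hops, axis=0)                   # (K+1, N, F)
    gates = 1.0 / (1.0 + np.exp(-(S @ s)))       # sigmoid scores: (K+1, N)
    return (gates[..., None] * S).sum(axis=0)    # adaptive combination: (N, F)

# toy graph: 6 nodes, 4 features, 3 propagation hops
N, F, K = 6, 4, 3
A = (rng.random((N, N)) > 0.5).astype(float)
A = np.maximum(A, A.T)                           # symmetrize
np.fill_diagonal(A, 1.0)                         # self-loops
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))              # D^{-1/2} (A+I) D^{-1/2}
s = rng.normal(size=(F,))                        # stand-in scoring vector
out = dagnn_propagate(rng.normal(size=(N, F)), A_hat, s, K)
print(out.shape)  # (6, 4)
```

Decoupling propagation from transformation in this way is what lets the method reach large receptive fields without the over-smoothing the summary mentions.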
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the summaries (including all information) and is not responsible for any consequences.