Realistic Synthetic Social Networks with Graph Neural Networks
- URL: http://arxiv.org/abs/2212.07843v1
- Date: Thu, 15 Dec 2022 14:04:27 GMT
- Title: Realistic Synthetic Social Networks with Graph Neural Networks
- Authors: Alex Davies and Nirav Ajmeri
- Abstract summary: We evaluate the potential of Graph Neural Network (GNN) models for network generation for synthetic social networks.
We include social network specific measurements which allow evaluation of how realistically synthetic networks behave.
We find that the Gated Recurrent Attention Network (GRAN) extends well to social networks and, in comparison to the popular rule-based Recursive-MATrix (R-MAT) generation method used as a benchmark, is better able to replicate realistic structural dynamics.
- Score: 1.8275108630751837
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Social network analysis faces profound difficulties in sharing data between
researchers due to privacy and security concerns. A potential remedy to this
issue is synthetic networks that closely resemble their real counterparts,
but can be freely distributed. Generating synthetic networks requires the
creation of network topologies that, in application, function as realistically
as possible. Widely applied models are currently rule-based and can struggle to
reproduce structural dynamics. Led by recent developments in Graph Neural
Network (GNN) models for network generation, we evaluate the potential of GNNs
for synthetic social networks. We apply GNNs within a realistic use-case and
evaluate them empirically using Maximum Mean Discrepancy
(MMD). We include social network specific measurements which allow evaluation
of how realistically synthetic networks behave in typical social network
analysis applications.
We find that the Gated Recurrent Attention Network (GRAN) extends well to
social networks and, in comparison to the popular rule-based Recursive-MATrix
(R-MAT) generation method used as a benchmark, is better able to replicate realistic
structural dynamics. We find that GRAN is more computationally costly than
R-MAT, but is not excessively costly to employ, so would be effective for
researchers seeking to create datasets of synthetic social networks.
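As a rough illustration of the evaluation described in the abstract, the sketch below generates a small R-MAT-style graph with the classic recursive quadrant rule, summarises real and synthetic graphs with a simple social-network measurement (the degree distribution), and compares the two populations with a Gaussian-kernel Maximum Mean Discrepancy. This is a minimal sketch, not the authors' pipeline: the helper names (rmat_graph, degree_histogram, gaussian_mmd), the R-MAT quadrant probabilities, the graph sizes, the histogram binning, and the kernel bandwidth are all illustrative assumptions, and Watts-Strogatz graphs stand in for real social networks.

```python
"""Illustrative sketch (not the paper's code): score synthetic graphs against
reference graphs with an MMD over degree distributions."""
import numpy as np
import networkx as nx

def rmat_graph(scale, n_edges, p=(0.57, 0.19, 0.19, 0.05), seed=0):
    """Rule-based R-MAT generator: recursively pick adjacency-matrix quadrants."""
    rng = np.random.default_rng(seed)
    n = 2 ** scale
    edges = set()
    quadrants = [(0, 0), (0, 1), (1, 0), (1, 1)]
    while len(edges) < n_edges:
        row, col = 0, 0
        for _ in range(scale):                      # one quadrant choice per bit
            r_bit, c_bit = quadrants[rng.choice(4, p=p)]
            row = (row << 1) | r_bit
            col = (col << 1) | c_bit
        if row != col:                              # skip self-loops
            edges.add((row, col))
    g = nx.Graph()
    g.add_nodes_from(range(n))
    g.add_edges_from(edges)
    return g

def degree_histogram(g, n_bins=32):
    """Normalised degree histogram, one simple social-network measurement."""
    degrees = np.array([d for _, d in g.degree()], dtype=float)
    hist, _ = np.histogram(degrees, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def gaussian_mmd(x, y, sigma=1.0):
    """Biased MMD^2 between two sets of feature vectors under an RBF kernel."""
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

if __name__ == "__main__":
    # Stand-in "real" social networks; actual datasets would be loaded instead.
    real = [nx.watts_strogatz_graph(256, 6, 0.1, seed=s) for s in range(8)]
    synthetic = [rmat_graph(8, 256 * 3, seed=s) for s in range(8)]
    real_feats = np.stack([degree_histogram(g) for g in real])
    synth_feats = np.stack([degree_histogram(g) for g in synthetic])
    print(f"degree-distribution MMD^2: {gaussian_mmd(real_feats, synth_feats):.4f}")
```

In the paper's setting, the same MMD machinery would be applied to GRAN samples as well and to whichever social-network-specific statistics the study reports; the degree histogram here is only one convenient example.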
Related papers
- Modeling Social Media Recommendation Impacts Using Academic Networks: A Graph Neural Network Approach [4.138915764680197]
This study proposes to use academic social networks as a proxy for investigating recommendation systems in social media.
By employing Graph Neural Networks (GNNs), we develop a model that separates the prediction of the academic infosphere from behavior prediction.
Our approach aims to improve our understanding of recommendation systems' roles and of social network modeling.
arXiv Detail & Related papers (2024-10-06T17:03:27Z)
- TDNetGen: Empowering Complex Network Resilience Prediction with Generative Augmentation of Topology and Dynamics [14.25304439234864]
We introduce a novel resilience prediction framework for complex networks, designed to tackle this issue through generative data augmentation of network topology and dynamics.
Experimental results on three network datasets demonstrate that our proposed framework, TDNetGen, achieves prediction accuracy of 85%-95%.
arXiv Detail & Related papers (2024-08-19T09:20:31Z)
- Building a Graph-based Deep Learning network model from captured traffic traces [4.671648049111933]
State-of-the-art network models are based on or depend on Discrete Event Simulation (DES).
While DES is highly accurate, it is also computationally costly and cumbersome to parallelize, making it impractical to simulate high-performance networks.
We propose a Graph Neural Network (GNN)-based solution specifically designed to better capture the complexities of real network scenarios.
arXiv Detail & Related papers (2023-10-18T11:16:32Z)
- Riemannian Residual Neural Networks [58.925132597945634]
We show how to extend the residual neural network (ResNet) to general Riemannian manifolds.
ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks.
arXiv Detail & Related papers (2023-10-16T02:12:32Z)
- Generalization and Estimation Error Bounds for Model-based Neural Networks [78.88759757988761]
We show that the generalization abilities of model-based networks for sparse recovery outperform those of regular ReLU networks.
We derive practical design rules that allow the construction of model-based networks with guaranteed high generalization.
arXiv Detail & Related papers (2023-04-19T16:39:44Z)
- Cost Sensitive GNN-based Imbalanced Learning for Mobile Social Network Fraud Detection [37.14877936257601]
We present a novel Cost-Sensitive Graph Neural Network (CSGNN) by creatively combining cost-sensitive learning and graph neural networks.
The results show that CSGNN can effectively solve the graph imbalance problem and thus achieve better detection performance than state-of-the-art algorithms.
arXiv Detail & Related papers (2023-03-28T01:43:32Z)
- An Approach for Link Prediction in Directed Complex Networks based on Asymmetric Similarity-Popularity [0.0]
This paper introduces a link prediction method designed explicitly for directed networks.
It is based on the similarity-popularity paradigm, which has recently proven successful in undirected networks.
The algorithms approximate the hidden similarities as shortest path distances using edge weights that capture and factor out the links' asymmetry and nodes' popularity.
arXiv Detail & Related papers (2022-07-15T11:03:25Z)
- Trustworthy Graph Neural Networks: Aspects, Methods and Trends [115.84291569988748]
Graph neural networks (GNNs) have emerged as competent graph learning methods for diverse real-world scenarios.
Performance-oriented GNNs have exhibited potential adverse effects like vulnerability to adversarial attacks.
To avoid these unintentional harms, it is necessary to build competent GNNs characterised by trustworthiness.
arXiv Detail & Related papers (2022-05-16T02:21:09Z)
- Quasi-orthogonality and intrinsic dimensions as measures of learning and generalisation [55.80128181112308]
We show that the dimensionality and quasi-orthogonality of neural networks' feature spaces may jointly serve as a network's performance discriminants.
Our findings suggest important relationships between the networks' final performance and properties of their randomly initialised feature spaces.
arXiv Detail & Related papers (2022-03-30T21:47:32Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNN) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- Towards Understanding Theoretical Advantages of Complex-Reaction Networks [77.34726150561087]
We show that a class of functions can be approximated by a complex-reaction network using a polynomial number of parameters.
For empirical risk minimization, our theoretical result shows that the critical point set of complex-reaction networks is a proper subset of that of real-valued networks.
arXiv Detail & Related papers (2021-08-15T10:13:49Z)