Classification of vertices on social networks by multiple approaches
- URL: http://arxiv.org/abs/2301.11288v1
- Date: Fri, 13 Jan 2023 09:42:55 GMT
- Title: Classification of vertices on social networks by multiple approaches
- Authors: Hacı İsmail Aslan, Chang Choi, Hoon Ko
- Abstract summary: In the case of social networks, it is crucial to evaluate the labels of discrete communities.
For each of these interaction-based entities, a social graph, a mailing dataset, and two citation sets are selected as the testbench repositories.
This paper not only assessed the most valuable method but also determined how graph neural networks work.
- Score: 1.370151489527964
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the advent of data expressed in formats other than tables,
the topological compositions that make samples interrelated have come into
prominence. Such networks can be interpreted as social connections, dataflow
maps, citation influence graphs, protein bindings, and so on. In the case of
social networks, however, it is highly crucial to evaluate the labels of
discrete communities. The motivation for such a study is the non-negligible
importance of partitioning the vertices of a graph by using the topological
features of the network alone. For each of these interaction-based entities, a
social graph, a mailing dataset, and two citation sets are selected as the
testbench repositories. This paper not only assessed the most valuable method
but also examined how graph neural networks work and where they need to
improve against non-neural approaches, which are faster and computationally
more cost-effective. It also set a performance limit to be exceeded by
prospective graph neural network variants using only the topological features
of the networks trialed.
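As a concrete illustration of the non-neural route that the abstract contrasts
with graph neural networks, the following is a minimal sketch, not the paper's
actual pipeline: it labels the vertices of a small social graph using nothing
but topological features. The library choices (networkx, scikit-learn), the
karate-club graph standing in for the paper's testbench datasets, and the
particular feature set are all assumptions made for the example.

```python
# Minimal sketch (assumed setup, not the paper's pipeline): a non-neural
# baseline that classifies vertices from topological features alone.
import networkx as nx
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

G = nx.karate_club_graph()  # stand-in social graph with two communities
y = np.array([0 if G.nodes[v]["club"] == "Mr. Hi" else 1 for v in G.nodes])

# Purely topological per-vertex features: degree, clustering coefficient, PageRank.
deg = dict(G.degree())
clu = nx.clustering(G)
pr = nx.pagerank(G)
X = np.array([[deg[v], clu[v], pr[v]] for v in G.nodes])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

A graph neural network would replace these hand-picked features with learned
neighborhood aggregations; per the abstract, prospective variants are expected
to exceed the limit set by cheap baselines of this kind to justify their extra
computational cost.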
Related papers
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
arXiv Detail & Related papers (2024-01-17T13:04:23Z) - On Discrepancies between Perturbation Evaluations of Graph Neural
Network Attributions [49.8110352174327]
We assess attribution methods from a perspective not previously explored in the graph domain: retraining.
The core idea is to retrain the network on important (or not important) relationships as identified by the attributions.
We run our analysis on four state-of-the-art GNN attribution methods and five synthetic and real-world graph classification datasets.
arXiv Detail & Related papers (2024-01-01T02:03:35Z) - A Survey on Graph Classification and Link Prediction based on GNN [11.614366568937761]
This review article delves into the world of graph convolutional neural networks.
It elaborates on the fundamentals of graph convolutional neural networks.
It elucidates the graph neural network models based on attention mechanisms and autoencoders.
arXiv Detail & Related papers (2023-07-03T09:08:01Z) - BS-GAT Behavior Similarity Based Graph Attention Network for Network
Intrusion Detection [20.287285893803244]
This paper proposes a graph neural network algorithm based on behavior similarity (BS-GAT) using a graph attention network.
The results show that the proposed method is effective and has superior performance compared to existing solutions.
arXiv Detail & Related papers (2023-04-07T09:42:07Z) - Graph Belief Propagation Networks [34.137798598227874]
We introduce a model that combines the advantages of graph neural networks and collective classification.
In our model, potentials on each node only depend on that node's features, and edge potentials are learned via a coupling matrix.
Our approach can be viewed as either an interpretable message-passing graph neural network or a collective classification method with higher capacity and modernized training.
arXiv Detail & Related papers (2021-06-06T05:24:06Z) - Spectral Embedding of Graph Networks [76.27138343125985]
We introduce an unsupervised graph embedding that trades off local node similarity and connectivity, and global structure.
The embedding is based on a generalized graph Laplacian, whose eigenvectors compactly capture both network structure and neighborhood proximity in a single representation (a minimal Laplacian-eigenvector sketch follows after this list).
arXiv Detail & Related papers (2020-09-30T04:59:10Z) - Representation Learning of Graphs Using Graph Convolutional Multilayer
Networks Based on Motifs [17.823543937167848]
mGCMN is a novel framework which utilizes node feature information and the higher order local structure of the graph.
It greatly improves the learning efficiency of graph neural networks and promotes the establishment of a new learning mode.
arXiv Detail & Related papers (2020-07-31T04:18:20Z) - Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper graph neural networks to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z) - Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z) - Analyzing Neural Networks Based on Random Graphs [77.34726150561087]
We perform a massive evaluation of neural networks with architectures corresponding to random graphs of various types.
We find that none of the classical numerical graph invariants by itself allows us to single out the best networks.
We also find that networks with primarily short-range connections perform better than networks which allow for many long-range connections.
arXiv Detail & Related papers (2020-02-19T11:04:49Z)
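Returning to the Spectral Embedding of Graph Networks entry above, the sketch
below illustrates, under stated assumptions, the classical Laplacian-eigenvector
embedding that such methods generalize; the normalized Laplacian and the
karate-club graph are illustrative choices, not details taken from that paper.

```python
# Minimal sketch (assumed setup): classical spectral embedding via the
# normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2}.
import networkx as nx
import numpy as np

G = nx.karate_club_graph()
L = nx.normalized_laplacian_matrix(G).toarray()

# np.linalg.eigh returns eigenvalues in ascending order; the first eigenvector
# is trivial for a connected graph, so the next ones give the embedding.
eigvals, eigvecs = np.linalg.eigh(L)
embedding = eigvecs[:, 1:3]  # two-dimensional coordinates per vertex

for v, coords in list(zip(G.nodes, embedding))[:5]:
    print(f"vertex {v}: {np.round(coords, 3)}")
```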