SDGNN: Learning Node Representation for Signed Directed Networks
- URL: http://arxiv.org/abs/2101.02390v3
- Date: Sat, 27 Mar 2021 11:45:02 GMT
- Title: SDGNN: Learning Node Representation for Signed Directed Networks
- Authors: Junjie Huang, Huawei Shen, Liang Hou, Xueqi Cheng
- Abstract summary: Graph Neural Networks (GNNs) have received widespread attention and achieved state-of-the-art performance in learning node representations.
It is not trivial to transfer these models to signed directed networks, which are widely observed in the real world yet less studied.
We propose a novel Signed Directed Graph Neural Networks model named SDGNN to learn node embeddings for signed directed networks.
- Score: 43.15277366961127
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network embedding is aimed at mapping nodes in a network into low-dimensional
vector representations. Graph Neural Networks (GNNs) have received widespread
attention and achieved state-of-the-art performance in learning node
representations. However, most GNNs only work in unsigned networks, where only
positive links exist. It is not trivial to transfer these models to signed
directed networks, which are widely observed in the real world yet less
studied. In this paper, we first review two fundamental sociological theories
(i.e., status theory and balance theory) and conduct empirical studies on
real-world datasets to analyze the social mechanism in signed directed
networks. Guided by related sociological theories, we propose a novel Signed
Directed Graph Neural Networks model named SDGNN to learn node embeddings for
signed directed networks. The proposed model simultaneously reconstructs link
signs, link directions, and signed directed triangles. We validate our model's
effectiveness on five real-world datasets, which are commonly used as the
benchmark for signed network embedding. Experiments demonstrate the proposed
model outperforms existing models, including feature-based methods, network
embedding methods, and several GNN methods.
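As a rough illustration of the multi-objective reconstruction the abstract describes, the sketch below scores link signs, link directions, and the edges of observed signed directed triangles from learned source/target node embeddings. The decoder heads, the `toy_sdgnn_objective` function, and the uniform triangle weighting are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bce(p, y, eps=1e-9):
    # Binary cross-entropy between a predicted probability p and a 0/1 label y.
    return -(y * np.log(p + eps) + (1.0 - y) * np.log(1.0 - p + eps))

def toy_sdgnn_objective(z_src, z_tgt, w_sign, w_dir, edges, triangles):
    """edges:     (u, v, sign) triples for directed edges u -> v, sign in {+1, -1}
    triangles: triples of directed edges ((a, b), (b, c), (a, c)) that close a
               signed directed triangle observed in the data."""
    sign_of = {(u, v): s for u, v, s in edges}

    def head(w, u, v):
        # Task-specific decoder head over an asymmetric source/target pair.
        return sigmoid((z_src[u] * z_tgt[v]) @ w)

    # 1) Reconstruct link signs.
    sign_loss = sum(bce(head(w_sign, u, v), 1.0 if s > 0 else 0.0)
                    for (u, v), s in sign_of.items())
    # 2) Reconstruct link directions: the observed direction should score higher
    #    than the (unobserved) reverse direction.
    dir_loss = sum(bce(head(w_dir, u, v), 1.0) + bce(head(w_dir, v, u), 0.0)
                   for (u, v) in sign_of if (v, u) not in sign_of)
    # 3) Reconstruct the edges of observed triangles, which up-weights edges that
    #    close many triangles (the paper additionally distinguishes triangle
    #    types under balance and status theory).
    tri_loss = sum(bce(head(w_sign, u, v), 1.0 if sign_of[(u, v)] > 0 else 0.0)
                   for tri in triangles for (u, v) in tri)
    return sign_loss + dir_loss + tri_loss

# Tiny example: 4 nodes with 16-d source/target embeddings and a few signed edges.
z_src, z_tgt = rng.normal(size=(4, 16)), rng.normal(size=(4, 16))
w_sign, w_dir = rng.normal(size=16), rng.normal(size=16)
edges = [(0, 1, +1), (1, 2, +1), (0, 2, +1), (2, 3, -1)]
triangles = [((0, 1), (1, 2), (0, 2))]
print(toy_sdgnn_objective(z_src, z_tgt, w_sign, w_dir, edges, triangles))
```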
Related papers
- Applying Self-supervised Learning to Network Intrusion Detection for
Network Flows with Graph Neural Network [8.318363497010969]
This paper studies the application of GNNs to identify the specific types of network flows in an unsupervised manner.
To the best of our knowledge, it is the first GNN-based self-supervised method for the multiclass classification of network flows in NIDS.
arXiv Detail & Related papers (2024-03-03T12:34:13Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based
Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
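As a loose sketch of that idea (illustrative only: the actual GNN-LoFI layer learns its reference histograms end to end, and its masking, binning, and feature handling differ), a message-passing step can be replaced by histogram intersections between a node's local feature distribution and a bank of reference histograms:

```python
import numpy as np

rng = np.random.default_rng(0)

def neighborhood_histogram(x, neighbors, bins, value_range):
    # Distribution of a scalar node feature over a node's local neighbourhood.
    h, _ = np.histogram(x[neighbors], bins=bins, range=value_range)
    return h / max(h.sum(), 1)

def histogram_intersection(h1, h2):
    # Classic histogram-intersection similarity: sum of bin-wise minima.
    return np.minimum(h1, h2).sum()

# Toy graph: one scalar feature per node, adjacency as neighbour lists.
x = np.array([0.1, 0.9, 0.4, 0.8, 0.2, 0.7])
adj = {0: [1, 2, 4], 1: [0, 3], 2: [0, 5], 3: [1, 5], 4: [0], 5: [2, 3]}

# A bank of reference histograms (random here, learned in the real model) acts
# like a filter bank: a node's new features are the intersections between its
# neighbourhood histogram and each reference.
refs = rng.dirichlet(np.ones(4), size=3)
h0 = neighborhood_histogram(x, adj[0], bins=4, value_range=(0.0, 1.0))
print(np.array([histogram_intersection(h0, r) for r in refs]))
```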
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- Theory of Graph Neural Networks: Representation and Learning [44.02161831977037]
Graph Neural Networks (GNNs) have become a popular learning model for prediction tasks on nodes, graphs and configurations of points.
This article summarizes a selection of the emerging theoretical results on approximation and learning properties of widely used message passing GNNs and higher-order GNNs.
arXiv Detail & Related papers (2022-04-16T02:08:50Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNNs) have recently emerged as the newest direction in graph learning.
We present the Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
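The summary gives little detail, but the basic ingredient of simplicial networks is letting signals on different-order simplices (here, nodes and edges) exchange information through incidence matrices. The sketch below shows only that generic ingredient under assumed shapes and a dot-product link scorer; the actual BScNets block builds on Hodge-Laplacian-based operators and a richer block structure.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Oriented node-edge incidence matrix B1: -1 at an edge's tail, +1 at its head.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n, m = 4, len(edges)
B1 = np.zeros((n, m))
for j, (u, v) in enumerate(edges):
    B1[u, j], B1[v, j] = -1.0, 1.0

H0 = rng.normal(size=(n, 8))                      # node (0-simplex) features
H1 = rng.normal(size=(m, 8))                      # edge (1-simplex) features
W = [rng.normal(size=(8, 8)) * 0.1 for _ in range(4)]

# One coupled update: node and edge signals exchange information through B1.
H0_next = relu(H0 @ W[0] + B1 @ H1 @ W[1])
H1_next = relu(H1 @ W[2] + B1.T @ H0 @ W[3])

# Score a candidate link (u, v) from the updated node states.
u, v = 0, 3
print(1.0 / (1.0 + np.exp(-(H0_next[u] @ H0_next[v]))))
```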
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- Network Representation Learning: From Preprocessing, Feature Extraction
to Node Embedding [9.844802841686105]
Network representation learning (NRL) advances the conventional graph mining of social networks, knowledge graphs, and complex biomedical and physics information networks.
This survey paper reviews the design principles and the different node embedding techniques for network representation learning over homogeneous networks.
arXiv Detail & Related papers (2021-10-14T17:46:37Z)
- Signed Bipartite Graph Neural Networks [42.32959912473691]
Signed bipartite networks differ from classical signed networks in that they contain two distinct node sets and signed links between the two sets.
In this work, we first define the signed relationships within the same set of nodes and provide a new perspective for analyzing signed bipartite networks.
We then conduct a comprehensive analysis of balance theory from two perspectives on several real-world datasets.
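For intuition, balance theory says a signed triangle is balanced when the product of its edge signs is positive. The small sketch below induces signed relations within one side of a toy bipartite graph, using one plausible rule (same rating of a common item gives +1, opposite ratings give -1; the paper's own definition may differ), and then checks the balance of a triangle:

```python
from itertools import combinations

# Signed bipartite edges between a "left" set U and a "right" set V.
edges = {("u1", "v1"): +1, ("u2", "v1"): +1, ("u3", "v1"): -1,
         ("u1", "v2"): +1, ("u2", "v2"): +1}

# Induce signed relations inside U: two left nodes that rate a common right
# node identically get a +1 relation, otherwise -1 (illustrative rule only).
induced = {}
for v in {right for (_, right) in edges}:
    raters = [(u, s) for (u, w), s in edges.items() if w == v]
    for (u1, s1), (u2, s2) in combinations(raters, 2):
        induced[frozenset((u1, u2))] = +1 if s1 == s2 else -1

# Balance theory: a triangle is balanced iff the product of its signs is positive.
def is_balanced(a, b, c):
    signs = [induced[frozenset(p)] for p in ((a, b), (b, c), (a, c))]
    return signs[0] * signs[1] * signs[2] > 0

print(induced)
print(is_balanced("u1", "u2", "u3"))   # two friends with a common "enemy": balanced
```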
arXiv Detail & Related papers (2021-08-22T05:15:45Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks
Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
For evaluation, we provide a benchmark pipeline for the evaluation of temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
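A hedged sketch of the representation side only (the actual BGN also binarizes network parameters and is trained end to end, which is not shown here): sign-binarized embeddings can be bit-packed and compared via Hamming-style similarity, which is where the time and space savings come from.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(1000, 128))        # real-valued node embeddings

# Sign binarisation: one bit per dimension instead of a floating-point value.
B = Z > 0
packed = np.packbits(B, axis=1)         # 128 bits -> 16 bytes per node

def hamming_similarity(packed, i, j):
    # Fraction of matching bits, computed directly on the packed bytes.
    xor = np.bitwise_xor(packed[i], packed[j])
    return 1.0 - np.unpackbits(xor).sum() / (packed.shape[1] * 8)

print(Z.nbytes, packed.nbytes)          # dense float bytes vs packed bit bytes
print(hamming_similarity(packed, 0, 1))
```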
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using
Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
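As a toy sketch of the temporal half of such a model (the hierarchical attention over heterogeneous edge types that would produce the per-snapshot embeddings is omitted, and the class and parameter names here are illustrative, not DyHATR's): a GRU runs over each node's per-snapshot embeddings and a temporal attention pools the hidden states.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalEncoder(nn.Module):
    """GRU over per-snapshot node embeddings plus attention over time steps."""
    def __init__(self, dim):
        super().__init__()
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.att = nn.Linear(dim, 1)

    def forward(self, snapshots):          # snapshots: (num_nodes, T, dim)
        h, _ = self.gru(snapshots)          # hidden state per time step
        w = F.softmax(self.att(h), dim=1)   # temporal attention weights, (N, T, 1)
        return (w * h).sum(dim=1)           # attention-pooled embeddings, (N, dim)

# Four snapshots of 16-d embeddings for 10 nodes, then a dot-product link score.
snapshots = torch.randn(10, 4, 16)
z = TemporalEncoder(16)(snapshots)
print(torch.sigmoid((z[0] * z[1]).sum()))
```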
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
- Node Masking: Making Graph Neural Networks Generalize and Scale Better [71.51292866945471]
Graph Neural Networks (GNNs) have received considerable interest in recent years.
In this paper, we utilize theoretical tools to better visualize the operations performed by state-of-the-art spatial GNNs.
We introduce a simple concept, Node Masking, that allows them to generalize and scale better.
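One simple way to realize such a concept, sketched under the assumption that masking means randomly removing nodes from a neighbourhood-aggregation step during training (the paper's exact masking scheme may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_mean_aggregate(X, adj, mask_prob=0.2):
    """Mean-aggregate neighbour features while randomly masking whole nodes.

    Masked nodes neither send nor receive messages in this step, so the model
    cannot over-rely on any particular neighbour during training.
    """
    n = X.shape[0]
    keep = rng.random(n) >= mask_prob            # True for nodes left active
    H = X.copy()                                 # masked nodes keep their own features
    for v in range(n):
        nbrs = [u for u in adj[v] if keep[u]]
        if keep[v] and nbrs:
            H[v] = (X[v] + X[nbrs].sum(axis=0)) / (1 + len(nbrs))
    return H, keep

X = rng.normal(size=(6, 8))
adj = {0: [1, 2], 1: [0, 3], 2: [0, 5], 3: [1, 4], 4: [3], 5: [2]}
H, keep = masked_mean_aggregate(X, adj)
print(keep)
```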
arXiv Detail & Related papers (2020-01-17T06:26:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.