Online Dynamic Network Embedding
- URL: http://arxiv.org/abs/2006.16478v1
- Date: Tue, 30 Jun 2020 02:21:37 GMT
- Title: Online Dynamic Network Embedding
- Authors: Haiwei Huang, Jinlong Li, Huimin He, Huanhuan Chen
- Abstract summary: We propose an algorithm, Recurrent Neural Network Embedding (RNNE), to deal with dynamic networks.
RNNE takes into account both static and dynamic characteristics of the network.
We evaluate RNNE on five networks and compare it with several state-of-the-art algorithms.
- Score: 26.203786679460528
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Network embedding is an important method for analyzing network data. However, most
existing algorithms can only deal with static networks. In this paper, we propose an algorithm,
Recurrent Neural Network Embedding (RNNE), to deal with dynamic networks, which can typically be
divided into two categories: a) topologically evolving graphs, whose nodes and edges increase
(or decrease) over time; b) temporal graphs, whose edges carry time information. To handle the
changing size of dynamic networks, RNNE adds virtual nodes, which are not connected to any other
node, to the network and replaces them as new nodes arrive, so that the network size is unified
across time steps. On the one hand, RNNE attends to both the direct links between nodes and the
similarity between the neighborhood structures of two nodes, aiming to preserve the local and
global network structure. On the other hand, RNNE reduces the influence of noise by transferring
the previous embedding information. RNNE can therefore take into account both the static and
dynamic characteristics of the network. We evaluate RNNE on five networks and compare it with
several state-of-the-art algorithms. The results demonstrate that RNNE has advantages over the
other algorithms in reconstruction, classification and link prediction.
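To make the size-unification idea concrete, here is a minimal sketch of the padding step as described in the abstract: the adjacency matrix is allocated at a fixed capacity, unused slots hold isolated virtual nodes, and an arriving real node simply takes over a virtual slot. This is an illustration under our own assumptions; the class name, the `max_nodes` capacity and the helper methods are hypothetical and not taken from RNNE's implementation.

```python
import numpy as np

class PaddedDynamicGraph:
    """Sketch of the virtual-node padding described in the RNNE abstract.

    The adjacency matrix is allocated at a fixed, assumed capacity
    `max_nodes`; slots beyond the current real nodes are "virtual" nodes
    with no edges, so every snapshot has the same size.
    """

    def __init__(self, max_nodes: int):
        self.max_nodes = max_nodes
        self.adj = np.zeros((max_nodes, max_nodes), dtype=np.float32)
        self.num_real = 0  # real nodes occupy slots [0, num_real)

    def add_node(self) -> int:
        """A new node replaces the next virtual (all-zero) slot."""
        if self.num_real >= self.max_nodes:
            raise ValueError("capacity exhausted; grow max_nodes")
        idx = self.num_real
        self.num_real += 1
        return idx

    def add_edge(self, u: int, v: int, w: float = 1.0) -> None:
        """Connect two real nodes; virtual slots stay isolated."""
        assert u < self.num_real and v < self.num_real
        self.adj[u, v] = self.adj[v, u] = w


# Usage: two snapshots of a growing graph share one fixed-size matrix.
g = PaddedDynamicGraph(max_nodes=5)
a, b = g.add_node(), g.add_node()
g.add_edge(a, b)      # snapshot t: 2 real nodes, 3 virtual
c = g.add_node()
g.add_edge(b, c)      # snapshot t+1: a new node took a virtual slot
print(g.adj.shape)    # (5, 5) at every time step
```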
Related papers
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each degree-based group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Collaborative Graph Neural Networks for Attributed Network Embedding [63.39495932900291]
Graph neural networks (GNNs) have shown prominent performance on attributed network embedding.
We propose COllaborative graph Neural Networks (CONN), a GNN architecture tailored for network embedding.
arXiv Detail & Related papers (2023-07-22T04:52:27Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Learning Asymmetric Embedding for Attributed Networks via Convolutional Neural Network [19.611523749659355]
We propose a novel deep asymmetric attributed network embedding model based on a convolutional graph neural network, called AAGCN.
The main idea is to maximally preserve the asymmetric proximity and asymmetric similarity of directed attributed networks.
We test the performance of AAGCN on three real-world networks for network reconstruction, link prediction, node classification and visualization tasks.
arXiv Detail & Related papers (2022-02-13T13:35:15Z)
- DynACPD Embedding Algorithm for Prediction Tasks in Dynamic Networks [6.5361928329696335]
We present novel embedding methods for dynamic networks based on higher-order tensor decompositions of their tensorial representations.
We demonstrate the power and efficiency of our approach by comparing our algorithms' performance on the link prediction task against an array of current baseline methods.
arXiv Detail & Related papers (2021-03-12T04:36:42Z)
- Dynamic Graph: Learning Instance-aware Connectivity for Neural Networks [78.65792427542672]
Dynamic Graph Network (DG-Net) is a complete directed acyclic graph, where the nodes represent convolutional blocks and the edges represent connection paths.
Instead of using the same path of the network for every input, DG-Net aggregates features dynamically at each node, which gives the network greater representational capacity.
arXiv Detail & Related papers (2020-10-02T16:50:26Z)
- GloDyNE: Global Topology Preserving Dynamic Network Embedding [31.269883917366478]
Dynamic Network Embedding (DNE) aims to update node embeddings while preserving network topology at each time step.
We propose a novel strategy to diversely select representative nodes across a network, coordinated with a new incremental learning paradigm.
Experiments show that GloDyNE, with only a small fraction of nodes selected, already achieves superior or comparable performance.
arXiv Detail & Related papers (2020-08-05T05:10:15Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
- Temporal Network Representation Learning via Historical Neighborhoods Aggregation [28.397309507168128]
We propose the Embedding via Historical Neighborhoods Aggregation (EHNA) algorithm.
We first propose a temporal random walk that can identify relevant nodes in historical neighborhoods.
Then we apply a deep learning model which uses a custom attention mechanism to induce node embeddings.
arXiv Detail & Related papers (2020-03-30T04:18:48Z)
- EdgeNets: Edge Varying Graph Neural Networks [179.99395949679547]
This paper puts forth a general framework that unifies state-of-the-art graph neural networks (GNNs) through the concept of EdgeNet.
An EdgeNet is a GNN architecture that allows different nodes to use different parameters to weigh the information of different neighbors.
This is a general linear and local operation that a node can perform, and it encompasses under one formulation all existing graph convolutional neural networks (GCNNs) as well as graph attention networks (GATs); a rough sketch of this edge-varying aggregation appears just below this entry.
arXiv Detail & Related papers (2020-01-21T15:51:17Z)
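As a rough illustration of the edge-varying operation summarized in the EdgeNet entry above (not the authors' exact formulation), the sketch below performs one aggregation step in which every edge, including self-loops, carries its own coefficient, so different nodes weigh the information of different neighbors differently. The toy graph, the matrix names and the random initialization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph on 4 nodes (adjacency with self-loops).
A = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 1]], dtype=np.float32)

X = rng.normal(size=(4, 3)).astype(np.float32)   # node features

# Edge-varying coefficients: one scalar per existing edge/self-loop.
# A plain GCN would share one weight matrix across all nodes; here each
# pair (i, j) gets its own coefficient Phi[i, j], restricted to the
# graph's sparsity pattern by the element-wise mask A.
Phi = rng.normal(size=A.shape).astype(np.float32) * A

# One aggregation step: y_i = sum_j Phi[i, j] * x_j over neighbors j.
Y = Phi @ X
print(Y.shape)   # (4, 3): same node set, edge-specific mixing of neighbors
```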
This list is automatically generated from the titles and abstracts of the papers on this site.