Dynamic Network Embedding Survey
- URL: http://arxiv.org/abs/2103.15447v1
- Date: Mon, 29 Mar 2021 09:27:53 GMT
- Title: Dynamic Network Embedding Survey
- Authors: Guotong Xue, Ming Zhong, Jianxin Li, Jia Chen, Chengshuai Zhai,
Ruochen Kong
- Abstract summary: We give a survey of dynamic network embedding in this paper.
We present two basic data models, namely, discrete model and continuous model for dynamic networks.
We build a taxonomy that refines the category hierarchy by typical learning models.
- Score: 11.742863376032112
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Since many real-world networks, such as social networks and
user-item networks, evolve over time, research efforts on dynamic network
embedding have increased in recent years. These methods learn node
representations from a sequence of evolving graphs rather than only the latest
network, preserving both structural and temporal information of the dynamic
networks. Because these efforts have not yet been comprehensively
investigated, we give a survey of dynamic network embedding in this paper. Our
survey inspects the data models, representation learning techniques,
evaluations, and applications of current related works and derives common
patterns from them. Specifically, we present two basic data models for dynamic
networks, namely, the discrete model and the continuous model.
Correspondingly, we summarize two major categories of dynamic network
embedding techniques, structural-first and temporal-first, that are adopted by
most related works. We then build a taxonomy that refines the category
hierarchy by typical learning models. The popular experimental datasets and
applications are also summarized. Lastly, we discuss several distinct research
topics in dynamic network embedding.
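The two data models named in the abstract can be illustrated with a minimal sketch: the discrete model stores a dynamic network as a sequence of static snapshots, while the continuous model stores a single stream of timestamped events. The class and method names below are illustrative, not from the survey.

```python
from dataclasses import dataclass, field

# Discrete model: the dynamic network is a sequence of snapshots,
# each a static graph observed at one time step.
@dataclass
class DiscreteDynamicNetwork:
    snapshots: list = field(default_factory=list)  # one edge set per time step

    def add_snapshot(self, edges):
        self.snapshots.append(set(edges))

# Continuous model: the dynamic network is a single stream of
# timestamped events (here, edge insertions).
@dataclass
class ContinuousDynamicNetwork:
    events: list = field(default_factory=list)  # (timestamp, u, v) tuples

    def add_event(self, t, u, v):
        self.events.append((t, u, v))

    def snapshot_at(self, t):
        """Derive a discrete view: all edges that appeared up to time t."""
        return {(u, v) for (ts, u, v) in self.events if ts <= t}

# The continuous model is finer-grained: a snapshot sequence can be
# recovered from it by sampling at chosen time points.
stream = ContinuousDynamicNetwork()
stream.add_event(1.0, "a", "b")
stream.add_event(2.5, "b", "c")

discrete = DiscreteDynamicNetwork()
for t in (1.0, 2.5):
    discrete.add_snapshot(stream.snapshot_at(t))
```

This also hints at why the survey pairs the models with structural-first and temporal-first techniques: snapshot sequences naturally support per-snapshot structural learning, while event streams expose fine-grained temporal order.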
Related papers
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- A Network Classification Method based on Density Time Evolution Patterns Extracted from Network Automata [0.0]
We propose alternate sources of information to use as descriptors for classification, which we term the density time-evolution pattern (D-TEP) and the state density time-evolution pattern (SD-TEP).
Our results show a significant improvement over previous studies on five synthetic network databases and seven real-world databases.
arXiv Detail & Related papers (2022-11-18T15:27:26Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We provide a benchmark pipeline for evaluating temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- DPPIN: A Biological Dataset of Dynamic Protein-Protein Interaction Networks [40.490606259328686]
We generate a new biological dataset of dynamic protein-protein interaction networks (i.e., DPPIN).
DPPIN consists of twelve dynamic protein-level interaction networks of yeast cells at different scales.
We design dynamic local clustering, dynamic spectral clustering, dynamic subgraph matching, dynamic node classification, and dynamic graph classification experiments.
arXiv Detail & Related papers (2021-07-05T17:52:55Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z)
- Dynamic Neural Networks: A Survey [34.30356864359789]
Dynamic neural networks are an emerging research topic in deep learning.
Dynamic networks can adapt their structures or parameters to different inputs, leading to notable advantages in terms of accuracy, computational efficiency, adaptiveness, etc.
arXiv Detail & Related papers (2021-02-09T16:02:00Z)
- Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z)
- Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey [11.18312489268624]
We establish a foundation of dynamic networks with consistent, detailed terminology and notation.
We present a comprehensive survey of dynamic graph neural network models using the proposed terminology.
arXiv Detail & Related papers (2020-05-13T23:56:38Z)
- Modeling Dynamic Heterogeneous Network for Link Prediction using Hierarchical Attention with Temporal RNN [16.362525151483084]
We propose a novel dynamic heterogeneous network embedding method, termed as DyHATR.
It uses hierarchical attention to learn heterogeneous information and incorporates recurrent neural networks with temporal attention to capture evolutionary patterns.
We benchmark our method on four real-world datasets for the task of link prediction.
arXiv Detail & Related papers (2020-04-01T17:16:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.