DyCSC: Modeling the Evolutionary Process of Dynamic Networks Based on Cluster Structure
- URL: http://arxiv.org/abs/2210.12690v1
- Date: Sun, 23 Oct 2022 10:23:08 GMT
- Title: DyCSC: Modeling the Evolutionary Process of Dynamic Networks Based on Cluster Structure
- Authors: Shanfan Zhang, Zhan Bu
- Abstract summary: We propose a novel temporal network embedding method named the Dynamic Cluster Structure Constraint model (DyCSC).
DyCSC captures the evolution of temporal networks by imposing a temporal constraint on the tendency of nodes toward a given number of clusters.
It consistently outperforms competing methods by significant margins in multiple temporal link prediction tasks.
- Score: 1.005130974691351
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal networks are an important type of network whose topological
structure changes over time. Compared with methods for static networks, temporal
network embedding (TNE) methods face three challenges: 1) they cannot describe
the temporal dependence across network snapshots; 2) the node embeddings in the
latent space fail to indicate changes in the network topology; and 3) they
cannot avoid substantial redundant computation through parameter inheritance
across a series of snapshots. To this end, we propose a novel temporal network
embedding method named the Dynamic Cluster Structure Constraint model (DyCSC),
whose core idea is to capture the evolution of temporal networks by imposing a
temporal constraint on the tendency of nodes toward a given number of clusters.
It not only generates low-dimensional embedding vectors for nodes but also
preserves the dynamic nonlinear features of temporal networks. Experimental
results on multiple real-world datasets demonstrate the superiority of DyCSC
for temporal graph embedding: it consistently outperforms competing methods by
significant margins on multiple temporal link prediction tasks. Moreover, an
ablation study further validates the effectiveness of the proposed temporal
constraint.
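The abstract does not spell out the exact form of the temporal constraint, so the following is a minimal sketch of one plausible reading in PyTorch: each node's "tendency" toward K clusters is a softmax over its distances to cluster centers, and consecutive snapshots are penalized for diverging tendencies. The function names, the KL-based penalty, and the `temperature` parameter are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def cluster_tendency(embeddings, centroids, temperature=1.0):
    # Soft assignment of each node to K clusters from (negative) distances.
    # embeddings: (N, d), centroids: (K, d) -> tendencies: (N, K)
    dists = torch.cdist(embeddings, centroids)
    return F.softmax(-dists / temperature, dim=-1)

def temporal_cluster_constraint(emb_t, emb_prev, centroids):
    # Penalize abrupt changes in nodes' cluster tendencies between snapshots.
    p_t = cluster_tendency(emb_t, centroids)
    p_prev = cluster_tendency(emb_prev, centroids)
    # KL divergence between previous and current tendencies, averaged over nodes.
    return F.kl_div(p_t.log(), p_prev, reduction="batchmean")

# Toy usage: 100 nodes, 16-dim embeddings, 5 clusters, two consecutive snapshots.
N, d, K = 100, 16, 5
emb_prev = torch.randn(N, d)
emb_t = emb_prev + 0.1 * torch.randn(N, d)
centroids = torch.randn(K, d)
print(temporal_cluster_constraint(emb_t, emb_prev, centroids).item())
```

In practice such a term would be added to a link-prediction or reconstruction loss; the sketch only shows the constraint itself.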
Related papers
- CTRL: Continuous-Time Representation Learning on Temporal Heterogeneous Information Network [32.42051167404171]
We propose a Continuous-Time Representation Learning model on temporal HINs.
We train the model with a future event (a subgraph) prediction task to capture the evolution of the high-order network structure.
The results demonstrate that our model significantly boosts performance and outperforms various state-of-the-art approaches.
arXiv Detail & Related papers (2024-05-11T03:39:22Z)
- Learning Persistent Community Structures in Dynamic Networks via Topological Data Analysis [2.615648035076649]
We propose a novel deep graph clustering framework with temporal consistency regularization on inter-community structures.
MFC is a matrix factorization-based deep graph clustering algorithm that preserves node embeddings.
TopoReg is introduced to ensure the preservation of topological similarity between inter-community structures over time intervals.
arXiv Detail & Related papers (2024-01-06T11:29:19Z)
- DANI: Fast Diffusion Aware Network Inference with Preserving Topological Structure Property [2.8948274245812327]
We propose a novel method called DANI to infer the underlying network while preserving its structural properties.
DANI has higher accuracy and lower run time while maintaining structural properties, including modular structure, degree distribution, connected components, density, and clustering coefficients.
arXiv Detail & Related papers (2023-10-02T23:23:00Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Direct Embedding of Temporal Network Edges via Time-Decayed Line Graphs [51.51417735550026]
Methods for machine learning on temporal networks generally exhibit at least one of two limitations.
We present a simple method that avoids both shortcomings: construct the line graph of the network, which includes a node for each interaction, and weigh the edges of this graph based on the difference in time between interactions.
Empirical results on real-world networks demonstrate our method's efficacy and efficiency on both edge classification and temporal link prediction.
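The summary above describes a concrete construction: one node per timestamped interaction, with line-graph edges weighted by how close the interactions are in time. Below is a small, hedged sketch of that idea using networkx; the exponential decay kernel, the `decay` parameter, and the function name are assumptions for illustration, not the paper's exact weighting.

```python
import math
import networkx as nx

def time_decayed_line_graph(interactions, decay=1.0):
    """Build a line graph over temporal interactions (u, v, t).

    Each interaction becomes a node; two interactions sharing an endpoint are
    connected, with a weight that decays in their time difference. The
    exponential kernel here is an illustrative choice.
    """
    L = nx.Graph()
    L.add_nodes_from(range(len(interactions)))
    for i, (u1, v1, t1) in enumerate(interactions):
        for j, (u2, v2, t2) in enumerate(interactions):
            if j <= i:
                continue
            # Interactions are adjacent in the line graph if they share a node.
            if {u1, v1} & {u2, v2}:
                L.add_edge(i, j, weight=math.exp(-decay * abs(t1 - t2)))
    return L

# Toy usage: three timestamped interactions on a small network.
events = [("a", "b", 0.0), ("b", "c", 1.0), ("a", "c", 5.0)]
print(time_decayed_line_graph(events).edges(data=True))
```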
arXiv Detail & Related papers (2022-09-30T18:24:13Z)
- Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z)
- Inductive Representation Learning in Temporal Networks via Causal Anonymous Walks [51.79552974355547]
Temporal networks serve as abstractions of many real-world dynamic systems.
We propose Causal Anonymous Walks (CAWs) to inductively represent a temporal network.
CAWs are extracted by temporal random walks and work as automatic retrieval of temporal network motifs.
arXiv Detail & Related papers (2021-01-15T05:47:26Z)
- EPNE: Evolutionary Pattern Preserving Network Embedding [26.06068388979255]
We propose EPNE, a temporal network embedding model preserving evolutionary patterns of the local structure of nodes.
With the adequate modeling of temporal information, our model is able to outperform other competitive methods in various prediction tasks.
arXiv Detail & Related papers (2020-09-24T06:31:14Z)
- TempNodeEmb: Temporal Node Embedding considering temporal edge influence matrix [0.8941624592392746]
Predicting future links among the nodes in temporal networks reveals an important aspect of the evolution of temporal networks.
Some approaches consider a simplified representation of temporal networks but rely on high-dimensional and generally sparse matrices.
We propose a new node embedding technique that exploits the evolving nature of the networks, using a simple three-layer graph neural network at each time step.
arXiv Detail & Related papers (2020-08-16T15:39:07Z)
- Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the phase space's properties.
Our approach is either as competitive as or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
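The liquid time-constant dynamics are commonly written as dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A, i.e. a linear first-order system whose decay and input coupling are modulated by a nonlinear gate. The sketch below is a toy, explicit-Euler version of a single cell in PyTorch under that assumed formulation; the class name, gate architecture, and fixed step size are illustrative, not the authors' ODE-solver implementation.

```python
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    """Toy liquid time-constant cell: dx/dt = -(1/tau + f) * x + f * A,
    integrated with a fixed-step explicit Euler scheme (illustrative only)."""

    def __init__(self, input_dim, hidden_dim, tau=1.0, A=1.0):
        super().__init__()
        # Nonlinear, input-dependent gate f(x, I).
        self.f = nn.Sequential(nn.Linear(input_dim + hidden_dim, hidden_dim),
                               nn.Sigmoid())
        self.tau, self.A = tau, A

    def forward(self, x, inp, dt=0.1, steps=6):
        for _ in range(steps):
            gate = self.f(torch.cat([inp, x], dim=-1))
            dx = -(1.0 / self.tau + gate) * x + gate * self.A  # first-order dynamics
            x = x + dt * dx                                    # explicit Euler step
        return x

# Toy usage: batch of 4 sequences, 3 input features, 8 hidden units.
cell = LTCCell(3, 8)
state = torch.zeros(4, 8)
print(cell(state, torch.randn(4, 3)).shape)  # torch.Size([4, 8])
```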