Community-Aware Temporal Walks: Parameter-Free Representation Learning on Continuous-Time Dynamic Graphs
- URL: http://arxiv.org/abs/2501.11880v1
- Date: Tue, 21 Jan 2025 04:16:46 GMT
- Title: Community-Aware Temporal Walks: Parameter-Free Representation Learning on Continuous-Time Dynamic Graphs
- Authors: He Yu, Jing Liu
- Abstract summary: Community-aware Temporal Walks (CTWalks) is a novel framework for representation learning on continuous-time dynamic graphs.
CTWalks integrates a community-based parameter-free temporal walk sampling mechanism, an anonymization strategy enriched with community labels, and an encoding process.
Experiments on benchmark datasets demonstrate that CTWalks outperforms established methods in temporal link prediction tasks.
- Score: 3.833708891059351
- License:
- Abstract: Dynamic graph representation learning plays a crucial role in understanding evolving behaviors. However, existing methods often struggle with flexibility, adaptability, and the preservation of temporal and structural dynamics. To address these issues, we propose Community-aware Temporal Walks (CTWalks), a novel framework for representation learning on continuous-time dynamic graphs. CTWalks integrates three key components: a community-based parameter-free temporal walk sampling mechanism, an anonymization strategy enriched with community labels, and an encoding process that leverages continuous temporal dynamics modeled via ordinary differential equations (ODEs). This design enables precise modeling of both intra- and inter-community interactions, offering a fine-grained representation of evolving temporal patterns in continuous-time dynamic graphs. CTWalks theoretically overcomes locality bias in walks and establishes its connection to matrix factorization. Experiments on benchmark datasets demonstrate that CTWalks outperforms established methods in temporal link prediction tasks, achieving higher accuracy while maintaining robustness.
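As a rough illustration of the first two components named in the abstract (parameter-free temporal walk sampling and community-label anonymization), the Python sketch below gives one plausible reading; it is not the authors' implementation. The toy event list, the hard-coded community assignment, the intra-/inter-community sampling rule, and the helper names sample_ctwalk and anonymize are all assumptions made for the example.

```python
import random
from collections import defaultdict

# Toy continuous-time dynamic graph: node -> list of (neighbor, timestamp) events.
events = defaultdict(list)
for u, v, t in [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 2.5), (1, 4, 3.0), (4, 5, 3.5)]:
    events[u].append((v, t))
    events[v].append((u, t))

# Hypothetical community assignment (in the paper this would come from a
# community-detection preprocessing step; here it is simply hard-coded).
community = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}

def sample_ctwalk(start, t_start, length):
    """Sample a time-respecting walk: every step moves strictly forward in time.
    Steps are drawn uniformly (no tunable bias parameters); within-community
    candidates are preferred over cross-community ones, which is an assumption
    about how community awareness could enter the sampler."""
    walk = [(start, t_start)]
    node, t = start, t_start
    for _ in range(length - 1):
        later = [(v, tv) for v, tv in events[node] if tv > t]
        if not later:
            break
        intra = [e for e in later if community[e[0]] == community[node]]
        node, t = random.choice(intra if intra else later)
        walk.append((node, t))
    return walk

def anonymize(walk):
    """Replace node ids by their first-occurrence index within the walk,
    enriched with each node's community label."""
    first_seen = {}
    out = []
    for node, t in walk:
        idx = first_seen.setdefault(node, len(first_seen))
        out.append((idx, community[node], t))
    return out

walk = sample_ctwalk(start=0, t_start=0.0, length=4)
print(anonymize(walk))  # [(0, 'A', 0.0), (1, 'A', 1.0), (2, 'A', 2.0), (3, 'B', 2.5)]
```

Drawing uniformly among time-respecting neighbors is what keeps a sampler of this kind free of tunable bias parameters, which is the property the abstract emphasizes.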
Related papers
- Decoupled Marked Temporal Point Process using Neural Ordinary Differential Equations [14.828081841581296]
A Marked Temporal Point Process (MTPP) is a process whose realization is a set of event-time data.
Recent studies have utilized deep neural networks to capture complex temporal dependencies of events.
We propose a Decoupled MTPP framework that disentangles characterization of a process into a set of evolving influences from different events.
arXiv Detail & Related papers (2024-06-10T10:15:32Z) - TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z) - Continuous-time Graph Representation with Sequential Survival Process [0.17265013728931003]
We propose a process relying on survival functions to model the durations of links and their absences over time.
GraSSP (Graph Representation with Sequential Survival Process) forms a generic new likelihood that explicitly accounts for intermittent edge-persistent networks.
We quantitatively assess the developed framework in various downstream tasks, such as link prediction and network completion.
arXiv Detail & Related papers (2023-12-20T14:46:54Z) - Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
arXiv Detail & Related papers (2023-08-02T12:04:28Z) - EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z) - Piecewise-Velocity Model for Learning Continuous-time Dynamic Node Representations [0.0]
We propose the Piecewise-Velocity Model (PiVeM) for representation of continuous-time dynamic networks.
We show that PiVeM can successfully represent network structure and dynamics in ultra-low two-dimensional spaces.
It outperforms relevant state-of-the-art methods in downstream tasks such as link prediction.
arXiv Detail & Related papers (2022-12-23T13:57:56Z) - DyTed: Disentangled Representation Learning for Discrete-time Dynamic Graph [59.583555454424]
We propose a novel disenTangled representation learning framework for discrete-time Dynamic graphs, namely DyTed.
We specially design a temporal-clips contrastive learning task together with a structural contrastive learning task to effectively identify the time-invariant and time-varying representations, respectively.
arXiv Detail & Related papers (2022-10-19T14:34:12Z) - ConTIG: Continuous Representation Learning on Temporal Interaction Graphs [32.25218861788686]
ConTIG is a continuous representation method that captures the continuous dynamic evolution of node embedding trajectories.
Our model exploits three factors in dynamic networks: the latest interaction, neighbor features, and inherent characteristics.
Experimental results demonstrate the superiority of ConTIG on temporal link prediction, temporal node recommendation and dynamic node classification tasks.
arXiv Detail & Related papers (2021-09-27T12:11:24Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - GRADE: Graph Dynamic Embedding [76.85156209917932]
GRADE is a probabilistic model that learns to generate evolving node and community representations by imposing a random walk prior over their trajectories.
Our model also learns node community membership, which is updated between time steps via a transition matrix (a minimal sketch of this update follows the list).
Experiments demonstrate GRADE outperforms baselines in dynamic link prediction, shows favourable performance on dynamic community detection, and identifies coherent and interpretable evolving communities.
arXiv Detail & Related papers (2020-07-16T01:17:24Z)
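The community-membership update mentioned in the GRADE entry above ("updated between time steps via a transition matrix") can be made concrete in a few lines. The sketch below is only an illustration of that idea under one plausible reading, with the matrix values, sizes, and variable names invented for the example; it is not GRADE's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, num_communities = 5, 3

# Per-node community membership distributions at time step t
# (each row sums to 1); initialised randomly for the toy example.
membership_t = rng.dirichlet(np.ones(num_communities), size=num_nodes)

# Row-stochastic community transition matrix: entry [i, j] is the probability
# that a node in community i moves to community j between consecutive steps
# (values here are made up for illustration).
transition = np.array([
    [0.8, 0.1, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.8],
])

# Membership at the next step: propagate each node's distribution
# through the transition matrix.
membership_t1 = membership_t @ transition

assert np.allclose(membership_t1.sum(axis=1), 1.0)  # rows remain distributions
print(membership_t1.round(3))
```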
This list is automatically generated from the titles and abstracts of the papers on this site.