Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution
- URL: http://arxiv.org/abs/2207.00594v1
- Date: Fri, 1 Jul 2022 15:32:56 GMT
- Title: Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution
- Authors: Yu Yang, Hongzhi Yin, Jiannong Cao, Tong Chen, Quoc Viet Hung Nguyen,
Xiaofang Zhou and Lei Chen
- Abstract summary: Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining
time of vertices (ToV) and the timespan of edges (ToE).
A time-aware Transformer is proposed to embed vertices' dynamic connections and
ToEs into the learned vertex representations.
- Score: 60.695162101159134
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Dynamic graphs refer to graphs whose structure dynamically changes over time.
Despite the benefits of learning vertex representations (i.e., embeddings) for
dynamic graphs, existing works merely view a dynamic graph as a sequence of
changes within the vertex connections, neglecting the crucial asynchronous
nature of such dynamics where the evolution of each local structure starts at
different times and lasts for various durations. To maintain asynchronous
structural evolutions within the graph, we innovatively formulate dynamic
graphs as temporal edge sequences associated with joining time of vertices
(ToV) and timespan of edges (ToE). Then, a time-aware Transformer is proposed
to embed vertices' dynamic connections and ToEs into the learned vertex
representations. Meanwhile, we treat each edge sequence as a whole and embed
its ToV of the first vertex to further encode the time-sensitive information.
Extensive evaluations on several datasets show that our approach outperforms
the state-of-the-art in a wide range of graph mining tasks. At the same time,
it is very efficient and scalable for embedding large-scale dynamic graphs.
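The formulation above (temporal edge sequences carrying ToV and ToE) can be sketched as a small data structure. The following is a minimal illustration only, not the paper's implementation: the names `TemporalEdge`, `EdgeSequence`, and `build_sequence` are invented here, and ToE is interpreted as the gap between consecutive events in a sequence, which is one plausible reading of "timespan of edges".

```python
from dataclasses import dataclass, field

@dataclass
class TemporalEdge:
    """An edge (u, v) observed at time t, with its timespan (ToE):
    how long after the previous event in the sequence it appeared."""
    u: int
    v: int
    t: float
    toe: float

@dataclass
class EdgeSequence:
    """A temporal edge sequence. `tov` is the joining time of the
    sequence's first vertex (ToV), encoded once per sequence."""
    tov: float
    edges: list = field(default_factory=list)

def build_sequence(events, tov):
    """Turn time-stamped events [(u, v, t), ...] (sorted by t) into an
    EdgeSequence whose ToEs are the inter-event gaps."""
    seq = EdgeSequence(tov=tov)
    prev_t = tov
    for u, v, t in events:
        seq.edges.append(TemporalEdge(u, v, t, toe=t - prev_t))
        prev_t = t
    return seq
```

For example, a vertex joining at time 1.0 with edges at times 2.0 and 5.0 yields ToEs of 1.0 and 3.0, so edges that form quickly after one another are distinguishable from long-separated ones.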
Related papers
- Supra-Laplacian Encoding for Transformer on Dynamic Graphs [14.293220696079919]
We present a new spatio-temporal encoding for the graph Transformer (GT) architecture that preserves temporal information.
Specifically, we transform discrete-time dynamic graphs into multi-layer graphs and take advantage of the spectral properties of their associated supra-Laplacian matrix.
Our second contribution explicitly models pairwise node relationships with a cross-attention mechanism, providing an accurate edge representation for dynamic link prediction.
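As a rough illustration of the supra-Laplacian idea, here is a sketch using the standard multiplex-network construction (not necessarily this paper's exact encoding): diagonal blocks hold each snapshot's adjacency, identity couplings connect each node to its copy in the adjacent layer, and the low-frequency eigenvectors can then serve as positional encodings for a Transformer.

```python
import numpy as np

def supra_laplacian(layers, coupling=1.0):
    """Supra-Laplacian of a multi-layer graph.

    `layers` is a list of (n x n) adjacency matrices, one per snapshot.
    Intra-layer adjacency fills the diagonal blocks; identity blocks
    weighted by `coupling` link consecutive layers."""
    n = layers[0].shape[0]
    num_layers = len(layers)
    A = np.zeros((n * num_layers, n * num_layers))
    for k, adj in enumerate(layers):
        A[k*n:(k+1)*n, k*n:(k+1)*n] = adj
    eye = coupling * np.eye(n)
    for k in range(num_layers - 1):
        A[k*n:(k+1)*n, (k+1)*n:(k+2)*n] = eye
        A[(k+1)*n:(k+2)*n, k*n:(k+1)*n] = eye
    D = np.diag(A.sum(axis=1))
    return D - A  # combinatorial Laplacian of the supra-graph

def spectral_encoding(layers, dim):
    """Eigenvectors of the smallest nonzero eigenvalues, usable as
    positional encodings for a graph Transformer."""
    L = supra_laplacian(layers)
    _, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    return vecs[:, 1:dim + 1]    # skip the trivial constant eigenvector
```

Each row of the encoding corresponds to one (node, layer) copy, so the same node receives different positional features at different times.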
arXiv Detail & Related papers (2024-09-26T15:56:40Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Decoupled Graph Neural Networks for Large Dynamic Graphs [14.635923016087503]
We propose a decoupled graph neural network for large dynamic graphs.
We show that our algorithm achieves state-of-the-art performance in both kinds of dynamic graphs.
arXiv Detail & Related papers (2023-05-14T23:00:10Z)
- Dynamic Graph Representation Learning via Edge Temporal States Modeling and Structure-reinforced Transformer [5.093187534912688]
We introduce the Recurrent Structure-reinforced Graph Transformer (RSGT), a novel framework for dynamic graph representation learning.
RSGT captures temporal node representations encoding both graph topology and evolving dynamics through a recurrent learning paradigm.
We show RSGT's superior performance in discrete dynamic graph representation learning, consistently outperforming existing methods in dynamic link prediction tasks.
arXiv Detail & Related papers (2023-04-20T04:12:50Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Self-Supervised Temporal Graph learning with Temporal and Structural Intensity Alignment [53.72873672076391]
Temporal graph learning aims to generate high-quality representations for graph-based tasks with dynamic information.
We propose a self-supervised method called S2T for temporal graph learning, which extracts both temporal and structural information.
S2T achieves at most 10.13% performance improvement compared with the state-of-the-art competitors on several datasets.
arXiv Detail & Related papers (2023-02-15T06:36:04Z)
- Multi-Task Edge Prediction in Temporally-Dynamic Video Graphs [16.121140184388786]
We propose MTD-GNN, a graph network for predicting temporally-dynamic edges for multiple types of relations.
We show that modeling multiple relations in our temporal-dynamic graph network can be mutually beneficial.
arXiv Detail & Related papers (2022-12-06T10:41:00Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
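Dilated convolution, which the summary above pairs with the hierarchical graph structure, captures scale-specific correlations by letting each output depend on inputs spaced a fixed dilation apart, so stacked layers with growing dilation cover exponentially larger time scales. A minimal causal 1-D version, written here in plain NumPy purely as an illustration rather than as part of any of these models:

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """Causal 1-D dilated convolution: output[t] depends on
    x[t], x[t - d], x[t - 2d], ... with d = dilation.
    The series is zero-padded on the left to keep causality."""
    k = len(kernel)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), x])
    out = np.zeros(len(x))
    for t in range(len(x)):
        for i in range(k):
            # tap i reaches back i * dilation steps
            out[t] += kernel[i] * xp[pad + t - i * dilation]
    return out
```

For instance, a two-tap sum kernel with dilation 2 on `[1, 2, 3, 4]` produces `[1, 2, 4, 6]`: each output adds the value two steps back, skipping the immediate neighbor.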
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
- Dynamic Graph Learning-Neural Network for Multivariate Time Series Modeling [2.3022070933226217]
We propose a novel framework, namely the static- and dynamic-graph learning-neural network (GL).
The model acquires static and dynamic graph matrices from data to model long-term and short-term patterns respectively.
It achieves state-of-the-art performance on almost all datasets.
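One common way to acquire a static graph matrix from data, used in MTGNN-style forecasting models and sketched here purely as an illustration (this paper's exact construction may differ), is to derive an adjacency matrix from two learnable node-embedding tables and sparsify it per row:

```python
import numpy as np

def learned_static_graph(E1, E2, k=2):
    """Derive a directed adjacency matrix from two node-embedding
    tables E1, E2 (each n x d), keeping only the top-k neighbors
    per row. The antisymmetric score makes the graph uni-directional
    for each node pair; ReLU keeps the dominant direction."""
    scores = np.tanh(E1 @ E2.T - E2 @ E1.T)
    A = np.maximum(scores, 0.0)
    # sparsify: zero everything outside each row's top-k entries
    for i in range(A.shape[0]):
        weakest = np.argsort(A[i])[:-k]
        A[i, weakest] = 0.0
    return A
```

In a full model the tables `E1` and `E2` would be trained end-to-end with the forecasting loss, while a dynamic counterpart would be recomputed from the current input window at each step.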
arXiv Detail & Related papers (2021-12-06T08:19:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.