FTM: A Frame-level Timeline Modeling Method for Temporal Graph
Representation Learning
- URL: http://arxiv.org/abs/2302.11814v1
- Date: Thu, 23 Feb 2023 06:53:16 GMT
- Title: FTM: A Frame-level Timeline Modeling Method for Temporal Graph
Representation Learning
- Authors: Bowen Cao, Qichen Ye, Weiyuan Xu, Yuexian Zou
- Abstract summary: We propose a Frame-level Timeline Modeling (FTM) method that helps to capture both short-term and long-term features.
Our method can be easily assembled with most temporal GNNs.
- Score: 47.52733127616005
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Learning representations for graph-structured data is essential for graph
analytical tasks. While remarkable progress has been made on static graphs,
research on temporal graphs is still in its early stages. The bottleneck of
temporal graph representation learning approaches is the neighborhood
aggregation strategy, through which graph attributes explicitly share and
gather information. Existing neighborhood aggregation strategies fail to
capture either the short-term features or the long-term features of temporal
graph attributes, leading to unsatisfactory model performance and even poor
robustness and domain generality of the representation learning method. To
address this problem, we propose a Frame-level Timeline Modeling (FTM) method
that helps to capture both short-term and long-term features and thus learns
more informative representations on temporal graphs. In particular, we present
a novel link-based framing technique to preserve the short-term features and
then incorporate a timeline aggregator module to capture the intrinsic dynamics
of graph evolution as long-term features. Our method can be easily assembled
with most temporal GNNs. Extensive experiments on common datasets show that our
method brings great improvements to the capability, robustness, and domain
generality of backbone methods in downstream tasks. Our code can be found at
https://github.com/yeeeqichen/FTM.
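To make the high-level description above concrete, here is a minimal, hypothetical sketch of the frame-level timeline idea: a node's time-ordered interactions are split into fixed-size, link-based frames (short-term features), each frame is encoded by a stand-in for the temporal-GNN backbone, and a recurrent timeline aggregator summarizes the frame sequence (long-term features). This is not the authors' implementation (see the repository above for the official code); all module and parameter names are assumptions.
```python
# Hypothetical sketch of frame-level timeline modeling (not the official FTM code).
import torch
import torch.nn as nn


class FrameTimelineSketch(nn.Module):
    """Split a node's temporal neighborhood into link-based frames,
    encode each frame (short-term view), then aggregate the frame
    sequence along the timeline (long-term view)."""

    def __init__(self, feat_dim: int, frame_size: int = 8):
        super().__init__()
        self.frame_size = frame_size
        # Stand-in for a temporal-GNN backbone that encodes one frame.
        self.frame_encoder = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(), nn.Linear(feat_dim, feat_dim)
        )
        # Timeline aggregator: a GRU over the ordered frame embeddings.
        self.timeline_aggregator = nn.GRU(feat_dim, feat_dim, batch_first=True)

    def forward(self, neighbor_feats: torch.Tensor) -> torch.Tensor:
        # neighbor_feats: (num_links, feat_dim), links ordered by timestamp.
        num_links, feat_dim = neighbor_feats.shape
        pad = (-num_links) % self.frame_size
        if pad:  # zero-pad so the links split evenly into frames
            neighbor_feats = torch.cat(
                [neighbor_feats, neighbor_feats.new_zeros(pad, feat_dim)], dim=0
            )
        # Link-based framing: consecutive links form one frame (short-term features).
        frames = neighbor_feats.view(-1, self.frame_size, feat_dim)
        frame_emb = self.frame_encoder(frames).mean(dim=1)   # (num_frames, feat_dim)
        # Timeline aggregation over the frame sequence (long-term features).
        _, last_state = self.timeline_aggregator(frame_emb.unsqueeze(0))
        return last_state.squeeze(0).squeeze(0)              # (feat_dim,)


# Usage: 20 time-ordered interactions with 32-dim features for one target node.
node_repr = FrameTimelineSketch(feat_dim=32)(torch.randn(20, 32))
```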
Related papers
- Dynamic and Textual Graph Generation Via Large-Scale LLM-based Agent Simulation [70.60461609393779]
GraphAgent-Generator (GAG) is a novel simulation-based framework for dynamic graph generation.
Our framework effectively replicates seven macro-level structural characteristics in established network science theories.
It supports generating graphs with up to nearly 100,000 nodes or 10 million edges, with a minimum speed-up of 90.4%.
arXiv Detail & Related papers (2024-10-13T12:57:08Z) - TempoKGAT: A Novel Graph Attention Network Approach for Temporal Graph Analysis [3.5707423185282656]
This paper presents a new type of graph attention network, called TempoKGAT, which combines time-decaying weight and a selective neighbor aggregation mechanism on the spatial domain.
We evaluate our approach on multiple datasets from the traffic, energy, and health sectors involving temporal data.
arXiv Detail & Related papers (2024-08-29T09:54:46Z) - TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z) - Time-aware Graph Structure Learning via Sequence Prediction on Temporal
Graphs [10.034072706245544]
We propose a Time-aware Graph Structure Learning (TGSL) approach via sequence prediction on temporal graphs.
In particular, it predicts a time-aware context embedding and uses Gumbel-Top-K to select the candidate edges closest to this context embedding (a generic sketch of this selection step follows the related-papers list).
Experiments on temporal link prediction benchmarks demonstrate that TGSL yields significant gains for the popular TGNs such as TGAT and GraphMixer.
arXiv Detail & Related papers (2023-06-13T11:34:36Z) - Deep Temporal Graph Clustering [77.02070768950145]
We propose a general framework for deep Temporal Graph Clustering (TGC).
TGC introduces deep clustering techniques to suit the interaction sequence-based batch-processing pattern of temporal graphs.
Our framework can effectively improve the performance of existing temporal graph learning methods.
arXiv Detail & Related papers (2023-05-18T06:17:50Z) - TodyNet: Temporal Dynamic Graph Neural Network for Multivariate Time
Series Classification [6.76723360505692]
We propose a novel temporal dynamic graph neural network (TodyNet) that can extract hidden temporal dependencies without a predefined graph structure.
The experiments on 26 UEA benchmark datasets illustrate that the proposed TodyNet outperforms existing deep learning-based methods in the MTSC tasks.
arXiv Detail & Related papers (2023-04-11T09:21:28Z) - Graph-Free Learning in Graph-Structured Data: A More Efficient and
Accurate Spatiotemporal Learning Perspective [11.301939428860404]
This paper proposes a Spatial Graph-Free (SGF) learning module based on normalization for capturing spatial correlations in spatiotemporal learning.
Rigorous theoretical proof demonstrates that its time complexity is significantly better than that of the graph convolution operation.
arXiv Detail & Related papers (2023-01-27T14:26:11Z) - Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToE).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z) - Multivariate Time Series Classification with Hierarchical Variational
Graph Pooling [23.66868187446734]
Existing deep learning-based MTSC techniques are primarily concerned with the temporal dependency of single time series.
We propose a novel graph pooling-based framework MTPool to obtain the expressive global representation of MTS.
Experiments on ten benchmark datasets exhibit MTPool outperforms state-of-the-art strategies in the MTSC task.
arXiv Detail & Related papers (2020-10-12T12:36:47Z) - From Static to Dynamic Node Embeddings [61.58641072424504]
We introduce a general framework for leveraging graph stream data for temporal prediction-based applications.
Our proposed framework includes novel methods for learning an appropriate graph time-series representation.
We find that the top-3 temporal models are always those that leverage the new $\epsilon$-graph time-series representation.
arXiv Detail & Related papers (2020-09-21T16:48:29Z)
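As a side note on the TGSL entry above, the Gumbel-Top-K selection it mentions is a generic trick: perturb similarity scores with Gumbel noise and take the top-k, which corresponds to sampling k candidates without replacement in proportion to their softmax probabilities. The sketch below illustrates only that step and is not TGSL's actual code; the function name and arguments are assumptions.
```python
# Generic Gumbel-Top-K candidate-edge selection (illustrative only).
import torch


def gumbel_top_k_edges(context_emb: torch.Tensor,
                       candidate_emb: torch.Tensor,
                       k: int,
                       tau: float = 1.0) -> torch.Tensor:
    """Select k candidate edges whose embeddings score highest against the
    time-aware context embedding after Gumbel perturbation; equivalent to
    sampling k edges without replacement with probabilities ∝ softmax(scores / tau)."""
    # Similarity scores between the context and each candidate edge.
    scores = candidate_emb @ context_emb                      # (num_candidates,)
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1).
    gumbel = -torch.log(-torch.log(torch.rand_like(scores) + 1e-20) + 1e-20)
    # Perturbed top-k selection.
    return torch.topk(scores / tau + gumbel, k).indices       # indices of chosen edges


# Usage: choose 5 of 100 candidate edges for a 32-dim context embedding.
selected = gumbel_top_k_edges(torch.randn(32), torch.randn(100, 32), k=5)
```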