On the Power of Heuristics in Temporal Graphs
- URL: http://arxiv.org/abs/2502.04910v1
- Date: Fri, 07 Feb 2025 13:28:31 GMT
- Title: On the Power of Heuristics in Temporal Graphs
- Authors: Filip Cornell, Oleg Smirnov, Gabriela Zarzar Gandler, Lele Cao,
- Abstract summary: We introduce metrics that quantify the impact of recency and popularity across datasets.
Results emphasize the importance of refined evaluation schemes to enable fair comparisons and promote the development of more robust temporal graph models.
- Score: 2.5957835343537266
- Abstract: Dynamic graph datasets often exhibit strong temporal patterns, such as recency, which prioritizes recent interactions, and popularity, which favors frequently occurring nodes. We demonstrate that simple heuristics leveraging only these patterns can perform on par or outperform state-of-the-art neural network models under standard evaluation protocols. To further explore these dynamics, we introduce metrics that quantify the impact of recency and popularity across datasets. Our experiments on BenchTemp and the Temporal Graph Benchmark show that our approaches achieve state-of-the-art performance across all datasets in the latter and secure top ranks on multiple datasets in the former. These results emphasize the importance of refined evaluation schemes to enable fair comparisons and promote the development of more robust temporal graph models. Additionally, they reveal that current deep learning methods often struggle to capture the key patterns underlying predictions in real-world temporal graphs. For reproducibility, we have made our code publicly available.
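The recency and popularity heuristics described in the abstract can be pictured with a small sketch. The snippet below is a minimal illustration under our own assumptions, not the authors' released code: the class name, the half_life parameter, and the exact way recency and popularity are combined are hypothetical; it only assumes that interactions arrive as (source, destination, timestamp) tuples in time order.

```python
from collections import defaultdict
import math


class RecencyPopularityHeuristic:
    """Minimal sketch: rank candidate destinations by recency and popularity.

    Interactions are (src, dst, t) tuples processed in time order. All names
    and scoring details are illustrative assumptions, not the paper's code.
    """

    def __init__(self, half_life=3600.0):
        self.half_life = half_life       # assumed exponential-decay horizon (seconds)
        self.last_seen = {}              # node -> timestamp of its last interaction
        self.counts = defaultdict(int)   # node -> number of interactions so far

    def update(self, src, dst, t):
        # Record an observed interaction so later queries reflect it.
        for node in (src, dst):
            self.last_seen[node] = t
            self.counts[node] += 1

    def score(self, dst, t):
        # Recency: decays with time since the candidate was last active.
        if dst in self.last_seen:
            recency = math.exp(-(t - self.last_seen[dst]) * math.log(2) / self.half_life)
        else:
            recency = 0.0
        # Popularity: how often the candidate has appeared so far.
        return (recency, self.counts[dst])

    def rank(self, candidates, t):
        # Order candidates by recency, breaking ties by popularity.
        return sorted(candidates, key=lambda d: self.score(d, t), reverse=True)


# Toy usage on a tiny stream: rank before each edge is revealed, update after.
if __name__ == "__main__":
    events = [("u1", "v1", 10.0), ("u2", "v1", 20.0), ("u1", "v2", 30.0)]
    model = RecencyPopularityHeuristic(half_life=60.0)
    for src, dst, t in events:
        print(model.rank(["v1", "v2", "v3"], t))
        model.update(src, dst, t)
```

In a streaming evaluation, such a baseline is queried before each edge is revealed and updated afterwards, which is how optimization-free heuristics are typically compared against neural models under ranking metrics.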
Related papers
- A Deep Probabilistic Framework for Continuous Time Dynamic Graph Generation [4.568104644312763]
We formalize this approach as DG-Gen, a generative framework for continuous time dynamic graphs.
Our experiments demonstrate that DG-Gen not only generates higher fidelity graphs compared to traditional methods but also significantly advances link prediction tasks.
arXiv Detail & Related papers (2024-12-20T05:34:11Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Temporal graph models fail to capture global temporal dynamics [0.43512163406552007]
We propose a trivial optimization-free baseline of "recently popular nodes".
We show how standard negative sampling evaluation can be unsuitable for datasets with strong temporal dynamics.
Our results indicate that temporal graph network architectures need deep rethinking for usage in problems with significant global dynamics. (A simplified sketch of this evaluation issue appears after the related-papers list below.)
arXiv Detail & Related papers (2023-09-27T15:36:45Z)
- From random-walks to graph-sprints: a low-latency node embedding framework on continuous-time dynamic graphs [4.372841335228306]
We propose a framework for continuous-time dynamic graphs (CTDGs) that has low latency and is competitive with state-of-the-art, higher-latency models.
In our framework, time-aware node embeddings summarizing multi-hop information are computed using only single-hop operations on the incoming edges.
We demonstrate that our graph-sprints features, combined with machine learning, achieve competitive performance.
arXiv Detail & Related papers (2023-07-17T12:25:52Z)
- Temporal Graph Benchmark for Machine Learning on Temporal Graphs [54.52243310226456]
Temporal Graph Benchmark (TGB) is a collection of challenging and diverse benchmark datasets.
We benchmark each dataset and find that the performance of common models can vary drastically across datasets.
TGB provides an automated machine learning pipeline for reproducible and accessible temporal graph research.
arXiv Detail & Related papers (2023-07-03T13:58:20Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- Taming Local Effects in Graph-based Spatiotemporal Forecasting [28.30604130617646]
Spatiotemporal graph neural networks have been shown to be effective in time series forecasting applications.
This paper aims to understand the interplay between globality and locality in graph-based spatiotemporal forecasting.
We propose a methodological framework to rationalize the practice of including trainable node embeddings in such architectures.
arXiv Detail & Related papers (2023-02-08T14:18:56Z)
- A Graph-Enhanced Click Model for Web Search [67.27218481132185]
We propose a novel graph-enhanced click model (GraphCM) for web search.
We exploit both intra-session and inter-session information to address the sparsity and cold-start problems.
arXiv Detail & Related papers (2022-06-17T08:32:43Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Catastrophic Forgetting in Deep Graph Networks: an Introductory Benchmark for Graph Classification [12.423303337249795]
We study the phenomenon of catastrophic forgetting in the graph representation learning scenario.
We find that replay is the most effective strategy so far, and it also benefits the most from the use of regularization.
arXiv Detail & Related papers (2021-03-22T12:07:21Z)
- From Static to Dynamic Node Embeddings [61.58641072424504]
We introduce a general framework for leveraging graph stream data for temporal prediction-based applications.
Our proposed framework includes novel methods for learning an appropriate graph time-series representation.
We find that the top-3 temporal models are always those that leverage the new $\epsilon$-graph time-series representation.
arXiv Detail & Related papers (2020-09-21T16:48:29Z)
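The "recently popular nodes" baseline and the negative-sampling concern raised in the "Temporal graph models fail to capture global temporal dynamics" entry above, echoed by the main abstract's call for refined evaluation schemes, can be illustrated with the hedged sketch below. It reflects our own assumptions: the pure-recency scorer, the choice of 20 negatives, and the "historical" sampling strategy are illustrative and do not reproduce the protocol of any specific benchmark.

```python
import random


def mrr_with_negatives(events, num_neg=20, historical=False, seed=0):
    """Sketch: MRR of a pure recency scorer against sampled negatives.

    With historical=True, negatives are drawn from destinations already seen
    in the stream, which typically makes the ranking task harder than uniform
    random sampling; exact protocols differ across benchmarks.
    """
    rng = random.Random(seed)
    last_seen = {}                                    # node -> last interaction time
    seen_dsts = []                                    # destinations observed so far
    all_dsts = sorted({dst for _, dst, _ in events})
    reciprocal_ranks = []

    for src, dst, t in events:
        pool = seen_dsts if (historical and seen_dsts) else all_dsts
        negatives = [d for d in (rng.choice(pool) for _ in range(num_neg)) if d != dst]
        if negatives:
            # More recently active candidates rank higher; unseen nodes rank last.
            candidates = sorted(negatives + [dst],
                                key=lambda n: last_seen.get(n, float("-inf")),
                                reverse=True)
            reciprocal_ranks.append(1.0 / (candidates.index(dst) + 1))
        # Update state only after evaluating, so the scorer never peeks ahead.
        last_seen[src] = t
        last_seen[dst] = t
        seen_dsts.append(dst)

    return sum(reciprocal_ranks) / len(reciprocal_ranks) if reciprocal_ranks else 0.0
```

If the gap between the uniform and historical settings is large, the ranking metric is largely rewarding recency, which is the kind of effect these papers argue standard evaluation protocols can mask.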