When Speed meets Accuracy: an Efficient and Effective Graph Model for Temporal Link Prediction
- URL: http://arxiv.org/abs/2507.13825v1
- Date: Fri, 18 Jul 2025 11:29:15 GMT
- Title: When Speed meets Accuracy: an Efficient and Effective Graph Model for Temporal Link Prediction
- Authors: Haoyang Li, Yuming Xu, Yiming Li, Hanmo Liu, Darian Li, Chen Jason Zhang, Lei Chen, Qing Li,
- Abstract summary: Temporal Graph Neural Networks (T-GNNs) have achieved notable success by leveraging complex architectures to model temporal and structural dependencies. We propose a lightweight framework that integrates short-term temporal recency and long-term global structural patterns.
- Score: 20.093092172339286
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal link prediction in dynamic graphs is a critical task with applications in diverse domains such as social networks, recommendation systems, and e-commerce platforms. While existing Temporal Graph Neural Networks (T-GNNs) have achieved notable success by leveraging complex architectures to model temporal and structural dependencies, they often suffer from scalability and efficiency challenges due to high computational overhead. In this paper, we propose EAGLE, a lightweight framework that integrates short-term temporal recency and long-term global structural patterns. EAGLE consists of a time-aware module that aggregates information from a node's most recent neighbors to reflect its immediate preferences, and a structure-aware module that leverages temporal personalized PageRank to capture the influence of globally important nodes. To balance these attributes, EAGLE employs an adaptive weighting mechanism to dynamically adjust their contributions based on data characteristics. Also, EAGLE eliminates the need for complex multi-hop message passing or memory-intensive mechanisms, enabling significant improvements in efficiency. Extensive experiments on seven real-world temporal graphs demonstrate that EAGLE consistently achieves superior performance against state-of-the-art T-GNNs in both effectiveness and efficiency, delivering more than a 50x speedup over effective transformer-based T-GNNs.
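The abstract describes EAGLE as a convex combination of a recency-based time-aware score and a temporal-PageRank-based structural score. The paper itself gives no reference code; the sketch below is a minimal, hypothetical illustration of that two-module design, not the authors' implementation. All function names and parameters (`recency_score`, `global_score`, `link_score`, `tau`, `alpha`, the fixed mixing `weight`) are illustrative assumptions, and plain power-iteration PageRank with age-decayed edge weights stands in for the paper's temporal personalized PageRank.

```python
import math
from collections import defaultdict

# Events are (source, destination, timestamp) interaction triples.

def recency_score(events, u, v, t, k=5, tau=1.0):
    """Time-aware module sketch: how strongly does v appear among u's k most
    recent neighbors before time t, with exponential time decay?"""
    hist = sorted((ts, nbr) for src, nbr, ts in events if src == u and ts < t)
    return sum(math.exp(-(t - ts) / tau) for ts, nbr in hist[-k:] if nbr == v)

def global_score(events, v, t, alpha=0.15, iters=20):
    """Structure-aware module sketch: PageRank by power iteration on edges
    observed before t, with edge weights decayed by age (a simple stand-in
    for the temporal personalized PageRank described in the abstract)."""
    out = defaultdict(dict)
    nodes = set()
    for u, w, ts in events:
        if ts < t:
            out[u][w] = out[u].get(w, 0.0) + math.exp(-(t - ts))
            nodes.update((u, w))
    if not nodes:
        return 0.0
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        nxt = {n: alpha / len(nodes) for n in nodes}  # teleport mass
        for u in nodes:
            total = sum(out[u].values())
            if total == 0:  # dangling node: spread mass uniformly
                for n in nodes:
                    nxt[n] += (1 - alpha) * rank[u] / len(nodes)
            else:
                for w, wt in out[u].items():
                    nxt[w] += (1 - alpha) * rank[u] * wt / total
        rank = nxt
    return rank.get(v, 0.0)

def link_score(events, u, v, t, weight=0.5):
    """Combine both modules. EAGLE learns this weight adaptively from data
    characteristics; a fixed convex combination is used here for brevity."""
    return weight * recency_score(events, u, v, t) + \
        (1 - weight) * global_score(events, v, t)
```

For example, with events `[("a","b",1.0), ("b","c",2.0), ("a","c",3.0)]`, the recency module scores the candidate link `(a, c)` above `(a, b)` at query time `t=4.0`, since the `a→c` interaction is more recent.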
Related papers
- DWAFM: Dynamic Weighted Graph Structure Embedding Integrated with Attention and Frequency-Domain MLPs for Traffic Forecasting [12.788467568098817]
This letter proposes a novel dynamic weighted graph structure (DWGS) embedding method. It relies on a graph structure that can truly reflect the changes in the strength of dynamic associations between nodes over time. Experiments on five real-world traffic datasets show that DWAFM achieves better prediction performance than state-of-the-art methods.
arXiv Detail & Related papers (2026-03-01T08:50:41Z) - Global-Lens Transformers: Adaptive Token Mixing for Dynamic Link Prediction [9.234363752442915]
We propose GLFormer, a novel attention-free Transformer-style framework for dynamic graphs. Experiments on six widely-used dynamic graph benchmarks show that GLFormer achieves SOTA performance, which reveals that attention-free architectures can match or surpass Transformer baselines.
arXiv Detail & Related papers (2025-11-16T04:05:56Z) - Dynamic Generation of Multi-LLM Agents Communication Topologies with Graph Diffusion Models [99.85131798240808]
We introduce a novel generative framework called Guided Topology Diffusion (GTD). Inspired by conditional discrete graph diffusion models, GTD formulates topology synthesis as an iterative construction process. At each step, the generation is steered by a lightweight proxy model that predicts multi-objective rewards. Experiments show that GTD can generate highly task-adaptive, sparse, and efficient communication topologies.
arXiv Detail & Related papers (2025-10-09T05:28:28Z) - GILT: An LLM-Free, Tuning-Free Graph Foundational Model for In-Context Learning [50.40400074353263]
Graph Neural Networks (GNNs) are powerful tools for processing relational data but often struggle to generalize to unseen graphs. We introduce the Graph In-context Learning Transformer (GILT), a framework built on an LLM-free and tuning-free architecture.
arXiv Detail & Related papers (2025-10-06T08:09:15Z) - Decoupling Spatio-Temporal Prediction: When Lightweight Large Models Meet Adaptive Hypergraphs [12.867023510751787]
STH-SepNet is a novel framework that decouples temporal and spatial expressiveness to achieve both efficiency and precision. STH-SepNet offers a pragmatic and scalable solution for spatio-temporal prediction in real-world applications. This work may provide a promising lightweight framework for spatio-temporal prediction, aiming to reduce computational demands while enhancing predictive performance.
arXiv Detail & Related papers (2025-05-26T07:37:39Z) - Higher-order Structure Boosts Link Prediction on Temporal Graphs [33.483289891869426]
Temporal Graph Neural Networks (TGNNs) have gained growing attention for modeling and predicting structures in temporal graphs. We propose a Higher-order structure Temporal Graph Neural Network, which incorporates hypergraph representations into temporal graph learning. We show that HTGN achieves superior performance on dynamic link prediction while reducing memory costs by up to 50% compared to existing methods.
arXiv Detail & Related papers (2025-05-21T16:51:44Z) - Instruction-Guided Autoregressive Neural Network Parameter Generation [49.800239140036496]
We propose IGPG, an autoregressive framework that unifies parameter synthesis across diverse tasks and architectures. By autoregressively generating neural network weight tokens, IGPG ensures inter-layer coherence and enables efficient adaptation across models and datasets. Experiments on multiple datasets demonstrate that IGPG consolidates diverse pretrained models into a single, flexible generative framework.
arXiv Detail & Related papers (2025-04-02T05:50:19Z) - RGL: A Graph-Centric, Modular Framework for Efficient Retrieval-Augmented Generation on Graphs [58.10503898336799]
We introduce the RAG-on-Graphs Library (RGL), a modular framework that seamlessly integrates the complete RAG pipeline. RGL addresses key challenges by supporting a variety of graph formats and integrating optimized implementations for essential components. Our evaluations demonstrate that RGL not only accelerates the prototyping process but also enhances the performance and applicability of graph-based RAG systems.
arXiv Detail & Related papers (2025-03-25T03:21:48Z) - RelGNN: Composite Message Passing for Relational Deep Learning [56.48834369525997]
We introduce RelGNN, a novel GNN framework specifically designed to leverage the unique structural characteristics of the graphs built from relational databases. RelGNN is evaluated on 30 diverse real-world tasks from Relbench (Fey et al., 2024), and achieves state-of-the-art performance on the vast majority of tasks, with improvements of up to 25%.
arXiv Detail & Related papers (2025-02-10T18:58:40Z) - ScaDyG:A New Paradigm for Large-scale Dynamic Graph Learning [31.629956388962814]
ScaDyG is a time-aware scalable learning paradigm for dynamic graph networks. Experiments on 12 datasets demonstrate that ScaDyG performs comparably to or even outperforms other SOTA methods in both node- and link-level downstream tasks.
arXiv Detail & Related papers (2025-01-27T12:39:16Z) - SIG: Efficient Self-Interpretable Graph Neural Network for Continuous-time Dynamic Graphs [34.269958289295516]
We aim to predict future links within the dynamic graph while simultaneously providing causal explanations for these predictions.
To tackle these challenges, we propose a novel causal inference model, namely the Independent and Confounded Causal Model (ICCM).
Our proposed model significantly outperforms existing methods across link prediction accuracy, explanation quality, and robustness to shortcut features.
arXiv Detail & Related papers (2024-05-29T13:09:33Z) - TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z) - PREM: A Simple Yet Effective Approach for Node-Level Graph Anomaly Detection [65.24854366973794]
Node-level graph anomaly detection (GAD) plays a critical role in identifying anomalous nodes from graph-structured data in domains such as medicine, social networks, and e-commerce.
We introduce a simple method termed PREprocessing and Matching (PREM for short) to improve the efficiency of GAD.
Our approach streamlines GAD, reducing time and memory consumption while maintaining powerful anomaly detection capabilities.
arXiv Detail & Related papers (2023-10-18T02:59:57Z) - EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability. EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z) - Automated Dilated Spatio-Temporal Synchronous Graph Modeling for Traffic Prediction [1.6449390849183363]
We propose an automated dilated spatio-temporal synchronous graph network named Auto-DSTS for traffic prediction. Specifically, we propose an automated dilated spatio-temporal synchronous graph (Auto-DSTS) module to capture the short-term and long-term spatio-temporal correlations. Our model can achieve about 10% improvement compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-07-22T00:50:39Z) - Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.