Coden: Efficient Temporal Graph Neural Networks for Continuous Prediction
- URL: http://arxiv.org/abs/2602.12613v1
- Date: Fri, 13 Feb 2026 04:39:42 GMT
- Title: Coden: Efficient Temporal Graph Neural Networks for Continuous Prediction
- Authors: Zulun Zhu, Siqiang Luo
- Abstract summary: Temporal Graph Neural Networks (TGNNs) are pivotal in processing dynamic graphs. Existing TGNNs primarily target one-time predictions for a given temporal span, whereas many practical applications require continuous predictions, i.e., predictions issued frequently over time. This paper revisits the challenge of continuous prediction in TGNNs and introduces Coden, a TGNN model designed for efficient and effective learning on dynamic graphs.
- Score: 15.907958072038904
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal Graph Neural Networks (TGNNs) are pivotal in processing dynamic graphs. However, existing TGNNs primarily target one-time predictions for a given temporal span, whereas many practical applications require continuous predictions, i.e., predictions issued frequently over time. Directly adapting existing TGNNs to continuous-prediction scenarios introduces either significant computational overhead or prediction quality issues, especially for large graphs. This paper revisits the challenge of continuous prediction in TGNNs and introduces Coden, a TGNN model designed for efficient and effective learning on dynamic graphs. Coden innovatively overcomes the key complexity bottleneck in existing TGNNs while preserving comparable predictive accuracy. Moreover, we provide theoretical analyses that substantiate the effectiveness and efficiency of Coden and clarify its duality relationship with both RNN-based and attention-based models. Our evaluations across five dynamic datasets show that Coden surpasses existing performance benchmarks in both efficiency and effectiveness, establishing it as a superior solution for continuous prediction in evolving graph environments.
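The difference between one-time and continuous prediction can be made concrete with a small sketch. The snippet below only illustrates the two usage patterns, not Coden itself; the `TemporalGNN` stub, its `update`/`predict` methods, and the event format are hypothetical placeholders.

```python
# Illustrative sketch of one-time vs. continuous prediction on a dynamic graph.
# "TemporalGNN", its methods, and the event format are hypothetical placeholders,
# not the Coden architecture described in the paper.

class TemporalGNN:
    """Stub temporal GNN keeping simple per-node state (e.g., memory values)."""
    def __init__(self):
        self.node_state = {}

    def update(self, events):
        # Fold a batch of timestamped edges (u, v, t) into node states.
        for u, v, t in events:
            self.node_state[u] = self.node_state.get(u, 0.0) + 1.0
            self.node_state[v] = self.node_state.get(v, 0.0) + 1.0

    def predict(self, queries):
        # Score candidate links from the current state (placeholder logic).
        return [self.node_state.get(u, 0.0) * self.node_state.get(v, 0.0)
                for u, v in queries]


def one_time_prediction(model, event_stream, queries):
    # Typical TGNN usage: ingest the whole temporal span, then predict once.
    for batch in event_stream:
        model.update(batch)
    return model.predict(queries)


def continuous_prediction(model, event_stream, queries):
    # Continuous setting: predictions are issued after every incoming batch,
    # so the per-step update cost dominates overall efficiency.
    outputs = []
    for batch in event_stream:
        model.update(batch)
        outputs.append(model.predict(queries))
    return outputs
```

In the continuous loop, `predict` is called after every batch of events, so the cost of keeping node states up to date dominates; this is the complexity bottleneck the abstract refers to.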
Related papers
- Test-time GNN Model Evaluation on Dynamic Graphs [52.31268996286955]
We propose a Dynamic Graph neural network Evaluator, dubbed DyGEval, to address this new problem. The proposed DyGEval involves a two-stage framework: (1) test-time dynamic graph simulation, which captures the training-test distributional differences as supervision signals and trains an evaluator; and (2) DyGEval development and training, which accurately estimates the performance of the well-trained DGNN model on the test-time dynamic graphs.
arXiv Detail & Related papers (2025-09-28T11:40:37Z)
- Non-exchangeable Conformal Prediction for Temporal Graph Neural Networks [11.01716974299811]
Conformal prediction for graph neural networks (GNNs) offers a promising framework for quantifying uncertainty. Existing methods predominantly focus on static graphs, neglecting the evolving nature of real-world graphs. We introduce NCPNET, a novel end-to-end conformal prediction framework tailored for temporal graphs.
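As background on the non-exchangeable setting, conformal quantiles can be reweighted so that recent calibration examples count more than stale ones. The sketch below shows a generic weighted-quantile computation; the recency-decay weights are an assumption for illustration and do not come from NCPNET.

```python
import numpy as np

def weighted_conformal_quantile(scores, weights, alpha):
    """(1 - alpha) quantile of weighted calibration scores, with the test point's
    unit mass placed at +infinity (standard construction for non-exchangeable CP)."""
    scores = np.asarray(scores, dtype=float)
    weights = np.asarray(weights, dtype=float)
    order = np.argsort(scores)
    s, w = scores[order], weights[order]
    w = w / (w.sum() + 1.0)          # normalize, reserving weight 1 for the test point
    cdf = np.cumsum(w)
    idx = np.searchsorted(cdf, 1.0 - alpha)
    return np.inf if idx >= len(s) else s[idx]

# Illustrative use: down-weight older calibration examples on a temporal graph.
cal_scores = np.array([0.12, 0.40, 0.25, 0.80, 0.33])  # nonconformity scores
ages = np.array([5, 4, 3, 2, 1])                        # steps since observed
weights = 0.9 ** ages                                    # recency decay (assumed)
q = weighted_conformal_quantile(cal_scores, weights, alpha=0.25)
# Prediction set: all candidates whose nonconformity score is <= q.
```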
arXiv Detail & Related papers (2025-07-02T21:15:00Z)
- PPT-GNN: A Practical Pre-Trained Spatio-Temporal Graph Neural Network for Network Security [3.0558245652654907]
PPTGNN is a practical, large-scale pre-trained model for intrusion detection.
It enables near real-time predictions, while better capturing the temporal dynamics of network attacks.
It significantly outperforms state-of-the-art models, such as E-ResGAT and E-GraphSAGE, with an average accuracy improvement of 10.38%.
arXiv Detail & Related papers (2024-06-19T09:09:46Z)
- SIG: Efficient Self-Interpretable Graph Neural Network for Continuous-time Dynamic Graphs [34.269958289295516]
We aim to predict future links within the dynamic graph while simultaneously providing causal explanations for these predictions.
To tackle these challenges, we propose a novel causal inference model, namely the Independent and Confounded Causal Model (ICCM).
Our proposed model significantly outperforms existing methods across link prediction accuracy, explanation quality, and robustness to shortcut features.
arXiv Detail & Related papers (2024-05-29T13:09:33Z)
- Dynamic Graph Unlearning: A General and Efficient Post-Processing Method via Gradient Transformation [24.20087360102464]
We study dynamic graph unlearning for the first time and propose an effective, efficient, and general post-processing method to implement DGNN unlearning. Our method has the potential to handle future unlearning requests with significant performance gains.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Graph-enabled Reinforcement Learning for Time Series Forecasting with Adaptive Intelligence [11.249626785206003]
We propose a novel approach for predicting time-series data using Graph Neural Networks (GNNs) and monitoring with Reinforcement Learning (RL).
GNNs are able to explicitly incorporate the graph structure of the data into the model, allowing them to capture temporal dependencies in a more natural way.
This approach allows for more accurate predictions in complex temporal structures, such as those found in healthcare, traffic and weather forecasting.
arXiv Detail & Related papers (2023-09-18T22:25:12Z)
- Uncertainty Quantification over Graph with Conformalized Graph Neural Networks [52.20904874696597]
Graph Neural Networks (GNNs) are powerful machine learning prediction models on graph-structured data.
GNNs lack rigorous uncertainty estimates, limiting their reliable deployment in settings where the cost of errors is significant.
We propose conformalized GNN (CF-GNN), extending conformal prediction (CP) to graph-based models for guaranteed uncertainty estimates.
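For readers unfamiliar with the underlying recipe, a minimal split-conformal construction over a classifier's softmax outputs looks like the sketch below. It is a generic illustration rather than the CF-GNN procedure; in a graph setting, the probabilities would simply be a GNN's outputs on disjoint calibration and test node splits.

```python
import numpy as np

def split_conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Generic split conformal prediction for classification.

    cal_probs:  (n_cal, n_classes) softmax outputs on held-out calibration nodes
    cal_labels: (n_cal,) true labels of the calibration nodes
    test_probs: (n_test, n_classes) softmax outputs on test nodes
    Returns a boolean (n_test, n_classes) matrix of prediction sets.
    """
    n = len(cal_labels)
    # Nonconformity score: 1 - probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile level.
    level = np.ceil((n + 1) * (1.0 - alpha)) / n
    qhat = np.quantile(scores, min(level, 1.0), method="higher")
    # Include every class whose score would not exceed the threshold.
    return (1.0 - test_probs) <= qhat
```

Under exchangeability of calibration and test points, sets built this way contain the true label with probability at least 1 - alpha, which is the coverage guarantee conformal prediction provides.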
arXiv Detail & Related papers (2023-05-23T21:38:23Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
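The general idea behind "infinite-depth" layers is to compute an equilibrium representation rather than stacking finitely many layers. The sketch below is a generic implicit GNN layer solved by fixed-point iteration, not EIGNN's closed-form formulation; the contraction factor `gamma` and the small weight scale are assumptions that keep the iteration convergent.

```python
import torch

def implicit_gnn_layer(adj_norm, x, weight, gamma=0.8, n_iter=50, tol=1e-5):
    """Solve Z = gamma * A_norm @ Z @ W + X by fixed-point iteration.

    When gamma and the norms of A_norm and W are small enough, the map is a
    contraction, so the iteration converges to an 'infinite-depth' equilibrium.
    """
    z = torch.zeros_like(x)
    for _ in range(n_iter):
        z_next = gamma * adj_norm @ z @ weight + x
        if torch.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

# Tiny usage example with a random row-normalized adjacency (assumed shapes).
n, d = 6, 4
adj = torch.rand(n, n)
adj_norm = adj / adj.sum(dim=1, keepdim=True)  # row-normalized adjacency
x = torch.randn(n, d)                           # input node features
w = 0.1 * torch.randn(d, d)                     # small weight -> contraction
z_star = implicit_gnn_layer(adj_norm, x, w)
```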
arXiv Detail & Related papers (2022-02-22T08:16:58Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly, and the performance gain becomes larger for noisier datasets.
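The mechanism described above, a parameterized network scoring edges plus a penalty on how many survive, can be sketched generically as follows. The sigmoid edge mask and L1-style sparsity term are simplifying assumptions, not PTDNet's exact parameterization.

```python
import torch
import torch.nn as nn

class EdgeScorer(nn.Module):
    """Scores each edge from the features of its two endpoint nodes."""
    def __init__(self, feat_dim, hidden=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, edge_index):
        # edge_index: (2, num_edges) tensor of [source; target] node ids.
        src, dst = edge_index
        pair = torch.cat([x[src], x[dst]], dim=-1)
        return torch.sigmoid(self.mlp(pair)).squeeze(-1)  # soft mask in (0, 1)

def sparsified_loss(task_loss, edge_mask, lam=1e-3):
    # Penalize the (soft) number of retained edges to drop task-irrelevant ones.
    return task_loss + lam * edge_mask.sum()

# Usage sketch: downweight messages by the mask during aggregation, then train
# end to end with sparsified_loss(cross_entropy(gnn(x, masked_adj), y), mask).
```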
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
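A common way to obtain binary network parameters while keeping training stable is sign binarization with a straight-through gradient estimator. The layer below illustrates that generic recipe and is not necessarily the BGN formulation.

```python
import torch
import torch.nn as nn

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through gradient estimator."""
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        # Pass gradients through, but only where |w| <= 1 (clipped STE).
        return grad_out * (w.abs() <= 1).float()

class BinaryGraphLayer(nn.Module):
    """Graph layer whose weights are binarized in the forward pass."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_dim, out_dim) * 0.1)

    def forward(self, adj_norm, x):
        w_bin = BinarizeSTE.apply(self.weight)
        return torch.relu(adj_norm @ x @ w_bin)  # aggregate, then transform
```

Binary weights let matrix products be replaced by additions and sign flips at inference time, which is where the reported time and space savings come from.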
arXiv Detail & Related papers (2020-04-19T09:43:14Z)