Input Snapshots Fusion for Scalable Discrete-Time Dynamic Graph Neural Networks
- URL: http://arxiv.org/abs/2405.06975v2
- Date: Wed, 12 Feb 2025 04:48:53 GMT
- Title: Input Snapshots Fusion for Scalable Discrete-Time Dynamic Graph Neural Networks
- Authors: QingGuo Qi, Hongyang Chen, Minhao Cheng, Han Liu
- Abstract summary: We propose SFDyG, which combines Hawkes processes with graph neural networks to capture temporal and structural patterns in dynamic graphs effectively.
By fusing multiple snapshots into a single temporal graph, SFDyG decouples computational complexity from the number of snapshots, enabling efficient full-batch and mini-batch training.
- Score: 27.616083395612595
- Abstract: In recent years, there has been a surge in research on dynamic graph representation learning, primarily focusing on modeling the evolution of temporal-spatial patterns in real-world applications. However, within the domain of discrete-time dynamic graphs, temporal edges remain underexplored. Existing approaches often rely on additional sequential models to capture dynamics, leading to high computational and memory costs, particularly for large-scale graphs. To address this limitation, we propose the Input Snapshots Fusion based Dynamic Graph Neural Network (SFDyG), which combines Hawkes processes with graph neural networks to capture temporal and structural patterns in dynamic graphs effectively. By fusing multiple snapshots into a single temporal graph, SFDyG decouples computational complexity from the number of snapshots, enabling efficient full-batch and mini-batch training. Experimental evaluations on eight diverse dynamic graph datasets for future link prediction tasks demonstrate that SFDyG consistently outperforms existing methods.
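To make the fusion idea concrete, below is a minimal sketch (not the authors' code) of how several snapshots could be merged into one temporal multigraph whose edges keep their snapshot index, with Hawkes-style exponential-decay weights over a node's past interactions. The exponential kernel, the normalization, and all function names here are illustrative assumptions; SFDyG's exact parameterization may differ.

```python
import math
from collections import defaultdict

def fuse_snapshots(snapshots):
    """Fuse snapshot edge lists into one temporal multigraph.

    snapshots[t] is the edge list [(u, v), ...] of snapshot t. The result maps
    each node to its temporal neighbors (neighbor, t), so downstream GNN cost
    depends on the total edge count, not on the number of snapshots.
    """
    temporal_adj = defaultdict(list)
    for t, edges in enumerate(snapshots):
        for u, v in edges:
            temporal_adj[u].append((v, t))
            temporal_adj[v].append((u, t))
    return temporal_adj

def hawkes_weights(temporal_neighbors, t_now, decay=0.5):
    """Hawkes-style influence: recent interactions excite more than old ones."""
    raw = [math.exp(-decay * (t_now - t)) for _, t in temporal_neighbors]
    total = sum(raw) or 1.0
    return [w / total for w in raw]

# Usage: fuse three snapshots, then weight node 0's neighbors at time step 3.
adj = fuse_snapshots([[(0, 1)], [(0, 2), (1, 2)], [(0, 1)]])
print(hawkes_weights(adj[0], t_now=3))
```

Because the fused graph is a single edge list, a message-passing layer can apply these decay weights once per target node, so training cost scales with the number of temporal edges rather than the snapshot count, which is the decoupling the abstract describes.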
Related papers
- ScaDyG:A New Paradigm for Large-scale Dynamic Graph Learning [31.629956388962814]
ScaDyG is a time-aware scalable learning paradigm for dynamic graph networks.
Experiments on 12 datasets demonstrate that ScaDyG performs comparably to or even outperforms other SOTA methods in both node- and link-level downstream tasks.
arXiv Detail & Related papers (2025-01-27T12:39:16Z)
- DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2024-08-13T15:21:46Z)
- State Space Models on Temporal Graphs: A First-Principles Study [30.531930200222423]
Research on deep graph learning has shifted from static graphs to temporal graphs in response to real-world complex systems that exhibit dynamic behaviors.
Sequence models such as RNNs or Transformers have long been the predominant backbone networks for modeling such temporal graphs.
We develop GraphSSM, a graph state space model for modeling the dynamics of temporal graphs.
arXiv Detail & Related papers (2024-06-03T02:56:11Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Decoupled Graph Neural Networks for Large Dynamic Graphs [14.635923016087503]
We propose a decoupled graph neural network for large dynamic graphs.
We show that our algorithm achieves state-of-the-art performance on both kinds of dynamic graphs.
arXiv Detail & Related papers (2023-05-14T23:00:10Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over a node's whole temporal neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Time-aware Random Walk Diffusion to Improve Dynamic Graph Learning [3.4012007729454816]
TiaRa is a novel diffusion-based method for augmenting a dynamic graph represented as a discrete-time sequence of graph snapshots.
We show that TiaRa effectively augments a given dynamic graph, and leads to significant improvements in dynamic GNN models for various graph datasets and tasks.
arXiv Detail & Related papers (2022-11-02T15:55:46Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependencies via the training loss to improve parallelism in computation.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
- Temporal Graph Networks for Deep Learning on Dynamic Graphs [4.5158585619109495]
We present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events.
Thanks to a novel combination of memory modules and graph-based operators, TGNs significantly outperform previous approaches while also being more computationally efficient (see the memory-update sketch after this list).
arXiv Detail & Related papers (2020-06-18T16:06:18Z)
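As an illustration of the memory-module idea behind TGNs (Rossi et al., 2020), here is a minimal PyTorch sketch of a per-node memory updated by a GRU cell on interaction events. The dimensions, the concatenation-based message function, and the class name are assumptions for illustration; the published TGN additionally batches events, aggregates concurrent messages per node, trains through the updater, and uses time encodings.

```python
import torch
import torch.nn as nn

class TGNStyleMemory(nn.Module):
    """Per-node memory updated on interaction events (TGN-style sketch)."""

    def __init__(self, num_nodes: int, mem_dim: int, msg_dim: int):
        super().__init__()
        # One memory vector per node, initialized to zero.
        self.register_buffer("memory", torch.zeros(num_nodes, mem_dim))
        # GRU consumes [src memory || dst memory || edge features].
        self.updater = nn.GRUCell(2 * mem_dim + msg_dim, mem_dim)

    @torch.no_grad()  # sketch only: the real TGN backpropagates through this
    def update(self, src: torch.Tensor, dst: torch.Tensor, edge_msg: torch.Tensor):
        # Build a raw message per event from both endpoint memories.
        m = torch.cat([self.memory[src], self.memory[dst], edge_msg], dim=-1)
        # Overwrite the source nodes' memories with the GRU output.
        self.memory[src] = self.updater(m, self.memory[src])

# Usage: three events (src -> dst) with 8-dimensional edge features.
mem = TGNStyleMemory(num_nodes=100, mem_dim=16, msg_dim=8)
src = torch.tensor([0, 5, 9])
dst = torch.tensor([3, 2, 7])
mem.update(src, dst, torch.randn(3, 8))
```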