Input Snapshots Fusion for Scalable Discrete Dynamic Graph Neural Networks
- URL: http://arxiv.org/abs/2405.06975v1
- Date: Sat, 11 May 2024 10:05:55 GMT
- Title: Input Snapshots Fusion for Scalable Discrete Dynamic Graph Neural Networks
- Authors: QingGuo Qi, Hongyang Chen, Minhao Cheng, Han Liu
- Abstract summary: We introduce an Input Snapshots Fusion based Dynamic Graph Neural Network (SFDyG).
By eliminating the partitioning of snapshots within the input window, we obtain a multi-graph (a graph with more than one edge allowed between two nodes).
We propose a scalable three-step mini-batch training method and demonstrate its equivalence to its full-batch counterpart.
- Score: 27.616083395612595
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic graphs are ubiquitous in the real world, yet there is a lack of suitable theoretical frameworks to effectively extend existing static graph models into the temporal domain. Additionally, for link prediction tasks on discrete dynamic graphs, the requirement of substantial GPU memory to store embeddings of all nodes hinders the scalability of existing models. In this paper, we introduce an Input Snapshots Fusion based Dynamic Graph Neural Network (SFDyG). By eliminating the partitioning of snapshots within the input window, we obtain a multi-graph (a graph with more than one edge allowed between two nodes). Subsequently, by introducing a graph denoising problem under a temporally decayed smoothing assumption, we integrate Hawkes process theory into Graph Neural Networks to model the generated multi-graph. Furthermore, based on the multi-graph, we propose a scalable three-step mini-batch training method and demonstrate its equivalence to its full-batch counterpart. Our experiments, conducted on eight distinct dynamic graph datasets for future link prediction tasks, revealed that SFDyG generally surpasses related methods.
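As a rough illustration of the fusion step described in the abstract, the sketch below merges a window of snapshots into a single weighted multi-graph, giving each edge an exponentially decayed weight by snapshot age in the spirit of a Hawkes-style temporal decay. The function name `fuse_snapshots`, the `decay` parameter, and the exact weighting are illustrative assumptions, not the paper's implementation.

```python
import math

def fuse_snapshots(snapshots, decay=0.5):
    """Fuse a window of graph snapshots into one weighted multi-graph.

    `snapshots` is a list of edge lists, oldest first; each edge is (u, v).
    Each edge gets a Hawkes-style weight exp(-decay * age), so edges from
    older snapshots contribute less (the temporal decayed smoothing idea).
    Parallel edges between the same node pair are kept rather than merged,
    which is what makes the result a multi-graph.
    """
    num_snapshots = len(snapshots)
    multigraph = []  # (u, v, weight) triples; duplicates allowed
    for t, edges in enumerate(snapshots):
        age = (num_snapshots - 1) - t  # newest snapshot has age 0
        weight = math.exp(-decay * age)
        for u, v in edges:
            multigraph.append((u, v, weight))
    return multigraph

# Example: three snapshots in the input window, oldest first
snaps = [[(0, 1)], [(0, 1), (1, 2)], [(2, 3)]]
for edge in fuse_snapshots(snaps):
    print(edge)  # e.g. (0, 1, 0.3678...), (0, 1, 0.6065...), ...
```

Keeping parallel edges, rather than collapsing them into one summed edge, preserves the per-snapshot timing information that the Hawkes-based GNN can then exploit.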
Related papers
- ScaDyG: A New Paradigm for Large-scale Dynamic Graph Learning [31.629956388962814]
ScaDyG is a time-aware scalable learning paradigm for dynamic graph networks.
Experiments on 12 datasets demonstrate that ScaDyG performs comparably to, or even outperforms, other SOTA methods in both node-level and link-level downstream tasks.
arXiv Detail & Related papers (2025-01-27T12:39:16Z) - DyG-Mamba: Continuous State Space Modeling on Dynamic Graphs [59.434893231950205]
Dynamic graph learning aims to uncover evolutionary laws in real-world systems.
We propose DyG-Mamba, a new continuous state space model for dynamic graph learning.
We show that DyG-Mamba achieves state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2024-08-13T15:21:46Z) - State Space Models on Temporal Graphs: A First-Principles Study [30.531930200222423]
Research on deep graph learning has shifted from static graphs to temporal graphs in response to real-world complex systems that exhibit dynamic behaviors.
Sequence models such as RNNs or Transformers have long been the predominant backbone networks for modeling such temporal graphs.
We develop GraphSSM, a graph state space model for modeling the dynamics of temporal graphs.
arXiv Detail & Related papers (2024-06-03T02:56:11Z) - TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Decoupled Graph Neural Networks for Large Dynamic Graphs [14.635923016087503]
We propose a decoupled graph neural network for large dynamic graphs.
We show that our algorithm achieves state-of-the-art performance in both kinds of dynamic graphs.
arXiv Detail & Related papers (2023-05-14T23:00:10Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel temporal graph convolution method that aggregates over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - Time-aware Random Walk Diffusion to Improve Dynamic Graph Learning [3.4012007729454816]
TiaRa is a novel diffusion-based method for augmenting a dynamic graph represented as a discrete-time sequence of graph snapshots.
We show that TiaRa effectively augments a given dynamic graph, and leads to significant improvements in dynamic GNN models for various graph datasets and tasks.
arXiv Detail & Related papers (2022-11-02T15:55:46Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z) - Temporal Graph Networks for Deep Learning on Dynamic Graphs [4.5158585619109495]
We present Temporal Graph Networks (TGNs), a generic, efficient framework for deep learning on dynamic graphs represented as sequences of timed events.
Thanks to a novel combination of memory modules and graph-based operators, TGNs significantly outperform previous approaches while being more computationally efficient.
arXiv Detail & Related papers (2020-06-18T16:06:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.