Scaling Up Dynamic Graph Representation Learning via Spiking Neural
Networks
- URL: http://arxiv.org/abs/2208.10364v3
- Date: Thu, 18 May 2023 11:57:10 GMT
- Title: Scaling Up Dynamic Graph Representation Learning via Spiking Neural
Networks
- Authors: Jintang Li, Zhouxin Yu, Zulun Zhu, Liang Chen, Qi Yu, Zibin Zheng,
Sheng Tian, Ruofan Wu, Changhua Meng
- Abstract summary: We present a scalable framework, namely SpikeNet, to efficiently capture the temporal and structural patterns of temporal graphs.
As a low-power alternative to RNNs, SNNs explicitly model graph dynamics as spike trains of neuron populations.
SpikeNet generalizes to a large temporal graph with significantly fewer parameters and computation overheads.
- Score: 23.01100055999135
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent years have seen a surge in research on dynamic graph representation
learning, which aims to model temporal graphs that are dynamic and evolving
constantly over time. However, current work typically models graph dynamics
with recurrent neural networks (RNNs), making them suffer seriously from
computation and memory overheads on large temporal graphs. So far, scalability
of dynamic graph representation learning on large temporal graphs remains one
of the major challenges. In this paper, we present a scalable framework, namely
SpikeNet, to efficiently capture the temporal and structural patterns of
temporal graphs. We explore a new direction in that we can capture the evolving
dynamics of temporal graphs with spiking neural networks (SNNs) instead of
RNNs. As a low-power alternative to RNNs, SNNs explicitly model graph dynamics
as spike trains of neuron populations and enable spike-based propagation in an
efficient way. Experiments on three large real-world temporal graph datasets
demonstrate that SpikeNet outperforms strong baselines on the temporal node
classification task with lower computational costs. Particularly, SpikeNet
generalizes to a large temporal graph (2.7M nodes and 13.9M edges) with
significantly fewer parameters and computation overheads. Our code is publicly
available at \url{https://github.com/EdisonLeeeee/SpikeNet}.
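To make the spike-based propagation idea concrete, below is a minimal PyTorch-style sketch of a leaky integrate-and-fire (LIF) neuron standing in for an RNN cell when encoding a sequence of graph snapshots. This is an illustration of the general technique, not the authors' implementation (see the linked repository for SpikeNet itself); the module names, the sum aggregation, and the values of `tau` and `v_threshold` are assumptions made for the example.

```python
import torch
import torch.nn as nn


class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire neuron: the membrane potential integrates
    inputs over time and emits a binary spike once it crosses a threshold."""

    def __init__(self, tau: float = 2.0, v_threshold: float = 1.0):
        super().__init__()
        self.tau = tau                  # membrane time constant (assumed value)
        self.v_threshold = v_threshold  # firing threshold (assumed value)

    def forward(self, x, v):
        v = v + (x - v) / self.tau               # leaky integration
        spike = (v >= self.v_threshold).float()  # binary spike output
        v = v * (1.0 - spike)                    # hard reset where fired
        return spike, v


class SpikingSnapshotEncoder(nn.Module):
    """At each snapshot, aggregate neighbor features and feed the result to a
    shared LIF neuron; the temporal state is a spike train, not an RNN state."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)
        self.lif = LIFNeuron()

    def forward(self, feats, adjs):
        # feats: [N, in_dim] node features; adjs: one adjacency per snapshot.
        v = torch.zeros(feats.size(0), self.lin.out_features)
        spikes = []
        for adj in adjs:
            h = self.lin(adj @ feats)  # simple neighborhood aggregation
            s, v = self.lif(h, v)
            spikes.append(s)
        return torch.stack(spikes, dim=1)  # [N, T, hid_dim] spike trains


# Toy usage: 3 snapshots over 5 nodes with 4 features each.
feats = torch.rand(5, 4)
adjs = [torch.eye(5) for _ in range(3)]
print(SpikingSnapshotEncoder(4, 8)(feats, adjs).shape)  # torch.Size([5, 3, 8])
```

Note that the hard threshold above is non-differentiable, so SNN training in practice relies on a surrogate gradient for the spike function; the forward pass alone already shows why propagation is cheap, since downstream messages are binary spikes rather than dense floats.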
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Dynamic Spiking Framework for Graph Neural Networks [26.08442716817432]
We present a framework named Dynamic Spiking Graph Neural Networks to mitigate the information loss problem.
Experiments on three large-scale real-world dynamic graphs validate the effectiveness of the method.
arXiv Detail & Related papers (2023-12-15T12:45:47Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-al Graph Neural Network (DVGNN) fortemporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error result.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Dynamic Graph Node Classification via Time Augmentation [15.580277876084873]
We propose the Time Augmented Graph Dynamic Neural Network (TADGNN) framework for node classification on dynamic graphs.
TADGNN consists of two modules: 1) a time augmentation module that captures the temporal evolution of nodes across time structurally, creating a time-augmented spatio-temporal graph, and 2) an information propagation module that learns the dynamic representations for each node across time using the constructed time-augmented graph.
Experimental results demonstrate that the TADGNN framework outperforms several static and dynamic state-of-the-art (SOTA) GNN models while exhibiting superior scalability.
arXiv Detail & Related papers (2022-12-07T04:13:23Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z)
- Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network [2.0047096160313456]
Dynamic graph neural networks have received increasing attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
arXiv Detail & Related papers (2022-01-04T23:52:24Z)
- Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
- Learning to Evolve on Dynamic Graphs [5.1521870302904125]
Learning to Evolve on Dynamic Graphs (LEDG) is a novel algorithm that jointly learns graph information and time information.
LEDG is model-agnostic and can train any message passing based graph neural network (GNN) on dynamic graphs.
arXiv Detail & Related papers (2021-11-13T04:09:30Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training (a toy sketch of graphon sampling follows this entry).
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
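As a side note on the last entry, graphon sampling is easy to state in code. The following hypothetical NumPy sketch draws Bernoulli graphs of growing size from a fixed toy graphon; the graphon `W`, the size schedule, and the function name are illustrative assumptions, not the paper's code.

```python
import numpy as np

def sample_graph_from_graphon(graphon, n, seed=None):
    """Draw an n-node graph from a graphon W: [0,1]^2 -> [0,1]: sample latent
    positions uniformly, then each edge as an independent Bernoulli draw."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)                  # latent node positions
    probs = graphon(u[:, None], u[None, :])  # edge probabilities W(u_i, u_j)
    upper = np.triu(rng.uniform(size=(n, n)) < probs, k=1)  # one draw per pair
    return (upper | upper.T).astype(float)   # symmetric, zero diagonal

# A smooth toy graphon and a growing size schedule in the spirit of the entry above.
W = lambda x, y: 0.8 * np.exp(-3.0 * np.abs(x - y))
for n in (100, 200, 400, 800):
    A = sample_graph_from_graphon(W, n, seed=n)
    # ... train the GNN for a few epochs on A, reusing its weights ...
```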
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.