Instant Graph Neural Networks for Dynamic Graphs
- URL: http://arxiv.org/abs/2206.01379v1
- Date: Fri, 3 Jun 2022 03:27:42 GMT
- Title: Instant Graph Neural Networks for Dynamic Graphs
- Authors: Yanping Zheng, Hanzhi Wang, Zhewei Wei, Jiajun Liu, Sibo Wang
- Abstract summary: We propose Instant Graph Neural Network (InstantGNN), an incremental computation approach for maintaining the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates to the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
- Score: 18.916632816065935
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have been widely used for modeling
graph-structured data. With the development of numerous GNN variants, recent
years have witnessed groundbreaking results in improving the scalability of
GNNs to work on static graphs with millions of nodes. However, how to instantly
represent continuous changes of large-scale dynamic graphs with GNNs is still
an open problem. Existing dynamic GNNs focus on modeling the periodic evolution
of graphs, often on a snapshot basis. Such methods suffer from two drawbacks:
first, there is a substantial delay before changes in the graph are
reflected in the graph representations, resulting in a loss of model
accuracy; second, repeatedly calculating the representation matrix on the
entire graph for each snapshot is highly time-consuming and severely
limits scalability. In this paper, we propose Instant Graph Neural Network
(InstantGNN), an incremental computation approach for the graph representation
matrix of dynamic graphs. Designed for dynamic graphs under the edge-arrival
model, our method avoids time-consuming, repetitive computations and allows
instant updates to the representation and instant predictions. Both dynamic
structures and dynamic attributes are supported. We also provide upper bounds
on the time complexity of these updates. Furthermore, our method provides an
adaptive training strategy, which guides the model to retrain at the moments
when retraining yields the greatest performance gains. We conduct extensive
experiments on several real-world and synthetic datasets. Empirical results
demonstrate that our model achieves state-of-the-art accuracy while having
orders-of-magnitude higher efficiency than existing methods.
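To make the incremental idea concrete, below is a minimal, illustrative sketch of push-based maintenance of a personalized-PageRank-style representation under edge arrivals, in the spirit of dynamic forward push. It is not the paper's exact algorithm: InstantGNN works with generalized propagation and feature matrices, and its update rules differ; all names, constants, and the per-edge correction shown here are illustrative assumptions.

```python
from collections import defaultdict, deque

ALPHA = 0.15  # teleport probability (illustrative)
EPS = 1e-6    # per-degree residual threshold (illustrative)

def push(graph, p, r):
    """Forward push: drain residual mass r into estimates p until
    every node's residual is below EPS times its degree."""
    queue = deque(u for u in list(r) if abs(r[u]) > EPS * max(len(graph[u]), 1))
    while queue:
        u = queue.popleft()
        deg = len(graph[u])
        if deg == 0 or abs(r[u]) <= EPS * deg:
            continue
        mass, r[u] = r[u], 0.0
        p[u] += ALPHA * mass                  # keep the alpha-fraction locally
        for w in graph[u]:
            r[w] += (1 - ALPHA) * mass / deg  # spread the rest to neighbors
            if abs(r[w]) > EPS * len(graph[w]):
                queue.append(w)

def insert_edge(graph, p, r, u, v):
    """Instant update for one undirected edge arrival (u, v), assumed new:
    rescale the endpoints' estimates so their neighbors' invariants are
    untouched, correct the endpoints' residuals, then push only where needed."""
    du, dv = len(graph[u]), len(graph[v])
    if du:
        r[u] -= p[u] / (ALPHA * du)
        r[v] += (1 - ALPHA) * p[u] / (ALPHA * du)
        p[u] *= (du + 1) / du
    if dv:
        r[v] -= p[v] / (ALPHA * dv)
        r[u] += (1 - ALPHA) * p[v] / (ALPHA * dv)
        p[v] *= (dv + 1) / dv
    graph[u].add(v)
    graph[v].add(u)
    push(graph, p, r)  # localized repair instead of full recomputation

# Usage: bootstrap once from a source node, then apply instant per-edge updates.
graph = defaultdict(set)
p, r = defaultdict(float), defaultdict(float)
r[0] = 1.0                       # source node 0
insert_edge(graph, p, r, 0, 1)   # each arrival triggers only local work
insert_edge(graph, p, r, 1, 2)
```

The adaptive training strategy mentioned in the abstract sits on top of such updates: the model tracks how far the maintained representations have drifted and schedules retraining when the expected accuracy gain is largest; that logic is omitted here.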
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z) - DGC: Training Dynamic Graphs with Spatio-Temporal Non-Uniformity using
Graph Partitioning by Chunks [13.279145021338534]
Dynamic Graph Neural Network (DGNN) has shown a strong capability of learning dynamic graphs by exploiting both spatial and temporal features.
We propose DGC, a distributed DGNN training system that achieves a 1.25x - 7.52x speedup over the state-of-the-art in our testbed.
arXiv Detail & Related papers (2023-09-07T07:12:59Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error (RMSE) results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Decoupled Graph Neural Networks for Large Dynamic Graphs [14.635923016087503]
We propose a decoupled graph neural network for large dynamic graphs.
We show that our algorithm achieves state-of-the-art performance on both kinds of dynamic graphs.
arXiv Detail & Related papers (2023-05-14T23:00:10Z) - Dynamic Graph Node Classification via Time Augmentation [15.580277876084873]
We propose the Time Augmented Graph Dynamic Neural Network (TADGNN) framework for node classification on dynamic graphs.
TADGNN consists of two modules: 1) a time augmentation module that structurally captures the temporal evolution of nodes across time, creating a time-augmented spatio-temporal graph, and 2) an information propagation module that learns the dynamic representations for each node across time using the constructed time-augmented graph.
Experimental results demonstrate that the TADGNN framework outperforms several static and dynamic state-of-the-art (SOTA) GNN models while demonstrating superior scalability.
arXiv Detail & Related papers (2022-12-07T04:13:23Z) - Scaling Up Dynamic Graph Representation Learning via Spiking Neural
Networks [23.01100055999135]
We present a scalable framework, namely SpikeNet, to efficiently capture the temporal and structural patterns of temporal graphs.
As a low-power alternative to RNNs, spiking neural networks (SNNs) explicitly model graph dynamics as spike trains of neuron populations.
SpikeNet generalizes to large temporal graphs with significantly fewer parameters and lower computation overhead.
arXiv Detail & Related papers (2022-08-15T09:22:15Z) - Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of vertices and the timespan of edges (ToEs).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity in modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependencies via the training loss to improve parallelism in computation.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z) - Learning to Evolve on Dynamic Graphs [5.1521870302904125]
Learning to Evolve on Dynamic Graphs (LEDG) is a novel algorithm that jointly learns graph information and time information.
LEDG is model-agnostic and can train any message passing based graph neural network (GNN) on dynamic graphs.
arXiv Detail & Related papers (2021-11-13T04:09:30Z) - Structural Temporal Graph Neural Networks for Anomaly Detection in
Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose a node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize gated recurrent units (GRUs) to capture the temporal information for anomaly detection (a minimal sketch of this pipeline appears after this list).
arXiv Detail & Related papers (2020-05-15T09:17:08Z)