ROLAND: Graph Learning Framework for Dynamic Graphs
- URL: http://arxiv.org/abs/2208.07239v1
- Date: Mon, 15 Aug 2022 14:51:47 GMT
- Title: ROLAND: Graph Learning Framework for Dynamic Graphs
- Authors: Jiaxuan You, Tianyu Du, Jure Leskovec
- Abstract summary: Graph Neural Networks (GNNs) have been successfully applied to many real-world static graphs.
Existing dynamic GNNs do not incorporate state-of-the-art designs from static GNNs.
We propose ROLAND, an effective graph representation learning framework for real-world dynamic graphs.
- Score: 75.96510058864463
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have been successfully applied to many
real-world static graphs. However, this success on static graphs has not fully
translated to dynamic graphs due to limitations in model design, evaluation
settings, and training strategies. Concretely, existing dynamic GNNs do not
incorporate state-of-the-art designs from static GNNs, which limits their
performance. Current evaluation settings for dynamic GNNs do not fully reflect
the evolving nature of dynamic graphs. Finally, commonly used training methods
for dynamic GNNs are not scalable. Here we propose ROLAND, an effective graph
representation learning framework for real-world dynamic graphs. At its core,
the ROLAND framework can help researchers easily repurpose any static GNN to
dynamic graphs. Our insight is to view the node embeddings at different GNN
layers as hierarchical node states and then recurrently update them over time.
We then introduce a live-update evaluation setting for dynamic graphs that
mimics real-world use cases, where GNNs are making predictions and being
updated on a rolling basis. Finally, we propose a scalable and efficient
training approach for dynamic GNNs via incremental training and meta-learning.
We conduct experiments over eight different dynamic graph datasets on future
link prediction tasks. Models built using the ROLAND framework achieve on
average 62.7% relative mean reciprocal rank (MRR) improvement over
state-of-the-art baselines under the standard evaluation settings on three
datasets. We find state-of-the-art baselines experience out-of-memory errors
for larger datasets, while ROLAND can easily scale to dynamic graphs with 56
million edges. After re-implementing these baselines using the ROLAND training
strategy, ROLAND models still achieve on average 15.5% relative MRR improvement
over the baselines.
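The abstract's core insight — viewing the node embeddings at different GNN layers as hierarchical node states and recurrently updating them across graph snapshots — can be illustrated with a minimal toy sketch. This is an assumption-laden illustration, not the paper's implementation: it uses untrained mean-aggregation layers and a simple moving-average state update (the paper discusses learnable update modules as well), and the names `roland_step`, `gnn_layer`, and `alpha` are hypothetical.

```python
import numpy as np

def gnn_layer(adj, h):
    # One message-passing layer: mean-aggregate neighbor states.
    # Learnable weights are omitted to keep the sketch self-contained.
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    return np.tanh(adj @ h / deg)

def roland_step(adj_t, x_t, prev_states, alpha=0.5):
    """Process one graph snapshot.

    prev_states holds the previous snapshot's per-layer node states
    (the 'hierarchical node states'); each layer's fresh embedding is
    blended with the corresponding state from the last snapshot.
    """
    new_states = []
    h = x_t
    for state_prev in prev_states:
        h = gnn_layer(adj_t, h)
        # Embedding update: moving average with the previous state at
        # this layer (a stand-in for a learnable update module).
        h = alpha * h + (1 - alpha) * state_prev
        new_states.append(h)
    return new_states

# Toy usage: 3 nodes, 4-dim features, two snapshots with different edges.
n, d, num_layers = 3, 4, 2
states = [np.zeros((n, d)) for _ in range(num_layers)]
for adj in [np.eye(n), np.ones((n, n))]:   # snapshot adjacency matrices
    x = np.ones((n, d))                    # snapshot node features
    states = roland_step(adj, x, states)
```

Because only the per-layer states carry over between snapshots, any static GNN layer could in principle be dropped into `gnn_layer`, which matches the framework's claim of repurposing static GNNs for dynamic graphs.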
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z) - SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly simple approach for textual graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) of a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM.
arXiv Detail & Related papers (2023-08-03T07:00:04Z) - Dynamic Graph Node Classification via Time Augmentation [15.580277876084873]
We propose the Time Augmented Graph Dynamic Neural Network (TADGNN) framework for node classification on dynamic graphs.
TADGNN consists of two modules: 1) a time augmentation module that structurally captures the temporal evolution of nodes across time, creating a time-augmented temporal graph, and 2) an information propagation module that learns the dynamic representations for each node across time using the constructed time-augmented graph.
Experimental results demonstrate that the TADGNN framework outperforms several static and dynamic state-of-the-art (SOTA) GNN models while demonstrating superior scalability.
arXiv Detail & Related papers (2022-12-07T04:13:23Z) - Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Learning to Evolve on Dynamic Graphs [5.1521870302904125]
Learning to Evolve on Dynamic Graphs (LEDG) is a novel algorithm that jointly learns graph information and time information.
LEDG is model-agnostic and can train any message passing based graph neural network (GNN) on dynamic graphs.
arXiv Detail & Related papers (2021-11-13T04:09:30Z) - Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction [123.20238648121445]
We propose a new self-supervised learning framework, Graph Information Aided Node feature exTraction (GIANT).
GIANT makes use of the eXtreme Multi-label Classification (XMC) formalism, which is crucial for fine-tuning the language model based on graph information.
We demonstrate the superior performance of GIANT over the standard GNN pipeline on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2021-10-29T19:55:12Z) - DyGCN: Dynamic Graph Embedding with Graph Convolutional Network [25.02329024926518]
We propose an efficient dynamic graph embedding approach, Dynamic Graph Convolutional Network (DyGCN).
Our model can update the node embeddings in a time-saving and performance-preserving way.
arXiv Detail & Related papers (2021-04-07T07:28:44Z) - XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black-boxes and lack human intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.