Pre-Training on Dynamic Graph Neural Networks
- URL: http://arxiv.org/abs/2102.12380v1
- Date: Wed, 24 Feb 2021 16:06:32 GMT
- Title: Pre-Training on Dynamic Graph Neural Networks
- Authors: Jiajun Zhang, Kejia Chen, Yunyun Wang
- Abstract summary: This paper proposes a pre-training method on dynamic graph neural networks (PT-DGNN). It uses dynamic attributed graph generation tasks to simultaneously learn the structure, semantics, and evolution features of the graph.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Pre-training a graph neural network can learn general features of large-scale networks, or of networks of the same type, through self-supervised methods, which allows the model to work even when node labels are missing. However, existing pre-training methods do not take network evolution into consideration. This paper proposes a pre-training method on dynamic graph neural networks (PT-DGNN), which uses dynamic attributed graph generation tasks to simultaneously learn the structure, semantics, and evolution features of the graph. The method includes two steps: 1) dynamic sub-graph sampling, and 2) pre-training with a dynamic attributed graph generation task. Comparative experiments on three real-world dynamic network datasets show that the proposed method achieves the best results on the link prediction fine-tuning task.
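The abstract's first step, dynamic sub-graph sampling, can be illustrated with a time-weighted edge sampler that biases selection toward recent interactions. The sketch below is a minimal illustration of that general idea, not the paper's implementation; the function name, linear recency weighting, and parameters are assumptions for illustration.

```python
import random

def dynamic_subgraph_sample(edges, k, alpha=1.0, seed=0):
    """Sample k edges from a dynamic graph, biased toward recent timestamps.

    edges: list of (u, v, t) tuples, where t is the edge's timestamp.
    alpha: strength of the recency bias (0 recovers uniform sampling).
    Returns the sampled edge list (with replacement, for simplicity)
    and the set of nodes it touches.
    """
    rng = random.Random(seed)
    t_min = min(t for _, _, t in edges)
    t_max = max(t for _, _, t in edges)
    span = max(t_max - t_min, 1e-9)
    # Weight each edge by how recent it is, normalized to [0, 1],
    # so newer edges are proportionally more likely to be drawn.
    weights = [1.0 + alpha * (t - t_min) / span for _, _, t in edges]
    sampled = rng.choices(edges, weights=weights, k=k)
    nodes = {u for u, _, _ in sampled} | {v for _, v, _ in sampled}
    return sampled, nodes
```

The sampled sub-graph would then serve as input to the second step, where masked attributes and future edges are generated as the self-supervised pre-training signal.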
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Node-Time Conditional Prompt Learning In Dynamic Graphs [14.62182210205324]
We propose DYGPROMPT, a novel pre-training and prompt learning framework for dynamic graph modeling.
We recognize that node and time features mutually characterize each other, and propose dual condition-nets to model the evolving node-time patterns in downstream tasks.
arXiv Detail & Related papers (2024-05-22T19:10:24Z)
- Tensor Graph Convolutional Network for Dynamic Graph Representation Learning [12.884025972321316]
Dynamic graphs (DG) describe dynamic interactions between entities in many practical scenarios.
Most existing DG representation learning models combine graph convolutional network and sequence neural network.
We propose a tensor graph convolutional network to learn DG representations in one convolution framework.
arXiv Detail & Related papers (2024-01-13T12:49:56Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Temporal Graph Network Embedding with Causal Anonymous Walks Representations [54.05212871508062]
We propose a novel approach for dynamic network representation learning based on Temporal Graph Network.
We provide a benchmark pipeline for evaluating temporal network embeddings.
We show the applicability and superior performance of our model in the real-world downstream graph machine learning task provided by one of the top European banks.
arXiv Detail & Related papers (2021-08-19T15:39:52Z)
- Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- EvoNet: A Neural Network for Predicting the Evolution of Dynamic Graphs [26.77596449192451]
We propose a model that predicts the evolution of dynamic graphs.
Specifically, we use a graph neural network along with a recurrent architecture to capture the temporal evolution patterns of dynamic graphs.
We evaluate the proposed model on several artificial datasets following common network evolving dynamics, as well as on real-world datasets.
arXiv Detail & Related papers (2020-03-02T12:59:05Z)
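Several of the papers above (EvoNet in particular) combine a graph neural network with a recurrent update to capture temporal evolution. The following is a minimal sketch of that generic snapshot-based pattern, assuming mean-neighbor aggregation and an Elman-style recurrent cell; it is an illustration of the idea, not any paper's actual architecture.

```python
import numpy as np

def gnn_layer(adj, x, w):
    # One round of mean-neighbor aggregation, then a linear map and tanh.
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    return np.tanh((adj @ x / deg) @ w)

def evolve(snapshots, x, w_gnn, w_h, w_x):
    """Encode each graph snapshot with a GNN layer, then fold the
    per-snapshot node embeddings through a simple recurrent update.

    snapshots: list of (n, n) adjacency matrices, one per time step.
    x: (n, d) static node features; w_gnn, w_h, w_x: weight matrices.
    Returns the final (n, hidden) node states.
    """
    h = np.zeros((x.shape[0], w_h.shape[0]))
    for adj in snapshots:
        z = gnn_layer(adj, x, w_gnn)       # structural encoding at time t
        h = np.tanh(z @ w_x + h @ w_h)     # recurrent temporal update
    return h
```

In practice the GNN and recurrent weights would be trained end-to-end on a prediction objective (e.g. future links), which is the part the sketch omits.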
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.