CPDG: A Contrastive Pre-Training Method for Dynamic Graph Neural
Networks
- URL: http://arxiv.org/abs/2307.02813v3
- Date: Sun, 24 Dec 2023 05:56:49 GMT
- Title: CPDG: A Contrastive Pre-Training Method for Dynamic Graph Neural
Networks
- Authors: Yuanchen Bei, Hao Xu, Sheng Zhou, Huixuan Chi, Haishuai Wang, Mengdi
Zhang, Zhao Li, Jiajun Bu
- Abstract summary: We propose the Contrastive Pre-Training Method for Dynamic Graph Neural Networks (CPDG).
CPDG tackles the challenges of pre-training for DGNNs, including generalization capability and long-short term modeling capability.
Extensive experiments are conducted on both large-scale research and industrial dynamic graph datasets.
- Score: 21.79251709065902
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic graph data mining has gained popularity in recent years due to the
rich information contained in dynamic graphs and their widespread use in the
real world. Despite the advances in dynamic graph neural networks (DGNNs), the
rich information and diverse downstream tasks have posed significant
difficulties for the practical application of DGNNs in industrial scenarios. To
this end, in this paper, we propose to address them by pre-training and present
the Contrastive Pre-Training Method for Dynamic Graph Neural Networks (CPDG).
CPDG tackles the challenges of pre-training for DGNNs, including generalization
capability and long-short term modeling capability, through a flexible
structural-temporal subgraph sampler along with structural-temporal contrastive
pre-training schemes. Extensive experiments conducted on both large-scale
research and industrial dynamic graph datasets show that CPDG outperforms
existing methods in dynamic graph pre-training for various downstream tasks
under three transfer settings.
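As a rough illustration of the contrastive pre-training idea, the sketch below implements a standard InfoNCE objective over two views of each node; the `encoder`, `sample_view`, and temperature are hypothetical stand-ins, not CPDG's actual structural-temporal sampler or loss.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(z_a: torch.Tensor, z_b: torch.Tensor,
                  temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE loss between two views of the same nodes.

    z_a, z_b: [num_nodes, dim] embeddings of two views of each node
    (e.g., a long-term and a short-term structural-temporal subgraph).
    Matching rows are positives; all other rows serve as negatives.
    """
    z_a = F.normalize(z_a, dim=-1)
    z_b = F.normalize(z_b, dim=-1)
    logits = z_a @ z_b.t() / temperature  # [N, N] cosine similarities
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)

# Hypothetical pre-training step: `encoder` is any DGNN and
# `sample_view` stands in for a structural-temporal subgraph sampler.
# view_long = encoder(sample_view(graph, node_ids, horizon="long"))
# view_short = encoder(sample_view(graph, node_ids, horizon="short"))
# loss = info_nce_loss(view_long, view_short)
```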
Related papers
- ReInc: Scaling Training of Dynamic Graph Neural Networks [6.1592549031654364]
ReInc is a system designed to enable efficient and scalable training of Dynamic Graph Neural Networks (DGNNs) on large-scale graphs.
We introduce key innovations that capitalize on the unique combination of Graph Neural Networks (GNNs) and Recurrent Neural Networks (RNNs) inherent in DGNNs.
arXiv Detail & Related papers (2025-01-25T23:16:03Z)
- Expressivity of Representation Learning on Continuous-Time Dynamic Graphs: An Information-Flow Centric Review [2.310679096120274]
This paper provides a comprehensive review of Graph Representation Learning (GRL) on Continuous-Time Dynamic Graph (CTDG) models.
We introduce a novel theoretical framework that analyzes the expressivity of CTDG models through an Information-Flow (IF) lens.
arXiv Detail & Related papers (2024-12-05T00:12:50Z)
- Node-Time Conditional Prompt Learning In Dynamic Graphs [14.62182210205324]
We propose DYGPROMPT, a novel pre-training and prompt learning framework for dynamic graph modeling.
We recognize that node and time features mutually characterize each other, and propose dual condition-nets to model the evolving node-time patterns in downstream tasks.
arXiv Detail & Related papers (2024-05-22T19:10:24Z)
- A survey of dynamic graph neural networks [26.162035361191805]
Graph neural networks (GNNs) have emerged as a powerful tool for effectively mining and learning from graph-structured data.
This paper provides a comprehensive review of the fundamental concepts, key techniques, and state-of-the-art dynamic GNN models.
arXiv Detail & Related papers (2024-04-28T15:07:48Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate learning bias via a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning [92.71579608528907]
This paper designs an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
arXiv Detail & Related papers (2023-03-22T06:35:08Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data (DFAD-GNN).
Specifically, DFAD-GNN employs a generative adversarial network with three components: a pre-trained teacher model and a student model serve as two discriminators, while a generator derives training graphs used to distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
arXiv Detail & Related papers (2022-05-08T08:19:40Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space for the first time, aiming to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach (see the time-encoding sketch after this list).
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Pre-Training on Dynamic Graph Neural Networks [26.139844652756334]
This paper proposes a pre-training method for dynamic graph neural networks (PT-DGNN).
It uses dynamic attributed graph generation tasks to simultaneously learn the structure, semantics, and evolution features of the graph.
arXiv Detail & Related papers (2021-02-24T16:06:32Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
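The TGNN entry above builds on a time encoding for interaction timestamps. Below is a minimal sketch of a functional (cosine-based) time encoding of the kind commonly used in temporal GNNs; the class name, dimension, and initialization are illustrative assumptions, not taken from that paper.

```python
import torch
import torch.nn as nn

class TimeEncoding(nn.Module):
    """Functional time encoding: maps a scalar time (or time delta)
    to a d-dimensional vector via learnable frequencies and phases,
    phi(t) = cos(t * w + b). Dimensions here are illustrative."""

    def __init__(self, dim: int = 32):
        super().__init__()
        self.w = nn.Parameter(torch.randn(dim))   # learnable frequencies
        self.b = nn.Parameter(torch.zeros(dim))   # learnable phases

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: [...] scalar timestamps; output: [..., dim]
        return torch.cos(t.unsqueeze(-1) * self.w + self.b)

# Example: encode the time gaps between interaction events.
enc = TimeEncoding(dim=32)
deltas = torch.tensor([0.0, 1.5, 10.0])
print(enc(deltas).shape)  # torch.Size([3, 32])
```

Such encodings let a temporal GNN condition message passing on when an edge occurred, rather than only on which nodes it connects.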