Self-supervised Training of Graph Convolutional Networks
- URL: http://arxiv.org/abs/2006.02380v1
- Date: Wed, 3 Jun 2020 16:53:37 GMT
- Title: Self-supervised Training of Graph Convolutional Networks
- Authors: Qikui Zhu, Bo Du, Pingkun Yan
- Abstract summary: Graph Convolutional Networks (GCNs) have been successfully applied to analyze non-grid data.
In this paper, we propose two types of self-supervised learning strategies that exploit information available in the input graph data itself.
- Score: 39.80867112204255
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Graph Convolutional Networks (GCNs) have been successfully applied to
analyze non-grid data, where classical convolutional neural networks (CNNs)
cannot be directly used. One trait GCNs share with CNNs is the requirement of
massive amounts of labeled data for network training. In addition, GCNs need an
adjacency matrix as input to define the relationships between non-grid data
points, so the training, validation and test data typically form a single
graph. Furthermore, the adjacency matrix is usually pre-defined and stationary,
which prevents data augmentation strategies from being applied to the
constructed graph to enlarge the training data. To improve learning capacity
and model performance under limited training data, in this paper we propose two
types of self-supervised learning strategies that exploit the information
available in the input graph data itself. The proposed strategies are examined
on two representative GCN models with three public citation network datasets:
Citeseer, Cora and Pubmed. The experimental results demonstrate the
generalization ability and portability of the proposed strategies, which
significantly improve the performance of GCNs by using self-supervised learning
to strengthen feature learning.
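The abstract does not spell out the two strategies, so the following is only a minimal PyTorch sketch of the general recipe: attach a self-supervised auxiliary head to a GCN and add its loss to the supervised objective. The masked-feature-reconstruction task, the model names, and the loss weighting are illustrative assumptions, not the paper's exact strategies.

import torch
import torch.nn.functional as F

class GCNLayer(torch.nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, x):
        # a_hat: normalized adjacency with self-loops, shape [N, N]
        return self.lin(a_hat @ x)

class SelfSupGCN(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.gc = GCNLayer(in_dim, hid_dim)
        self.cls_head = torch.nn.Linear(hid_dim, n_classes)  # supervised head
        self.rec_head = torch.nn.Linear(hid_dim, in_dim)     # auxiliary head

    def forward(self, a_hat, x):
        h = F.relu(self.gc(a_hat, x))
        return self.cls_head(h), self.rec_head(h)

def train_step(model, opt, a_hat, x, y, train_mask, mask_rate=0.15, alpha=0.5):
    # Mask random input features; the auxiliary task reconstructs them.
    mask = torch.rand_like(x) < mask_rate
    logits, rec = model(a_hat, x.masked_fill(mask, 0.0))
    sup = F.cross_entropy(logits[train_mask], y[train_mask])  # labeled nodes only
    aux = F.mse_loss(rec[mask], x[mask])                      # every node, no labels
    loss = sup + alpha * aux
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

The auxiliary term draws signal from every node of the single input graph, labeled or not, which is the limited-label regime the abstract targets.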
Related papers
- Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Network (GCN) has exhibited remarkable potential in improving graph-based clustering.
Existing models estimate an initial graph beforehand in order to apply a GCN.
The Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
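As an illustration of the contrastive ingredient, below is a minimal NT-Xent loss over node embeddings from two augmented views of the same graph; DCGL's actual objective and its clustering-oriented guidance are not reproduced here.

import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    # z1, z2: [N, d] embeddings of the same N nodes under two augmentations.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)          # [2N, d]
    sim = z @ z.t() / tau                   # cosine similarities as logits
    sim.fill_diagonal_(float('-inf'))       # exclude self-pairs
    # The positive for row i is the same node's embedding in the other view.
    pos = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, pos)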
arXiv Detail & Related papers (2024-02-25T07:03:37Z)
- ABC: Aggregation before Communication, a Communication Reduction Framework for Distributed Graph Neural Network Training and Effective Partition [0.0]
Graph Neural Networks (GNNs) are neural models tailored for graph-structured data and have shown superior performance in learning their representations.
In this paper, we study the communication complexity during distributed GNNs training.
We show that the new partition paradigm is particularly well suited to dynamic graphs, where controlling edge placement is infeasible because the graph-changing process is unknown.
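A toy sketch of the aggregation-before-communication idea (names and data layout are hypothetical, not the paper's API): each partition pre-sums the features of its local neighbors of a remote boundary node, then ships one vector per boundary node instead of every raw neighbor feature.

import numpy as np

def partial_aggregate(local_feats, local_edges, remote_targets):
    """local_feats: {node_id: feature vector} owned by this partition.
    local_edges: (src, dst) pairs with src local and dst possibly remote.
    remote_targets: ids of boundary nodes owned by other partitions.
    Returns one partial sum per remote target -> O(#boundary nodes) messages."""
    msgs = {t: None for t in remote_targets}
    for src, dst in local_edges:
        if dst in msgs:
            f = local_feats[src]
            msgs[dst] = f if msgs[dst] is None else msgs[dst] + f
    return {t: m for t, m in msgs.items() if m is not None}

feats = {0: np.ones(4), 1: 2 * np.ones(4)}
print(partial_aggregate(feats, [(0, 9), (1, 9)], remote_targets={9}))
# {9: array([3., 3., 3., 3.])}: one message instead of two raw feature vectors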
arXiv Detail & Related papers (2022-12-11T04:54:01Z)
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data (DFAD-GNN).
Specifically, DFAD-GNN employs a generative adversarial framework with three components: a pre-trained teacher model and a student model act as two discriminators, while a generator derives training graphs used to distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
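A minimal sketch of one data-free adversarial distillation step, under the assumption that `gen` maps noise to inputs both models accept (graph generation details are elided): the student chases the frozen teacher on generated data, while the generator is updated to maximize their disagreement.

import torch
import torch.nn.functional as F

def dfad_step(gen, teacher, student, opt_s, opt_g, z_dim=32, batch=16):
    z = torch.randn(batch, z_dim)
    # 1) Student step: minimize disagreement with the teacher on fake data.
    fake = gen(z).detach()
    with torch.no_grad():
        t_out = teacher(fake)
    s_loss = F.l1_loss(student(fake), t_out)
    opt_s.zero_grad(); s_loss.backward(); opt_s.step()
    # 2) Generator step: maximize disagreement (adversarial to the student).
    fake = gen(z)
    g_loss = -F.l1_loss(student(fake), teacher(fake).detach())
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return s_loss.item(), g_loss.item()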
arXiv Detail & Related papers (2022-05-08T08:19:40Z)
- Graph-Based Neural Network Models with Multiple Self-Supervised Auxiliary Tasks [79.28094304325116]
Graph Convolutional Networks are among the most promising approaches for capturing relationships among structured data points.
We propose three novel self-supervised auxiliary tasks to train graph-based neural network models in a multi-task fashion.
arXiv Detail & Related papers (2020-11-14T11:09:51Z)
- Self-supervised Graph Representation Learning via Bootstrapping [35.56360622521721]
We propose a new self-supervised graph representation method: deep graph bootstrapping (DGB).
DGB consists of two neural networks, an online network and a target network, whose inputs are different augmented views of the initial graph.
As a result, the proposed DGB can learn graph representations without negative examples in an unsupervised manner.
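A minimal sketch of the bootstrapping mechanics (BYOL-style), assuming `online` and `target` are encoders of identical architecture and v1, v2 are two augmented views; DGB's exact heads and losses may differ.

import copy
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(online, target, tau=0.99):
    # Target weights drift slowly toward the online weights; no gradients flow.
    for p_o, p_t in zip(online.parameters(), target.parameters()):
        p_t.data.mul_(tau).add_(p_o.data, alpha=1.0 - tau)

def bootstrap_loss(online, target, predictor, v1, v2):
    # The online branch predicts the target's embedding of the other view,
    # so no negative examples are needed.
    p1, p2 = predictor(online(v1)), predictor(online(v2))
    with torch.no_grad():
        t1, t2 = target(v1), target(v2)
    cos = lambda a, b: F.cosine_similarity(a, b, dim=-1).mean()
    return 2 - cos(p1, t2) - cos(p2, t1)

# Initialize with target = copy.deepcopy(online), then call ema_update(...)
# after every optimizer step on the online network.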
arXiv Detail & Related papers (2020-11-10T14:47:29Z)
- Graph Information Bottleneck [77.21967740646784]
Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features.
Inheriting from the general Information Bottleneck (IB), the Graph Information Bottleneck (GIB) aims to learn the minimal sufficient representation for a given task.
We show that our proposed models are more robust than state-of-the-art graph defense models.
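As a reference point, the general IB objective that GIB inherits can be written as follows, with Z the learned representation, Y the task labels, and G the input graph (this is the standard IB form, not a formula quoted from the paper):

% Standard Information Bottleneck form: retain what predicts Y,
% compress away the rest of the input graph G.
\max_{Z} \; I(Z; Y) \;-\; \beta \, I(Z; G)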
arXiv Detail & Related papers (2020-10-24T07:13:00Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
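One standard way to build contrastive instances, sketched here with a plain random walk with restart (GCC's actual sampler and encoder are not reproduced): two samples of the same node's ego network form a positive pair, while samples around other nodes act as negatives.

import random

def rwr_subgraph(adj, start, steps=32, restart=0.3):
    """adj: {node: [neighbors]}. Returns the set of nodes visited by a
    random walk with restart around `start` (its sampled ego network)."""
    node, visited = start, {start}
    for _ in range(steps):
        if random.random() < restart or not adj[node]:
            node = start                      # restart at the ego node
        else:
            node = random.choice(adj[node])   # one random-walk step
        visited.add(node)
    return visited

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
view_a, view_b = rwr_subgraph(adj, 0), rwr_subgraph(adj, 0)  # positive pair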
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- An Uncoupled Training Architecture for Large Graph Learning [20.784230322205232]
We present Node2Grids, a flexible uncoupled training framework for embedding graph data into grid-like data.
By ranking each node's influence by degree, Node2Grids selects the most influential first-order and second-order neighbors and fuses their information with that of the central node.
For further improving the efficiency of downstream tasks, a simple CNN-based neural network is employed to capture the significant information from the mapped grid-like data.
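A rough sketch of the degree-ranked grid mapping under stated assumptions (the fusion rule and grid layout are illustrative guesses, not the paper's exact construction):

import numpy as np

def node_to_grid(feats, adj, center, k1=4, k2=4):
    """feats: [N, d] array; adj: {node: [neighbors]}.
    Row 0: top-k1 first-order neighbors by degree, fused with the center.
    Row 1: top-k2 second-order neighbors by degree."""
    deg = lambda n: len(adj[n])
    first = sorted(adj[center], key=deg, reverse=True)[:k1]
    two_hop = {m for n in adj[center] for m in adj[n]} - {center} - set(adj[center])
    second = sorted(two_hop, key=deg, reverse=True)[:k2]
    grid = np.zeros((2, max(k1, k2), feats.shape[1]))
    for j, n in enumerate(first):
        grid[0, j] = (feats[n] + feats[center]) / 2   # central-node fusion
    for j, n in enumerate(second):
        grid[1, j] = feats[n]
    return grid  # grid-like input for a small CNN downstream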
arXiv Detail & Related papers (2020-03-21T11:49:16Z)