Contrastive and Generative Graph Convolutional Networks for Graph-based
Semi-Supervised Learning
- URL: http://arxiv.org/abs/2009.07111v2
- Date: Sat, 19 Sep 2020 02:06:28 GMT
- Title: Contrastive and Generative Graph Convolutional Networks for Graph-based
Semi-Supervised Learning
- Authors: Sheng Wan and Shirui Pan and Jian Yang and Chen Gong
- Abstract summary: Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
- Score: 64.98816284854067
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a
handful of labeled data to the remaining massive unlabeled data via a graph. As
one of the most popular graph-based SSL approaches, the recently proposed Graph
Convolutional Networks (GCNs) have gained remarkable progress by combining the
sound expressiveness of neural networks with graph structure. Nevertheless, the
existing graph-based methods do not directly address the core problem of SSL,
i.e., the shortage of supervision, and thus their performance is still very
limited. To address this issue, a novel GCN-based SSL algorithm is
presented in this paper to enrich the supervision signals by utilizing both
data similarities and graph structure. Firstly, by designing a semi-supervised
contrastive loss, improved node representations can be generated by maximizing
the agreement between different views of the same data or the data from the
same class. Therefore, the rich unlabeled data and the scarce yet valuable
labeled data can jointly provide abundant supervision information for learning
discriminative node representations, which helps improve the subsequent
classification result. Secondly, the underlying determinative relationship
between the data features and input graph topology is extracted as
supplementary supervision signals for SSL by using a graph generative loss
related to the input features. Extensive experimental results on a variety of
real-world datasets firmly verify the effectiveness of our algorithm compared
with other state-of-the-art methods.
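To make the first supervision signal concrete, the snippet below is a minimal PyTorch-style sketch of a semi-supervised contrastive (InfoNCE-style) loss over two augmented views of the node embeddings; the function name, the temperature parameter, and the exact positive-pair construction are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn.functional as F


def semi_supervised_contrastive_loss(z1, z2, labels, labeled_mask, temperature=0.5):
    """InfoNCE-style loss over two views of the same set of nodes.

    z1, z2       : (N, d) node embeddings from two augmented graph views.
    labels       : (N,) class indices; only trusted where labeled_mask is True.
    labeled_mask : (N,) boolean tensor marking the few labeled nodes.
    Unlabeled nodes use the same node in the other view as their only positive;
    labeled nodes additionally treat every same-class labeled node as a positive.
    """
    n = z1.size(0)
    z = torch.cat([F.normalize(z1, dim=1), F.normalize(z2, dim=1)], dim=0)
    sim = torch.exp(z @ z.t() / temperature)                   # (2N, 2N) similarities
    self_mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, 0.0)                      # drop i == i terms

    # Cross-view positives: node i in view 1 <-> node i in view 2.
    pos = torch.zeros(2 * n, 2 * n, dtype=torch.bool, device=z.device)
    idx = torch.arange(n, device=z.device)
    pos[idx, idx + n] = True
    pos[idx + n, idx] = True

    # Class-level positives among labeled nodes (in either view).
    lbl = labels.repeat(2)
    trusted = labeled_mask.repeat(2)
    same_class = (lbl.unsqueeze(0) == lbl.unsqueeze(1)) \
        & trusted.unsqueeze(0) & trusted.unsqueeze(1)
    pos |= same_class & ~self_mask

    pos_sum = (sim * pos).sum(dim=1)
    return -torch.log(pos_sum / sim.sum(dim=1)).mean()
```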
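The second supervision signal, the graph generative loss tied to the input features, can likewise be sketched as a simple reconstruction term; the dot-product decoder below is a common stand-in and not necessarily the decoder used in the paper. In training, both terms would typically be added, with weighting coefficients, to the usual cross-entropy loss on the labeled nodes.

```python
import torch
import torch.nn.functional as F


def graph_generative_loss(z, adj):
    """Reconstruction term tying node embeddings back to the input topology.

    z   : (N, d) node embeddings computed from the input features.
    adj : (N, N) dense float adjacency matrix with entries in {0, 1}.
    A dot-product decoder scores every node pair and is compared against the
    observed edges with binary cross-entropy.
    """
    logits = z @ z.t()
    return F.binary_cross_entropy_with_logits(logits, adj)
```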
Related papers
- Self-Supervised Conditional Distribution Learning on Graphs [15.730933577970687]
We present an end-to-end graph representation learning model to align the conditional distributions of weakly and strongly augmented features over the original features.
This alignment effectively reduces the risk of disrupting intrinsic semantic information through graph-structured data augmentation.
arXiv Detail & Related papers (2024-11-20T07:26:36Z)
- Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Network (GCN) has exhibited remarkable potential in improving graph-based clustering.
Most existing models estimate an initial graph beforehand in order to apply GCN.
A Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
arXiv Detail & Related papers (2024-02-25T07:03:37Z)
- ExGRG: Explicitly-Generated Relation Graph for Self-Supervised Representation Learning [4.105236597768038]
Self-supervised learning has emerged as a powerful technique in pre-training deep learning models.
This paper introduces a novel non-contrastive SSL approach to Explicitly Generate a compositional Relation Graph.
arXiv Detail & Related papers (2024-02-09T19:16:04Z)
- Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D$^2$PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure but also on a global graph that encodes global semantic similarities.
arXiv Detail & Related papers (2023-05-29T04:51:09Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning [21.0019144298605]
Existing graph neural networks fed with the complete graph data are not scalable due to their computation and memory costs.
Subg-Con is proposed to utilize the strong correlation between central nodes and their sampled subgraphs to capture regional structure information.
Compared with existing graph representation learning approaches, Subg-Con has prominent performance advantages in weaker supervision requirements, model learning scalability, and parallelization.
arXiv Detail & Related papers (2020-09-22T01:58:19Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.