Self-supervised Heterogeneous Graph Neural Network with Co-contrastive
Learning
- URL: http://arxiv.org/abs/2105.09111v1
- Date: Wed, 19 May 2021 13:15:03 GMT
- Title: Self-supervised Heterogeneous Graph Neural Network with Co-contrastive
Learning
- Authors: Xiao Wang, Nian Liu, Hui Han, Chuan Shi
- Abstract summary: Heterogeneous graph neural networks (HGNNs), an emerging technique, have shown a superior capacity for dealing with heterogeneous information networks (HINs).
In this paper, we study the problem of self-supervised HGNNs and propose a novel co-contrastive learning mechanism for HGNNs, named HeCo.
- Score: 38.062495223111355
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous graph neural networks (HGNNs), an emerging technique, have
shown a superior capacity for dealing with heterogeneous information networks
(HINs). However, most HGNNs follow a semi-supervised learning paradigm, which
notably limits their use in practice since labels are usually scarce in
real applications. Recently, contrastive learning, a self-supervised method,
has become one of the most exciting learning paradigms and shows great potential
when no labels are available. In this paper, we study the problem of
self-supervised HGNNs and propose a novel co-contrastive learning mechanism for
HGNNs, named HeCo. Different from traditional contrastive learning, which only
focuses on contrasting positive and negative samples, HeCo employs a
cross-view contrastive mechanism. Specifically, two views of a HIN (the network
schema view and the meta-path view) are proposed to learn node embeddings, so as to
capture both local and high-order structures simultaneously. Then
cross-view contrastive learning, together with a view mask mechanism, is proposed
to extract positive and negative embeddings from the two views.
This enables the two views to supervise each other collaboratively and finally
learn high-level node embeddings. Moreover, two extensions of HeCo are designed
to generate harder negative samples of high quality, which further boosts the
performance of HeCo. Extensive experiments conducted on a variety of real-world
networks show the superior performance of the proposed methods over
state-of-the-art baselines.
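As a rough illustration of the cross-view contrastive idea described in the abstract, the sketch below contrasts node embeddings from two hypothetical views (network schema and meta-path) with an InfoNCE-style objective. The function names, the cosine-similarity scoring, and the temperature value are illustrative assumptions, not HeCo's actual implementation, which additionally uses learned encoders and a view mask mechanism to select positives and negatives.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def cross_view_infonce(z_sc, z_mp, tau=0.5):
    """InfoNCE-style loss contrasting network-schema-view embeddings
    (z_sc) against meta-path-view embeddings (z_mp). For each node i,
    the positive pair is (z_sc[i], z_mp[i]); all other nodes in the
    meta-path view serve as negatives."""
    n = len(z_sc)
    loss = 0.0
    for i in range(n):
        sims = [math.exp(cosine(z_sc[i], z_mp[j]) / tau) for j in range(n)]
        loss += -math.log(sims[i] / sum(sims))
    return loss / n
```

When the two views agree on matching nodes, this loss is low; shuffling one view (breaking the positive pairs) drives it up, which is the signal that lets the two views supervise each other without labels.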
Related papers
- Label Deconvolution for Node Representation Learning on Large-scale
Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- Hierarchical Contrastive Learning Enhanced Heterogeneous Graph Neural Network [59.860534520941485]
Heterogeneous graph neural networks (HGNNs), an emerging technique, have shown a superior capacity for dealing with heterogeneous information networks (HINs).
Recently, contrastive learning, a self-supervised method, has become one of the most exciting learning paradigms and shows great potential when no labels are available.
In this paper, we study the problem of self-supervised HGNNs and propose a novel co-contrastive learning mechanism for HGNNs, named HeCo.
arXiv Detail & Related papers (2023-04-24T16:17:21Z)
- Task-Agnostic Graph Neural Network Evaluation via Adversarial Collaboration [11.709808788756966]
GraphAC is a principled, task-agnostic, and stable framework for evaluating Graph Neural Network (GNN) research for molecular representation learning.
We introduce a novel objective function, the Competitive Barlow Twins, which allows two GNNs to jointly update themselves through direct competition with each other.
arXiv Detail & Related papers (2023-01-27T03:33:11Z)
- STERLING: Synergistic Representation Learning on Bipartite Graphs [78.86064828220613]
A fundamental challenge of bipartite graph representation learning is how to extract node embeddings.
Most recent bipartite graph SSL methods are based on contrastive learning which learns embeddings by discriminating positive and negative node pairs.
We introduce a novel synergistic representation learning model (STERLING) to learn node embeddings without negative node pairs.
arXiv Detail & Related papers (2023-01-25T03:21:42Z)
- RHCO: A Relation-aware Heterogeneous Graph Neural Network with Contrastive Learning for Large-scale Graphs [26.191673964156585]
We propose a novel Relation-aware Heterogeneous Graph Neural Network with Contrastive Learning (RHCO) for large-scale heterogeneous graph representation learning.
RHCO achieves the best performance, surpassing state-of-the-art models.
arXiv Detail & Related papers (2022-11-20T04:45:04Z)
- Self-supervised Heterogeneous Graph Pre-training Based on Structural Clustering [20.985559149384795]
We present SHGP, a novel Self-supervised Heterogeneous Graph Pre-training approach.
It does not need to generate any positive or negative examples.
It is superior to state-of-the-art unsupervised baselines and even semi-supervised baselines.
arXiv Detail & Related papers (2022-10-19T10:55:48Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent features by enlarging the margin of the decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Stacked Hybrid-Attention and Group Collaborative Learning for Unbiased Scene Graph Generation [62.96628432641806]
Scene Graph Generation aims to first encode the visual contents within the given image and then parse them into a compact summary graph.
We first present a novel Stacked Hybrid-Attention network, which facilitates the intra-modal refinement as well as the inter-modal interaction.
We then devise an innovative Group Collaborative Learning strategy to optimize the decoder.
arXiv Detail & Related papers (2022-03-18T09:14:13Z)
- Deep Graph Contrastive Representation Learning [23.37786673825192]
We propose a novel framework for unsupervised graph representation learning by leveraging a contrastive objective at the node level.
Specifically, we generate two graph views by corruption and learn node representations by maximizing the agreement of node representations in these two views.
We perform empirical experiments on both transductive and inductive learning tasks using a variety of real-world datasets.
arXiv Detail & Related papers (2020-06-07T11:50:45Z)
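The two corrupted views in the last entry above can be produced by simple augmentations such as feature masking. The sketch below shows one such corruption, column-wise feature dropping, a common choice in this line of work; the function name and default drop probability are illustrative assumptions, and the paper's actual corruption scheme may differ.

```python
import random

def mask_features(x, drop_prob=0.3, rng=None):
    """Corrupt node features by zeroing each feature dimension with
    probability drop_prob. The mask is applied consistently across
    all nodes (column-wise), as in common graph-augmentation setups."""
    rng = rng or random.Random(0)
    dims = len(x[0])
    mask = [0.0 if rng.random() < drop_prob else 1.0 for _ in range(dims)]
    return [[v * m for v, m in zip(row, mask)] for row in x]
```

Two independently masked copies of the feature matrix would each be fed through a shared GNN encoder, and a contrastive objective then maximizes agreement between each node's representations in the two views.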
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.