Heterogeneous Graph Contrastive Multi-view Learning
- URL: http://arxiv.org/abs/2210.00248v1
- Date: Sat, 1 Oct 2022 10:53:48 GMT
- Title: Heterogeneous Graph Contrastive Multi-view Learning
- Authors: Zehong Wang, Qi Li, Donghua Yu, Xiaolong Han, Xiao-Zhi Gao, Shigen Shen
- Abstract summary: Graph contrastive learning (GCL) has been developed to learn discriminative node representations on graph datasets.
We propose a novel Heterogeneous Graph Contrastive Multi-view Learning (HGCML) model.
HGCML consistently outperforms state-of-the-art baselines on five real-world benchmark datasets.
- Score: 11.489983916543805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inspired by the success of contrastive learning (CL) in computer vision and
natural language processing, graph contrastive learning (GCL) has been
developed to learn discriminative node representations on graph datasets.
However, the development of GCL on Heterogeneous Information Networks (HINs) is
still in its infancy. For example, it is unclear how to augment HINs
without substantially altering the underlying semantics, and how to design the
contrastive objective to fully capture the rich semantics. Moreover, early
investigations demonstrate that CL suffers from sampling bias, whereas
conventional debiasing techniques are empirically shown to be inadequate for
GCL. How to mitigate the sampling bias for heterogeneous GCL is another
important problem. To address the aforementioned challenges, we propose a novel
Heterogeneous Graph Contrastive Multi-view Learning (HGCML) model. In
particular, we use metapaths as the augmentation to generate multiple subgraphs
as multi-views, and propose a contrastive objective to maximize the mutual
information between any pair of metapath-induced views. To alleviate the
sampling bias, we further propose a positive sampling strategy to explicitly
select positives for each node via jointly considering semantic and structural
information preserved on each metapath view. Extensive experiments demonstrate that
HGCML consistently outperforms state-of-the-art baselines on five real-world
benchmark datasets.
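The objective described above can be sketched as a pairwise InfoNCE loss between metapath-induced views, with a per-node positive set. This is a minimal illustrative sketch, not the authors' code: the function names are invented, and `select_positives` approximates HGCML's joint semantic/structural positive sampling using embedding similarity alone.

```python
import numpy as np

def select_positives(z, k=2):
    """Toy positive-sampling sketch: mark each node's k most cosine-similar
    nodes (plus itself) as positives. HGCML combines semantic AND structural
    signals on each metapath view; this simplification uses embeddings only."""
    zn = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = zn @ zn.T
    np.fill_diagonal(sim, -np.inf)          # exclude self from the top-k search
    topk = np.argsort(-sim, axis=1)[:, :k]  # k most similar other nodes
    mask = np.eye(len(z), dtype=bool)       # each node is its own positive
    rows = np.repeat(np.arange(len(z)), k)
    mask[rows, topk.ravel()] = True
    return mask

def infonce(z1, z2, pos_mask, tau=0.5):
    """InfoNCE between two views: positives given by pos_mask,
    all remaining cross-view pairs act as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = np.exp(z1 @ z2.T / tau)           # cross-view similarity matrix
    pos = (sim * pos_mask).sum(axis=1)      # mass on the positive set
    return float(np.mean(-np.log(pos / sim.sum(axis=1))))

def multiview_loss(views, pos_mask, tau=0.5):
    """Average InfoNCE over every ordered pair of metapath-induced views."""
    pairs = [(a, b) for i, a in enumerate(views)
                    for j, b in enumerate(views) if i != j]
    return sum(infonce(a, b, pos_mask, tau) for a, b in pairs) / len(pairs)

rng = np.random.default_rng(0)
views = [rng.normal(size=(8, 16)) for _ in range(3)]  # 3 metapath views, 8 nodes
mask = select_positives(views[0], k=2)
loss = multiview_loss(views, mask)
```

In practice the views would be node embeddings produced by a GNN encoder on each metapath-induced subgraph; averaging over all ordered view pairs matches the "any pair of metapath-induced views" formulation in the abstract.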
Related papers
- LAMP: Learnable Meta-Path Guided Adversarial Contrastive Learning for Heterogeneous Graphs [22.322402072526927]
Heterogeneous Graph Contrastive Learning (HGCL) usually requires pre-defined meta-paths.
LAMP integrates various meta-path sub-graphs into a unified and stable structure.
LAMP significantly outperforms existing state-of-the-art unsupervised models in terms of accuracy and robustness.
arXiv Detail & Related papers (2024-09-10T08:27:39Z)
- Topology Reorganized Graph Contrastive Learning with Mitigating Semantic Drift [28.83750578838018]
Graph contrastive learning (GCL) is an effective paradigm for node representation learning in graphs.
To increase the diversity of the contrastive views, we propose two simple and effective global topological augmentations to complement current GCL.
arXiv Detail & Related papers (2024-07-23T13:55:33Z)
- Generative-Enhanced Heterogeneous Graph Contrastive Learning [11.118517297006894]
Heterogeneous Graphs (HGs) can effectively model complex relationships in the real world via multiple types of nodes and edges.
In recent years, inspired by self-supervised learning, contrastive Heterogeneous Graph Neural Networks (HGNNs) have shown great potential by utilizing data augmentation and contrastive discriminators for downstream tasks.
We propose a novel Generative-Enhanced Heterogeneous Graph Contrastive Learning (GHGCL) model.
arXiv Detail & Related papers (2024-04-03T15:31:18Z)
- Graph-level Protein Representation Learning by Structure Knowledge Refinement [50.775264276189695]
This paper focuses on learning representation on the whole graph level in an unsupervised manner.
We propose a novel framework called Structure Knowledge Refinement (SKR) which uses data structure to determine the probability of whether a pair is positive or negative.
arXiv Detail & Related papers (2024-01-05T09:05:33Z)
- M2HGCL: Multi-Scale Meta-Path Integrated Heterogeneous Graph Contrastive Learning [16.391439666603578]
We propose a new multi-scale meta-path integrated heterogeneous graph contrastive learning (M2HGCL) model.
Specifically, we expand the meta-paths and jointly aggregate the direct neighbor information, the initial meta-path neighbor information and the expanded meta-path neighbor information.
Through extensive experiments on three real-world datasets, we demonstrate that M2HGCL outperforms the current state-of-the-art baseline models.
arXiv Detail & Related papers (2023-09-03T06:39:56Z)
- HomoGCL: Rethinking Homophily in Graph Contrastive Learning [64.85392028383164]
HomoGCL is a model-agnostic framework to expand the positive set using neighbor nodes with neighbor-specific significances.
We show that HomoGCL yields multiple state-of-the-art results across six public datasets.
arXiv Detail & Related papers (2023-06-16T04:06:52Z)
- Single-Pass Contrastive Learning Can Work for Both Homophilic and Heterophilic Graph [60.28340453547902]
Graph contrastive learning (GCL) techniques typically require two forward passes for a single instance to construct the contrastive loss.
Existing GCL approaches fail to provide strong performance guarantees.
We implement the Single-Pass Graph Contrastive Learning method (SP-GCL).
Empirically, the features learned by the SP-GCL can match or outperform existing strong baselines with significantly less computational overhead.
arXiv Detail & Related papers (2022-11-20T07:18:56Z)
- Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638]
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short).
Our algorithm builds flexible contextual representations with contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning.
arXiv Detail & Related papers (2022-10-17T07:16:17Z)
- Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to model complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z)
- Stacked Hybrid-Attention and Group Collaborative Learning for Unbiased Scene Graph Generation [62.96628432641806]
Scene Graph Generation aims to first encode the visual contents within the given image and then parse them into a compact summary graph.
We first present a novel Stacked Hybrid-Attention network, which facilitates the intra-modal refinement as well as the inter-modal interaction.
We then devise an innovative Group Collaborative Learning strategy to optimize the decoder.
arXiv Detail & Related papers (2022-03-18T09:14:13Z)
- An Empirical Study of Graph Contrastive Learning [17.246488437677616]
Graph Contrastive Learning establishes a new paradigm for learning graph representations without human annotations.
We identify several critical design considerations within a general GCL paradigm, including augmentation functions, contrasting modes, contrastive objectives, and negative mining techniques.
To foster future research and ease the implementation of GCL algorithms, we develop an easy-to-use library PyGCL, featuring modularized CL components, standardized evaluation, and experiment management.
arXiv Detail & Related papers (2021-09-02T17:43:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.