M2HGCL: Multi-Scale Meta-Path Integrated Heterogeneous Graph Contrastive
Learning
- URL: http://arxiv.org/abs/2309.01101v1
- Date: Sun, 3 Sep 2023 06:39:56 GMT
- Title: M2HGCL: Multi-Scale Meta-Path Integrated Heterogeneous Graph Contrastive
Learning
- Authors: Yuanyuan Guo, Yu Xia, Rui Wang, Rongcheng Duan, Lu Li, Jiangmeng Li
- Abstract summary: We propose a new multi-scale meta-path integrated heterogeneous graph contrastive learning (M2HGCL) model.
Specifically, we expand the meta-paths and jointly aggregate the direct neighbor information, the initial meta-path neighbor information and the expanded meta-path neighbor information.
Through extensive experiments on three real-world datasets, we demonstrate that M2HGCL outperforms the current state-of-the-art baseline models.
- Score: 16.391439666603578
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Inspired by the successful application of contrastive learning on graphs,
researchers have attempted to apply graph contrastive learning approaches to
heterogeneous information networks. Unlike homogeneous graphs, heterogeneous
graphs contain diverse types of nodes and edges, so specialized graph
contrastive learning methods are required. Most existing methods for
heterogeneous graph contrastive learning are implemented by transforming
heterogeneous graphs into homogeneous graphs, which may undermine the valuable
information carried by non-target nodes and thereby degrade the performance of
contrastive learning models. Additionally, current heterogeneous graph
contrastive learning methods are mainly based on the initial meta-paths given
by the dataset, yet our in-depth exploration yields two empirical conclusions:
the initial meta-paths alone cannot provide sufficiently discriminative
information, and incorporating various types of meta-paths can effectively
promote the performance of heterogeneous graph contrastive learning methods.
To this end, we propose a new multi-scale meta-path integrated heterogeneous
graph contrastive learning (M2HGCL) model, which discards the conventional
heterogeneity-to-homogeneity transformation and performs graph contrastive
learning in a joint manner. Specifically, we expand the meta-paths and jointly
aggregate the direct neighbor information, the initial meta-path neighbor
information, and the expanded meta-path neighbor information to sufficiently
capture discriminative information. A specific positive sampling strategy is
further applied to remedy an intrinsic deficiency of contrastive learning,
i.e., the hard negative sample sampling issue. Through extensive experiments
on three real-world datasets, we demonstrate that M2HGCL outperforms the
current state-of-the-art baseline models.
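To make the joint aggregation idea above concrete, the sketch below is our own illustration; the class and function names such as MultiScaleAggregator and info_nce are hypothetical and not taken from the M2HGCL paper or its code. It fuses pre-pooled direct-neighbor, initial meta-path, and expanded meta-path features for each target node and applies an InfoNCE-style contrastive loss over a sampled positive mask.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleAggregator(nn.Module):
    """Fuses three neighbor scales: direct, initial meta-path, expanded meta-path."""

    def __init__(self, dim):
        super().__init__()
        # One projection per neighbor scale, then a joint fusion layer.
        self.proj = nn.ModuleList([nn.Linear(dim, dim) for _ in range(3)])
        self.fuse = nn.Linear(3 * dim, dim)

    def forward(self, direct, initial_mp, expanded_mp):
        # Each input: (num_nodes, dim) pre-pooled neighbor features for one scale.
        h = [F.elu(p(x)) for p, x in zip(self.proj, (direct, initial_mp, expanded_mp))]
        return self.fuse(torch.cat(h, dim=-1))


def info_nce(anchor, candidates, pos_mask, tau=0.5):
    """InfoNCE-style loss; pos_mask[i, j] = 1 if node j is a positive of node i."""
    a = F.normalize(anchor, dim=-1)
    c = F.normalize(candidates, dim=-1)
    logits = a @ c.t() / tau
    log_prob = logits - torch.logsumexp(logits, dim=-1, keepdim=True)
    pos_per_anchor = pos_mask.sum(-1).clamp(min=1)
    return -(log_prob * pos_mask).sum(-1).div(pos_per_anchor).mean()


# Toy usage: 8 target nodes with 16-dimensional features per scale.
n, d = 8, 16
agg = MultiScaleAggregator(d)
z = agg(torch.randn(n, d), torch.randn(n, d), torch.randn(n, d))
loss = info_nce(z, z, torch.eye(n))  # trivial self-positives, for illustration only
```

The positive mask stands in for M2HGCL's positive sampling strategy; in practice it would mark structurally similar nodes rather than the trivial self-positives used here.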
Related papers
- LAMP: Learnable Meta-Path Guided Adversarial Contrastive Learning for Heterogeneous Graphs [22.322402072526927]
Heterogeneous Graph Contrastive Learning (HGCL) usually requires pre-defined meta-paths.
LAMP integrates various meta-path sub-graphs into a unified and stable structure.
LAMP significantly outperforms existing state-of-the-art unsupervised models in terms of accuracy and robustness.
arXiv Detail & Related papers (2024-09-10T08:27:39Z) - The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle states that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where GNNs' performance compared to that of graph-agnostic NNs is not satisfactory.
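As a concrete reading of the homophily principle, the edge homophily ratio is the fraction of edges whose endpoints share a label; heterophilic datasets are those where this ratio is low. The minimal helper below is our own illustration of that standard measure, not code from the handbook.

```python
def edge_homophily(edges, labels):
    """Fraction of edges whose endpoints share a label (1.0 = fully homophilic).

    edges:  iterable of (u, v) node-index pairs
    labels: sequence where labels[i] is the class of node i
    """
    edges = list(edges)
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges) if edges else 0.0


# Toy graph: nodes 0-3 with labels A, A, B, B -> 2 of 3 edges are homophilic.
print(edge_homophily([(0, 1), (1, 2), (2, 3)], ["A", "A", "B", "B"]))  # 0.666...
```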
arXiv Detail & Related papers (2024-07-12T18:04:32Z) - Single-Pass Contrastive Learning Can Work for Both Homophilic and
Heterophilic Graph [60.28340453547902]
Graph contrastive learning (GCL) techniques typically require two forward passes for a single instance to construct the contrastive loss.
Existing GCL approaches fail to provide strong performance guarantees.
We implement the Single-Pass Graph Contrastive Learning method (SP-GCL).
Empirically, the features learned by the SP-GCL can match or outperform existing strong baselines with significantly less computational overhead.
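A rough, heavily simplified illustration of the single-pass idea is sketched below: each node's embedding from one forward pass is contrasted against a neighborhood-averaged view computed from the same embeddings, so no augmented second view or second pass is needed. This is our own toy formulation, not the exact SP-GCL objective.

```python
import torch
import torch.nn.functional as F


def single_pass_loss(z, adj, tau=0.5):
    """Contrast each node against its neighborhood mean from a single forward pass.

    z:   (N, d) node embeddings from one encoder pass
    adj: (N, N) dense 0/1 adjacency matrix; no augmented second view is used
    """
    deg = adj.sum(-1, keepdim=True).clamp(min=1)
    pos = adj @ z / deg                        # neighborhood-averaged "positive" view
    z, pos = F.normalize(z, dim=-1), F.normalize(pos, dim=-1)
    logits = z @ pos.t() / tau                 # similarity of each node to all positives
    targets = torch.arange(z.size(0))          # each node should match its own positive
    return F.cross_entropy(logits, targets)


# Toy usage on a 4-node path graph.
adj = torch.tensor([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=torch.float)
loss = single_pass_loss(torch.randn(4, 8), adj)
```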
arXiv Detail & Related papers (2022-11-20T07:18:56Z) - Heterogeneous Graph Contrastive Multi-view Learning [11.489983916543805]
Graph contrastive learning (GCL) has been developed to learn discriminative node representations on graph datasets.
We propose a novel Heterogeneous Graph Contrastive Multi-view Learning (HGCML) model.
HGCML consistently outperforms state-of-the-art baselines on five real-world benchmark datasets.
arXiv Detail & Related papers (2022-10-01T10:53:48Z) - Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from the Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to model complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
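The two geometric perspectives can be illustrated by comparing the ordinary Euclidean distance with the Poincaré-ball distance used for hyperbolic space; the helper below is our own sketch (curvature -1), not GCL's actual formulation.

```python
import torch


def euclidean_distance(x, y):
    return torch.norm(x - y, dim=-1)


def poincare_distance(x, y, eps=1e-6):
    """Distance in the Poincare ball model of hyperbolic space (curvature -1)."""
    sq = torch.sum((x - y) ** 2, dim=-1)
    nx = torch.sum(x ** 2, dim=-1).clamp(max=1 - eps)  # keep points inside the ball
    ny = torch.sum(y ** 2, dim=-1).clamp(max=1 - eps)
    return torch.acosh(1 + 2 * sq / ((1 - nx) * (1 - ny)))


# Two points near the ball boundary: Euclidean gap is small,
# hyperbolic distance is much larger.
x = torch.tensor([0.90, 0.0])
y = torch.tensor([0.95, 0.0])
print(euclidean_distance(x, y), poincare_distance(x, y))
```

Near the boundary of the ball, a small Euclidean gap corresponds to a large hyperbolic distance, which is what makes hyperbolic space well suited to hierarchical, tree-like structures.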
arXiv Detail & Related papers (2022-06-25T03:54:53Z) - Heterogeneous Graph Neural Networks using Self-supervised Reciprocally
Contrastive Learning [102.9138736545956]
Heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop for the first time a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views on respective guidance of node attributes and graph topologies.
In this new approach, we adopt distinct but most suitable attribute and topology fusion mechanisms in the two views, which are conducive to mining relevant information in attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z) - Cross-view Self-Supervised Learning on Heterogeneous Graph Neural
Network via Bootstrapping [0.0]
Heterogeneous graph neural networks can effectively represent the information in heterogeneous graphs.
In this paper, we introduce a method that can generate good representations without generating a large number of pairs.
The proposed model shows state-of-the-art performance compared with other methods on various real-world datasets.
arXiv Detail & Related papers (2022-01-10T13:36:05Z) - SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
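The three stages named above can be pictured with the schematic below (our own sketch; the module names and the simple mean pooling and softmax attention are assumptions, not SHGNN's actual implementation): propagate features along each meta-path instance, aggregate the instances of one meta-path with attention, then fuse the per-meta-path summaries.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MetaPathPipeline(nn.Module):
    """Schematic three-stage flow: propagate -> aggregate instances -> fuse meta-paths."""

    def __init__(self, dim):
        super().__init__()
        self.att_instance = nn.Linear(dim, 1)   # scores instances within a meta-path
        self.att_semantic = nn.Linear(dim, 1)   # scores meta-paths against each other

    def propagate(self, instance_feats):
        # Stage 1: fold intermediate-node features of each instance (mean pooling here).
        # instance_feats: (num_instances, path_length, dim)
        return instance_feats.mean(dim=1)

    def aggregate_instances(self, inst):
        # Stage 2: attention over the instances of one meta-path -> (dim,)
        w = F.softmax(self.att_instance(inst), dim=0)
        return (w * inst).sum(dim=0)

    def forward(self, metapath_instances):
        # Stage 3: fuse per-meta-path summaries with semantic attention.
        per_path = torch.stack([self.aggregate_instances(self.propagate(x))
                                for x in metapath_instances])
        w = F.softmax(self.att_semantic(per_path), dim=0)
        return (w * per_path).sum(dim=0)


# Toy usage: two meta-paths with 5 and 3 instances of length 3, feature dim 8.
model = MetaPathPipeline(8)
out = model([torch.randn(5, 3, 8), torch.randn(3, 3, 8)])  # -> tensor of shape (8,)
```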
arXiv Detail & Related papers (2021-12-12T14:18:18Z) - Weakly-supervised Graph Meta-learning for Few-shot Node Classification [53.36828125138149]
We propose a new graph meta-learning framework -- Graph Hallucination Networks (Meta-GHN)
Based on a new robustness-enhanced episodic training, Meta-GHN is meta-learned to hallucinate clean node representations from weakly-labeled data.
Extensive experiments demonstrate the superiority of Meta-GHN over existing graph meta-learning studies.
arXiv Detail & Related papers (2021-06-12T22:22:10Z) - Hybrid Micro/Macro Level Convolution for Heterogeneous Graph Learning [45.14314180743549]
Heterogeneous graphs are pervasive in practical scenarios, where each graph consists of multiple types of nodes and edges.
Most of the existing graph convolution approaches were designed for homogeneous graphs, and therefore cannot handle heterogeneous graphs.
We propose HGConv, a novel Heterogeneous Graph Convolution approach, to learn comprehensive node representations on heterogeneous graphs.
arXiv Detail & Related papers (2020-12-29T12:12:37Z)