Heterogeneous Graph Masked Contrastive Learning for Robust Recommendation
- URL: http://arxiv.org/abs/2505.24172v1
- Date: Fri, 30 May 2025 03:32:26 GMT
- Title: Heterogeneous Graph Masked Contrastive Learning for Robust Recommendation
- Authors: Lei Sang, Yu Wang, Yiwen Zhang
- Abstract summary: We introduce a novel model, named Masked Contrastive Learning (MCL), to enhance recommendation robustness to noise. MCL employs a random masking strategy to augment the graph via meta-paths, reducing node sensitivity to specific neighbors and bolstering embedding robustness. Empirical evaluations on three real-world datasets confirm the superiority of our approach over existing recommendation methods.
- Score: 8.711556540753774
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous graph neural networks (HGNNs) have demonstrated their superiority in exploiting auxiliary information for recommendation tasks. However, the graphs constructed from meta-paths in HGNNs are usually too dense and contain a large number of noisy edges. The propagation mechanism of HGNNs spreads even small amounts of noise in a graph to distant neighboring nodes, thereby affecting numerous node embeddings. To address this limitation, we introduce a novel model, named Masked Contrastive Learning (MCL), to enhance recommendation robustness to noise. MCL employs a random masking strategy to augment the meta-path-based graph, reducing node sensitivity to specific neighbors and bolstering embedding robustness. Furthermore, MCL performs cross-view contrastive learning on a Heterogeneous Information Network (HIN) from two perspectives: one-hop neighbors and meta-path neighbors. This approach yields embeddings that simultaneously capture both local and high-order structure for recommendation. Empirical evaluations on three real-world datasets confirm the superiority of our approach over existing recommendation methods.
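To make the abstract's two core mechanisms concrete, the minimal sketch below illustrates (a) random masking of edges in a dense meta-path graph as a graph augmentation and (b) an InfoNCE-style cross-view contrastive loss that treats a node's one-hop-view embedding and meta-path-view embedding as a positive pair. The function names, mask rate, temperature, and tensor shapes are illustrative assumptions, not the authors' actual MCL implementation.

```python
# Minimal sketch (assumed names/shapes) of the two ideas described in the abstract:
# (1) random edge masking on a dense meta-path graph, (2) cross-view InfoNCE loss.
import torch
import torch.nn.functional as F


def mask_edges(edge_index: torch.Tensor, mask_rate: float = 0.3) -> torch.Tensor:
    """Randomly drop a fraction of edges; edge_index is a [2, E] index tensor."""
    num_edges = edge_index.size(1)
    keep = torch.rand(num_edges) >= mask_rate  # Bernoulli keep mask per edge
    return edge_index[:, keep]


def infonce_cross_view(z_onehop: torch.Tensor,
                       z_metapath: torch.Tensor,
                       temperature: float = 0.2) -> torch.Tensor:
    """InfoNCE loss: the same node's embeddings in the two views are positives."""
    z1 = F.normalize(z_onehop, dim=-1)
    z2 = F.normalize(z_metapath, dim=-1)
    logits = z1 @ z2.t() / temperature        # [N, N] cross-view similarities
    targets = torch.arange(z1.size(0))        # positives lie on the diagonal
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Toy example: 5 nodes, 40 meta-path edges, 16-dimensional embeddings.
    edge_index = torch.randint(0, 5, (2, 40))
    print("kept edges:", mask_edges(edge_index).size(1), "of", edge_index.size(1))
    loss = infonce_cross_view(torch.randn(5, 16), torch.randn(5, 16))
    print("cross-view contrastive loss:", loss.item())
```

In a full model, the two embedding matrices would come from separate encoders over the one-hop and meta-path views of the HIN, and the contrastive term would be combined with the recommendation objective; the above only sketches the augmentation and loss shapes.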
Related papers
- NLGCL: Naturally Existing Neighbor Layers Graph Contrastive Learning for Recommendation [13.817514358795982]
We propose NLGCL, a novel contrastive learning framework for Graph Neural Networks (GNNs). By treating each node and its neighbors in the next layer as positive pairs, and other nodes as negatives, NLGCL avoids augmentation-based noise while preserving semantic relevance. This paradigm eliminates costly view construction and storage, making it computationally efficient and practical for real-world scenarios.
arXiv Detail & Related papers (2025-07-10T08:12:39Z) - Multi-Granular Attention based Heterogeneous Hypergraph Neural Network [5.580244361093485]
Heterogeneous graph neural networks (HeteGNNs) have demonstrated strong abilities to learn node representations. This paper proposes MGA-HHN, a Multi-Granular Attention based Heterogeneous Hypergraph Neural Network for representation learning.
arXiv Detail & Related papers (2025-05-07T11:42:00Z) - Cluster-based Graph Collaborative Filtering [55.929052969825825]
Graph Convolution Networks (GCNs) have succeeded in learning user and item representations for recommendation systems.
Most existing GCN-based methods overlook the multiple interests of users while performing high-order graph convolution.
We propose a novel GCN-based recommendation model, termed Cluster-based Graph Collaborative Filtering (ClusterGCF).
arXiv Detail & Related papers (2024-04-16T07:05:16Z) - Structural Imbalance Aware Graph Augmentation Learning [2.793446335600599]
Graphs are often structurally imbalanced, that is, only a few hub nodes have a denser local structure and higher influence.
This paper proposes a selective graph augmentation method (SAug) to solve this problem.
Extensive experiments demonstrate that SAug can significantly improve the backbone GNNs and achieve superior performance to its competitors.
arXiv Detail & Related papers (2023-03-24T02:13:32Z) - Mixed Graph Contrastive Network for Semi-Supervised Node Classification [63.924129159538076]
We propose a novel graph contrastive learning method, termed Mixed Graph Contrastive Network (MGCN). In our method, we improve the discriminative capability of the latent embeddings by an unperturbed augmentation strategy and a correlation reduction mechanism. By combining the two settings, we extract rich supervision information from both the abundant nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z) - Spatial Autoregressive Coding for Graph Neural Recommendation [38.66151035948021]
Shallow models and deep Graph Neural Networks (GNNs) fail to adequately exploit neighbor proximity in sampled subgraphs or sequences.
In this paper, we propose a novel framework SAC, namely Spatial Autoregressive Coding, to solve the above problems in a unified way.
Experimental results on both public recommendation datasets and a real scenario web-scale dataset demonstrate the superiority of SAC compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-05-19T12:00:01Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Detecting Communities from Heterogeneous Graphs: A Context Path-based Graph Neural Network Model [23.525079144108567]
We build a Context Path-based Graph Neural Network (CP-GNN) model.
It embeds the high-order relationship between nodes into the node embedding.
It outperforms the state-of-the-art community detection methods.
arXiv Detail & Related papers (2021-09-05T12:28:00Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
The majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.