Hetero$^2$Net: Heterophily-aware Representation Learning on
Heterogeneous Graphs
- URL: http://arxiv.org/abs/2310.11664v1
- Date: Wed, 18 Oct 2023 02:19:12 GMT
- Title: Hetero$^2$Net: Heterophily-aware Representation Learning on
Heterogeneous Graphs
- Authors: Jintang Li, Zheng Wei, Jiawang Dan, Jing Zhou, Yuchang Zhu, Ruofan Wu,
Baokun Wang, Zhang Zhen, Changhua Meng, Hong Jin, Zibin Zheng, Liang Chen
- Abstract summary: We present Hetero$^2$Net, a heterophily-aware HGNN that incorporates both masked metapath prediction and masked label prediction tasks.
We evaluate the performance of Hetero$^2$Net on five real-world heterogeneous graph benchmarks with varying levels of heterophily.
- Score: 38.858702539146385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world graphs are typically complex, exhibiting heterogeneity in the
global structure, as well as strong heterophily within local neighborhoods.
While a growing body of literature has revealed the limitations of common graph
neural networks (GNNs) in handling homogeneous graphs with heterophily, little
work has been conducted on investigating the heterophily properties in the
context of heterogeneous graphs. To bridge this research gap, we identify the
heterophily in heterogeneous graphs using metapaths and propose two practical
metrics to quantitatively describe the levels of heterophily. Through in-depth
investigations on several real-world heterogeneous graphs exhibiting varying
levels of heterophily, we have observed that heterogeneous graph neural
networks (HGNNs), which inherit many mechanisms from GNNs designed for
homogeneous graphs, fail to generalize to heterogeneous graphs with heterophily
or a low level of homophily. To address this challenge, we present Hetero$^2$Net,
a heterophily-aware HGNN that incorporates both masked metapath prediction and
masked label prediction tasks to effectively and flexibly handle both
homophilic and heterophilic heterogeneous graphs. We evaluate the performance
of Hetero$^2$Net on five real-world heterogeneous graph benchmarks with varying
levels of heterophily. The results demonstrate that Hetero$^2$Net outperforms
strong baselines in the semi-supervised node classification task, providing
valuable insights into effectively handling more complex heterogeneous graphs.
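As an illustration of how a metapath-based heterophily metric can be computed, the minimal sketch below measures the fraction of metapath-connected node pairs that share a label. The two metrics actually proposed in the paper are not reproduced here; the function names (`metapath_neighbors`, `metapath_homophily`) and the toy graph are purely illustrative.

```python
from collections import defaultdict

def metapath_neighbors(edges_by_type, metapath):
    """Return (start, end) pairs of nodes connected by the given metapath.

    edges_by_type: dict mapping an edge type (e.g. 'author-paper') to a list
                   of (src, dst) tuples.
    metapath:      sequence of edge types, e.g. ['author-paper', 'paper-author'].
    """
    # Start from every edge of the first type, then extend the path hop by hop.
    frontier = defaultdict(set)
    for src, dst in edges_by_type[metapath[0]]:
        frontier[src].add(dst)
    for etype in metapath[1:]:
        adj = defaultdict(set)
        for src, dst in edges_by_type[etype]:
            adj[src].add(dst)
        nxt = defaultdict(set)
        for start, mids in frontier.items():
            for mid in mids:
                nxt[start] |= adj[mid]
        frontier = nxt
    return [(s, t) for s, ends in frontier.items() for t in ends if s != t]

def metapath_homophily(edges_by_type, metapath, labels):
    """Fraction of metapath-connected node pairs sharing a label (illustrative metric)."""
    pairs = [(s, t) for s, t in metapath_neighbors(edges_by_type, metapath)
             if s in labels and t in labels]
    if not pairs:
        return float('nan')
    same = sum(labels[s] == labels[t] for s, t in pairs)
    return same / len(pairs)

# Toy example: author-paper-author (APA) metapath on a tiny academic graph.
edges = {
    'author-paper': [('a1', 'p1'), ('a2', 'p1'), ('a3', 'p2')],
    'paper-author': [('p1', 'a1'), ('p1', 'a2'), ('p2', 'a3')],
}
labels = {'a1': 'ML', 'a2': 'ML', 'a3': 'DB'}
print(metapath_homophily(edges, ['author-paper', 'paper-author'], labels))  # 1.0
```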
Related papers
- Addressing Heterogeneity and Heterophily in Graphs: A Heterogeneous Heterophilic Spectral Graph Neural Network [48.05273145974434]
We propose a Heterogeneous Heterophilic Spectral Graph Neural Network (H2SGNN).
H2SGNN employs a dual-module approach: local independent filtering and global hybrid filtering.
Extensive empirical evaluations on four real-world datasets demonstrate the superiority of H2SGNN compared to state-of-the-art methods.
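The summary above only names the two modules, so the sketch below is an assumption-laden illustration of what "local independent filtering plus global hybrid filtering" could look like: one polynomial graph filter applied to each relation-induced subgraph independently, plus one filter applied to the merged graph, with the outputs fused by concatenation. The filter form, the fusion, and all function names are hypothetical and not taken from the H2SGNN paper.

```python
import numpy as np

def poly_filter(adj, x, thetas):
    """Simple polynomial graph filter: sum_k thetas[k] * (D^-1/2 A D^-1/2)^k x."""
    deg = np.maximum(adj.sum(axis=1), 1.0).astype(float)  # clamp to avoid divide-by-zero
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    out, h = thetas[0] * x, x
    for theta in thetas[1:]:
        h = a_norm @ h
        out = out + theta * h
    return out

def dual_module_filtering(rel_adjs, x, local_thetas, global_thetas):
    """Illustrative 'local independent + global hybrid' filtering (not the official H2SGNN code).

    rel_adjs: dict of relation name -> dense adjacency matrix over the same node set.
    """
    # Local module: filter each relation-induced subgraph independently, then average.
    local = np.mean([poly_filter(a, x, local_thetas) for a in rel_adjs.values()], axis=0)
    # Global module: filter the union graph that merges all relations.
    merged = np.clip(sum(rel_adjs.values()), 0, 1)
    global_out = poly_filter(merged, x, global_thetas)
    return np.concatenate([local, global_out], axis=1)  # fuse the two views by concatenation

# Toy usage: 4 nodes, 2 relations, 3-dimensional features.
rng = np.random.default_rng(0)
adjs = {'cites': rng.integers(0, 2, (4, 4)), 'writes': rng.integers(0, 2, (4, 4))}
z = dual_module_filtering(adjs, rng.normal(size=(4, 3)), [0.5, 0.5], [1.0, -0.5])
print(z.shape)  # (4, 6)
```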
arXiv Detail & Related papers (2024-10-17T09:23:53Z)
- When Heterophily Meets Heterogeneous Graphs: Latent Graphs Guided Unsupervised Representation Learning [6.2167203720326025]
Unsupervised heterogeneous graph representation learning (UHGRL) has gained increasing attention due to its significance in handling practical graphs without labels.
We define semantic heterophily and propose an innovative framework called Latent Graphs Guided Unsupervised Representation Learning (LatGRL) to handle this problem.
arXiv Detail & Related papers (2024-09-01T10:25:06Z)
- When Heterophily Meets Heterogeneity: New Graph Benchmarks and Effective Methods [20.754843684170034]
H2GB is a novel graph benchmark that brings together the complexities of both heterophily and heterogeneity in graphs.
Our benchmark encompasses 9 diverse real-world datasets across 5 domains, 28 baseline model implementations, and 26 benchmark results.
We present a modular graph transformer framework UnifiedGT and a new model variant, H2G-former, that excels at this challenging benchmark.
arXiv Detail & Related papers (2024-07-15T17:18:42Z)
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle states that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where the performance of GNNs relative to NNs is not satisfactory.
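For reference, a standard quantity in this literature (a common definition from prior work, not a formula specific to the handbook) is the edge homophily ratio of a graph $G=(V,E)$ with node labels $y$:
$$h_{\mathrm{edge}}(G)=\frac{\lvert\{(u,v)\in E : y_u = y_v\}\rvert}{\lvert E\rvert},$$
which is close to 1 for homophilic graphs and close to 0 for heterophilic ones.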
arXiv Detail & Related papers (2024-07-12T18:04:32Z)
- Demystifying Structural Disparity in Graph Neural Networks: Can One Size Fit All? [61.35457647107439]
Most real-world homophilic and heterophilic graphs are comprised of a mixture of nodes in both homophilic and heterophilic structural patterns.
We provide evidence that graph neural networks (GNNs) on node classification typically perform admirably on homophilic nodes.
We then propose a rigorous, non-i.i.d. PAC-Bayesian generalization bound for GNNs, revealing reasons for the performance disparity.
arXiv Detail & Related papers (2023-06-02T07:46:20Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural networks (HGNNs) are a popular technique for modeling and analyzing heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, HGCL, which introduces two views guided by node attributes and graph topologies, respectively.
The two views adopt distinct attribute and topology fusion mechanisms suited to mining the relevant information in attributes and topologies separately.
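HGCL's actual objective is not given in the summary above; the following sketch only shows a generic two-view InfoNCE-style contrastive loss between attribute-guided and topology-guided node embeddings, to make the "two views" idea concrete. The encoder outputs, names, and temperature value are illustrative.

```python
import numpy as np

def info_nce(z_attr, z_topo, temperature=0.5):
    """Generic two-view contrastive (InfoNCE) loss: each node's attribute-view embedding
    is pulled toward its own topology-view embedding and pushed away from all others.
    This is a standard formulation, not necessarily HGCL's exact objective."""
    # L2-normalize both views so dot products are cosine similarities.
    z_attr = z_attr / np.linalg.norm(z_attr, axis=1, keepdims=True)
    z_topo = z_topo / np.linalg.norm(z_topo, axis=1, keepdims=True)
    logits = z_attr @ z_topo.T / temperature          # pairwise cross-view similarities
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives are the matching node in the other view (the diagonal).
    return -np.mean(np.diag(log_prob))

# Toy usage: 5 nodes, 8-dimensional embeddings from two hypothetical encoders.
rng = np.random.default_rng(0)
loss = info_nce(rng.normal(size=(5, 8)), rng.normal(size=(5, 8)))
print(float(loss))
```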
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Hybrid Micro/Macro Level Convolution for Heterogeneous Graph Learning [45.14314180743549]
Heterogeneous graphs are pervasive in practical scenarios, where each graph consists of multiple types of nodes and edges.
Most of the existing graph convolution approaches were designed for homogeneous graphs, and therefore cannot handle heterogeneous graphs.
We propose HGConv, a novel Heterogeneous Graph Convolution approach, to learn comprehensive node representations on heterogeneous graphs.
arXiv Detail & Related papers (2020-12-29T12:12:37Z)
- Heterogeneous Graph Transformer [49.675064816860505]
We present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs.
To handle dynamic heterogeneous graphs, we introduce the relative temporal encoding technique into HGT.
To handle Web-scale graph data, we design the heterogeneous mini-batch graph sampling algorithm, HGSampling, for efficient and scalable training.
arXiv Detail & Related papers (2020-03-03T04:49:21Z)
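As a rough sketch of the relative temporal encoding idea mentioned in the HGT entry above, the snippet below applies the familiar sinusoidal positional-encoding recipe to the time gap between two interacting nodes. HGT's actual encoding additionally learns a projection on top of this and differs in details; this is a simplified, assumed formulation.

```python
import numpy as np

def relative_temporal_encoding(delta_t, dim=16, base=10000.0):
    """Sinusoidal encoding of the time difference between a source and target node.

    A simplified stand-in for HGT's relative temporal encoding: even dimensions use
    sine and odd dimensions cosine of delta_t scaled by geometric frequencies.
    dim must be even.
    """
    positions = np.arange(0, dim, 2, dtype=float)
    freqs = 1.0 / (base ** (positions / dim))
    enc = np.empty(dim)
    enc[0::2] = np.sin(delta_t * freqs)
    enc[1::2] = np.cos(delta_t * freqs)
    return enc

# The encoding would typically be added to the source node's representation
# before message passing; here we just inspect it for a 3-step time gap.
print(relative_temporal_encoding(delta_t=3.0, dim=8))
```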