Adaptive Heterogeneous Graph Neural Networks: Bridging Heterophily and Heterogeneity
- URL: http://arxiv.org/abs/2508.06034v1
- Date: Fri, 08 Aug 2025 05:39:58 GMT
- Title: Adaptive Heterogeneous Graph Neural Networks: Bridging Heterophily and Heterogeneity
- Authors: Qin Chen, Guojie Song
- Abstract summary: Heterogeneous graphs (HGs) are common in real-world scenarios and often exhibit heterophily. We propose the Adaptive Heterogeneous Graph Neural Network (AHGNN) to tackle these challenges. AHGNN employs a heterophily-aware convolution that accounts for heterophily distributions specific to both hops and meta-paths.
- Score: 20.67252453378065
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Heterogeneous graphs (HGs) are common in real-world scenarios and often exhibit heterophily. However, most existing studies focus on either heterogeneity or heterophily in isolation, overlooking the prevalence of heterophilic HGs in practical applications. This oversight degrades their performance. In this work, we first identify two main challenges in modeling heterophilic HGs: (1) varying heterophily distributions across hops and meta-paths; (2) the intricate and often heterophily-driven diversity of semantic information across different meta-paths. Then, we propose the Adaptive Heterogeneous Graph Neural Network (AHGNN) to tackle these challenges. AHGNN employs a heterophily-aware convolution that accounts for heterophily distributions specific to both hops and meta-paths. It then integrates messages from diverse semantic spaces using a coarse-to-fine attention mechanism, which filters out noise and emphasizes informative signals. Experiments on seven real-world graphs and twenty baselines demonstrate the superior performance of AHGNN, particularly in high-heterophily situations.
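The hop- and meta-path-specific aggregation described in the abstract can be illustrated with a toy sketch that mixes low-pass (smoothed) and high-pass (difference) message components per hop and per meta-path. Everything below is an illustrative assumption, not the authors' implementation: the function name, the fixed per-hop weight pairs, and the plain mean fusion standing in for the paper's coarse-to-fine attention.

```python
import numpy as np

def heterophily_aware_conv(X, adjs, hop_weights):
    """Toy hop- and meta-path-specific aggregation (illustrative only).

    X           : (n, d) node features
    adjs        : dict meta_path -> (n, n) row-normalized adjacency matrix
    hop_weights : dict meta_path -> list of (w_low, w_high) pairs, one per hop
    """
    outputs = []
    for mp, A in adjs.items():
        H = X
        mixed = np.zeros_like(X)
        for w_low, w_high in hop_weights[mp]:
            H = A @ H                 # propagate one more hop
            low = H                   # low-pass part: neighborhood average
            high = X - H              # high-pass part: deviation from neighbors
            mixed += w_low * low + w_high * high
        outputs.append(mixed)
    # plain mean fusion across meta-paths (stand-in for attention)
    return np.mean(outputs, axis=0)
```

A high-heterophily meta-path would get a larger `w_high`, so dissimilar-neighbor signals are preserved rather than averaged away.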
Related papers
- Enhancing Homophily-Heterophily Separation: Relation-Aware Learning in Heterogeneous Graphs [14.452589880736523]
We propose Relation-Aware Separation of Homophily and Heterophily (RASH), a novel contrastive learning framework. RASH explicitly models high-order semantics of heterogeneous interactions and adaptively separates homophilic and heterophilic patterns. A multi-relation contrastive loss is designed to align heterogeneous and homophilic/heterophilic views by maximizing mutual information.
arXiv Detail & Related papers (2025-06-26T03:54:06Z)
- Raising the Bar in Graph OOD Generalization: Invariant Learning Beyond Explicit Environment Modeling [61.222803636132554]
Real-world graph data often exhibit diverse and shifting environments that traditional models fail to generalize across. We propose a novel method termed Multi-Prototype Hyperspherical Invariant Learning (MPHIL). MPHIL achieves state-of-the-art performance, significantly outperforming existing methods across graph data from various domains and with different distribution shifts.
arXiv Detail & Related papers (2025-02-15T07:40:14Z)
- Homophily-aware Heterogeneous Graph Contrastive Learning [23.38883104104888]
We propose a novel heterogeneous graph contrastive learning framework, termed HGMS, to learn homophilous node representations. Specifically, we design a heterogeneous edge dropping augmentation strategy that enhances the homophily of augmented views. In practice, we develop two approaches to solve the self-expressive matrix.
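An edge-dropping augmentation that raises a view's homophily could, for illustration, preferentially remove low-similarity edges. The feature-cosine heuristic and quantile cutoff below are our assumptions for a minimal sketch, not HGMS's actual strategy.

```python
import numpy as np

def drop_low_similarity_edges(edges, features, drop_frac=0.5):
    """Keep the edges whose endpoint features are most similar, dropping
    roughly a `drop_frac` fraction of likely-heterophilic edges.
    Illustrative sketch only."""
    sims = []
    for u, v in edges:
        x, y = features[u], features[v]
        sims.append(float(x @ y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))
    cutoff = np.quantile(sims, drop_frac)
    # retain edges at or above the similarity cutoff
    return [e for e, s in zip(edges, sims) if s >= cutoff]
```

The surviving edges form a more homophilous augmented view, which a contrastive objective can then align with the original graph.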
arXiv Detail & Related papers (2025-01-15T02:56:50Z)
- THeGCN: Temporal Heterophilic Graph Convolutional Network [51.25112923442657]
We propose the Temporal Heterophilic Graph Convolutional Network (THeGCN) to accurately capture both edge (spatial) heterophily and temporal heterophily. The THeGCN model consists of two key components: a sampler and an aggregator. Extensive experiments conducted on 5 real-world datasets validate the efficacy of THeGCN.
arXiv Detail & Related papers (2024-12-21T01:52:03Z)
- Addressing Graph Heterogeneity and Heterophily from A Spectral Perspective [46.37860909753809]
Heterogeneity refers to a graph with multiple types of nodes or edges, while heterophily refers to the fact that connected nodes are more likely to have dissimilar attributes or labels. We propose a Heterogeneous Heterophilic Spectral Graph Neural Network (H2SGNN), which employs two modules: local independent filtering and global hybrid filtering. Extensive experiments are conducted on four datasets to validate the effectiveness of the proposed H2SGNN.
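The low-pass/high-pass intuition behind spectral approaches of this kind can be sketched with simple filters built from the symmetric normalized Laplacian; the single `alpha` mixing parameter below is a generic illustration, not H2SGNN's actual filter design.

```python
import numpy as np

def spectral_filters(A, X, alpha=0.5):
    """Apply toy low-pass and high-pass graph filters to features X.

    Uses the symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
    Illustrative sketch only.
    """
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    A_norm = D_inv_sqrt @ A @ D_inv_sqrt
    L = np.eye(len(A)) - A_norm
    low = (np.eye(len(A)) - alpha * L) @ X   # smooths features (suits homophily)
    high = (alpha * L) @ X                   # sharpens differences (suits heterophily)
    return low, high
```

By construction the two outputs sum back to `X`, so a model can learn how much of each band to keep per relation type.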
arXiv Detail & Related papers (2024-10-17T09:23:53Z)
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle holds that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where GNNs' performance compared to NNs' is not satisfactory.
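A concrete quantity behind this discussion is the widely used edge homophily ratio: the fraction of edges whose endpoints share a label. The function name below is ours, but the metric itself is standard.

```python
def edge_homophily(edges, labels):
    """Fraction of edges whose two endpoints have the same label.

    edges  : iterable of (u, v) node-id pairs
    labels : sequence mapping node id -> class label
    """
    edges = list(edges)
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)
```

A ratio near 1 indicates a homophilic graph; values well below 0.5 mark the heterophilic regime where classic message passing tends to struggle.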
arXiv Detail & Related papers (2024-07-12T18:04:32Z)
- Heterophilous Distribution Propagation for Graph Neural Networks [23.897535976924722]
We propose heterophilous distribution propagation (HDP) for graph neural networks.
Instead of aggregating information from all neighbors, HDP adaptively separates the neighbors into homophilous and heterophilous parts.
We conduct extensive experiments on 9 benchmark datasets with different levels of homophily.
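One way such a homophilous/heterophilous split could be approximated, for illustration, is by comparing predicted label distributions of a node and its neighbors. The cosine-similarity test and fixed threshold below are our assumptions for a toy sketch, not HDP's actual mechanism.

```python
import numpy as np

def split_neighbors(node, neighbors, soft_labels, threshold=0.5):
    """Split a node's neighbors into homophilous / heterophilous parts
    by cosine similarity of predicted label distributions.
    Illustrative sketch only."""
    p = soft_labels[node]
    homo, hetero = [], []
    for v in neighbors:
        q = soft_labels[v]
        sim = float(p @ q) / (np.linalg.norm(p) * np.linalg.norm(q) + 1e-12)
        (homo if sim >= threshold else hetero).append(v)
    return homo, hetero
```

Messages from the two parts can then be aggregated with separate (even sign-flipped) weights instead of a single average.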
arXiv Detail & Related papers (2024-05-31T06:40:56Z)
- Hetero$^2$Net: Heterophily-aware Representation Learning on Heterogeneous Graphs [38.858702539146385]
We present Hetero$^2$Net, a heterophily-aware HGNN that incorporates both masked metapath prediction and masked label prediction tasks.
We evaluate the performance of Hetero$^2$Net on five real-world heterogeneous graph benchmarks with varying levels of heterophily.
arXiv Detail & Related papers (2023-10-18T02:19:12Z)
- Demystifying Structural Disparity in Graph Neural Networks: Can One Size Fit All? [61.35457647107439]
Most real-world homophilic and heterophilic graphs are comprised of a mixture of nodes in both homophilic and heterophilic structural patterns.
We provide evidence that Graph Neural Networks (GNNs) on node classification typically perform admirably on homophilic nodes.
We then propose a rigorous, non-i.i.d. PAC-Bayesian generalization bound for GNNs, revealing reasons for the performance disparity.
arXiv Detail & Related papers (2023-06-02T07:46:20Z)
- Auto-HeG: Automated Graph Neural Network on Heterophilic Graphs [62.665761463233736]
We propose an automated graph neural network on heterophilic graphs, namely Auto-HeG, to automatically build heterophilic GNN models.
Specifically, Auto-HeG incorporates heterophily into all stages of automatic heterophilic graph learning, including search space design, supernet training, and architecture selection.
arXiv Detail & Related papers (2023-02-23T22:49:56Z)
- Heterophily-Aware Graph Attention Network [42.640057865981156]
Graph Neural Networks (GNNs) have shown remarkable success in graph representation learning.
Existing heterophilic GNNs tend to ignore edge-level heterophily, which is also a vital part of tackling the heterophily problem.
We propose a novel Heterophily-Aware Graph Attention Network (HA-GAT) by fully exploring and utilizing the local distribution as the underlying heterophily.
arXiv Detail & Related papers (2023-02-07T03:21:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.