Addressing Heterogeneity and Heterophily in Graphs: A Heterogeneous Heterophilic Spectral Graph Neural Network
- URL: http://arxiv.org/abs/2410.13373v1
- Date: Thu, 17 Oct 2024 09:23:53 GMT
- Title: Addressing Heterogeneity and Heterophily in Graphs: A Heterogeneous Heterophilic Spectral Graph Neural Network
- Authors: Kangkang Lu, Yanhua Yu, Zhiyong Huang, Jia Li, Yuling Wang, Meiyu Liang, Xiting Qin, Yimeng Ren, Tat-Seng Chua, Xidian Wang
- Abstract summary: We propose a Heterogeneous Heterophilic Spectral Graph Neural Network (H2SGNN)
H2SGNN employs a dual-module approach: local independent filtering and global hybrid filtering.
Extensive empirical evaluations on four real-world datasets demonstrate the superiority of H2SGNN compared to state-of-the-art methods.
- Score: 48.05273145974434
- Abstract: Graph Neural Networks (GNNs) have garnered significant scholarly attention for their powerful capabilities in modeling graph structures. Despite this, two primary challenges persist: heterogeneity and heterophily. Existing studies often address heterogeneous and heterophilic graphs separately, leaving a research gap in the understanding of heterogeneous heterophilic graphs-those that feature diverse node or relation types with dissimilar connected nodes. To address this gap, we investigate the application of spectral graph filters within heterogeneous graphs. Specifically, we propose a Heterogeneous Heterophilic Spectral Graph Neural Network (H2SGNN), which employs a dual-module approach: local independent filtering and global hybrid filtering. The local independent filtering module applies polynomial filters to each subgraph independently to adapt to different homophily, while the global hybrid filtering module captures interactions across different subgraphs. Extensive empirical evaluations on four real-world datasets demonstrate the superiority of H2SGNN compared to state-of-the-art methods.
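The dual-module design described in the abstract can be pictured with a short PyTorch sketch. This is a minimal illustration only: the class names (`PolyFilter`, `H2SGNNSketch`), the monomial polynomial basis, the mean fusion of local branches, and the softmax-weighted mixed adjacency for the global branch are assumptions for exposition, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class PolyFilter(nn.Module):
    """K-order polynomial filter: sum_k theta_k * A_hat^k (X W); monomial basis assumed."""

    def __init__(self, in_dim, out_dim, K=3):
        super().__init__()
        self.K = K
        self.theta = nn.Parameter(torch.ones(K + 1) / (K + 1))  # learnable filter coefficients
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, A_hat, X):
        H = self.lin(X)
        out = self.theta[0] * H
        Z = H
        for k in range(1, self.K + 1):
            Z = A_hat @ Z                      # one more propagation hop on this subgraph
            out = out + self.theta[k] * Z
        return out


class H2SGNNSketch(nn.Module):
    """Hypothetical reading of the two modules named in the abstract:
    (1) local independent filtering: one polynomial filter per relation subgraph;
    (2) global hybrid filtering: one shared filter on a learned mixture of all subgraphs."""

    def __init__(self, in_dim, hid_dim, num_classes, num_relations, K=3):
        super().__init__()
        self.local_filters = nn.ModuleList(
            [PolyFilter(in_dim, hid_dim, K) for _ in range(num_relations)]
        )
        self.global_filter = PolyFilter(in_dim, hid_dim, K)
        self.mix = nn.Parameter(torch.ones(num_relations))  # relation mixing weights
        self.classifier = nn.Linear(hid_dim, num_classes)

    def forward(self, A_hats, X):
        # A_hats: list of normalized adjacency matrices, one per relation subgraph
        local = torch.stack(
            [f(A, X) for f, A in zip(self.local_filters, A_hats)]
        ).mean(dim=0)
        w = torch.softmax(self.mix, dim=0)
        A_mixed = sum(w_i * A for w_i, A in zip(w, A_hats))  # hybrid graph across relations
        glob = self.global_filter(A_mixed, X)
        return self.classifier(local + glob)
```

With dense tensors, `H2SGNNSketch(in_dim, 64, num_classes, num_relations)(A_hats, X)` returns per-node logits; a faithful implementation would use sparse adjacencies and the paper's chosen polynomial basis and fusion scheme.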
Related papers
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle holds that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets on which GNNs' performance is not satisfactory compared to graph-agnostic NNs.
arXiv Detail & Related papers (2024-07-12T18:04:32Z)
- Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach [58.8524608686851]
Graph Neural Networks (GNNs) have proven to be highly effective for node classification tasks across diverse graph structural patterns.
Traditionally, GNNs employ a uniform global filter, typically a low-pass filter for homophilic graphs and a high-pass filter for heterophilic graphs (a minimal sketch of both filter types follows this list).
We introduce a novel GNN framework Node-MoE that utilizes a mixture of experts to adaptively select the appropriate filters for different nodes.
arXiv Detail & Related papers (2024-06-05T17:12:38Z)
- How Universal Polynomial Bases Enhance Spectral Graph Neural Networks: Heterophily, Over-smoothing, and Over-squashing [24.857304431611464]
Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophilic graphs.
In an attempt to avert prohibitive computations, numerous filters have been proposed.
We demystify the correlation between the spectral property of desired vectors and the heterophily degrees.
We develop a novel adaptive heterophily basis whose basis vectors mutually form angles that reflect the heterophily degree of the graph.
arXiv Detail & Related papers (2024-05-21T03:28:45Z)
- Hetero$^2$Net: Heterophily-aware Representation Learning on Heterogeneous Graphs [38.858702539146385]
We present Hetero$^2$Net, a heterophily-aware HGNN that incorporates both masked metapath prediction and masked label prediction tasks.
We evaluate the performance of Hetero$2$Net on five real-world heterogeneous graph benchmarks with varying levels of heterophily.
arXiv Detail & Related papers (2023-10-18T02:19:12Z)
- GPatcher: A Simple and Adaptive MLP Model for Alleviating Graph Heterophily [15.93465948768545]
We demystify the impact of graph heterophily on graph neural network (GNN) filters.
We propose a simple yet powerful GNN named GPatcher by leveraging the patch-Mixer architectures.
Our model demonstrates outstanding performance on node classification compared with popular homophily GNNs and state-of-the-art heterophily GNNs.
arXiv Detail & Related papers (2023-06-25T20:57:35Z)
- Graph Contrastive Learning under Heterophily via Graph Filters [51.46061703680498]
Graph contrastive learning (CL) methods learn node representations in a self-supervised manner by maximizing the similarity between the augmented node representations obtained via a GNN-based encoder.
In this work, we propose an effective graph CL method, namely HLCL, for learning graph representations under heterophily.
Our extensive experiments show that HLCL outperforms state-of-the-art graph CL methods on benchmark datasets with heterophily, as well as large-scale real-world graphs, by up to 7%, and outperforms graph supervised learning methods on datasets with heterophily by up to 10%.
arXiv Detail & Related papers (2023-03-11T08:32:39Z) - Break the Wall Between Homophily and Heterophily for Graph
Representation Learning [25.445073413243925]
Homophily and heterophily are intrinsic properties of graphs that describe whether two linked nodes share similar properties.
This work identifies three graph features, including the ego node feature, the aggregated node feature, and the graph structure feature, that are essential for graph representation learning.
It proposes a new GNN model called OGNN that extracts all three graph features and adaptively fuses them to achieve generalizability across the whole spectrum of homophily.
arXiv Detail & Related papers (2022-10-08T19:37:03Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Heterogeneous Graph Neural Networks using Self-supervised Reciprocally
Contrastive Learning [102.9138736545956]
Heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop for the first time a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views on respective guidance of node attributes and graph topologies.
In this new approach, we adopt distinct but most suitable attribute and topology fusion mechanisms in the two views, which are conducive to mining relevant information in attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model, which is capable of handling any graphs without being restricted by their underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z)
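Several summaries above (e.g., Node-MoE and Beyond Low-Pass Filters) invoke the contrast between low-pass and high-pass spectral filters. The sketch below is a minimal, generic illustration using first-order filters on the symmetrically normalized Laplacian; the responses g(L) = I - L/2 (low-pass, smooths neighboring features, suits homophily) and g(L) = L/2 (high-pass, emphasizes differences between neighbors, suits heterophily) are textbook choices and are not taken from any of the listed papers.

```python
import torch


def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2} for a dense adjacency matrix A."""
    deg = A.sum(dim=1).clamp(min=1.0)
    d_inv_sqrt = deg.pow(-0.5)
    return torch.eye(A.size(0)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]


def low_pass(A, X):
    """g(L) = I - L/2: averages each node with its neighbors (homophily-friendly)."""
    return X - 0.5 * (normalized_laplacian(A) @ X)


def high_pass(A, X):
    """g(L) = L/2: keeps the difference between a node and its neighbors (heterophily-friendly)."""
    return 0.5 * (normalized_laplacian(A) @ X)
```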