Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with
Heterophily
- URL: http://arxiv.org/abs/2203.11200v3
- Date: Sun, 16 Apr 2023 04:21:34 GMT
- Title: Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with
Heterophily
- Authors: Jie Chen, Shouzhen Chen, Junbin Gao, Zengfeng Huang, Junping Zhang and
Jian Pu
- Abstract summary: We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
- Score: 58.76759997223951
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Due to the homophily assumption in graph neural networks (GNNs), a
common consensus in the graph node classification task is that GNNs perform
well on homophilic graphs but may fail on heterophilic graphs with many
inter-class edges. However, this inter-class-edge perspective and the related
homophily-ratio metrics cannot fully explain GNN performance on some
heterophilic datasets, which implies that not all inter-class edges are
harmful to GNNs. In this work, we propose a new metric based on von Neumann
entropy to re-examine the heterophily problem of GNNs and investigate the
feature aggregation of inter-class edges from an entire neighbor identifiable
perspective. Moreover, we propose a simple yet effective Conv-Agnostic GNN
framework (CAGNNs) to enhance the performance of most GNNs on heterophily
datasets by learning the neighbor effect for each node. Specifically, we first
decouple the feature of each node into the discriminative feature for
downstream tasks and the aggregation feature for graph convolution. Then, we
propose a shared mixer module to adaptively evaluate the neighbor effect of
each node to incorporate the neighbor information. The proposed framework can
be regarded as a plug-in component and is compatible with most GNNs. The
experimental results over nine well-known benchmark datasets indicate that our
framework can significantly improve performance, especially for the heterophily
graphs. The average performance gain is 9.81%, 25.81%, and 20.61% compared with
GIN, GAT, and GCN, respectively. Extensive ablation studies and robustness
analysis further verify the effectiveness, robustness, and interpretability of
our framework. Code is available at https://github.com/JC-202/CAGNN.
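The "homo-ratio" metrics the abstract argues against are typically variants of the edge homophily ratio: the fraction of edges joining same-class nodes. A minimal sketch in plain Python (illustrative names only; the paper's von Neumann entropy metric is not reproduced here):

```python
def edge_homophily(edges, labels):
    """Edge homophily ratio: fraction of edges whose endpoints
    share a class label. A graph is commonly called heterophilic
    when this ratio is low.

    edges  -- iterable of (u, v) node pairs
    labels -- mapping from node id to class label
    """
    edges = list(edges)
    same_class = sum(1 for u, v in edges if labels[u] == labels[v])
    return same_class / len(edges)

# Toy graph: two intra-class edges and one inter-class edge.
labels = {0: "a", 1: "a", 2: "b", 3: "b"}
edges = [(0, 1), (2, 3), (1, 2)]
print(edge_homophily(edges, labels))  # 2/3: mildly homophilic
```

The paper's point is that this single scalar can mislead: two graphs with the same low ratio can differ greatly in how identifiable each node's class remains after aggregation.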
Related papers
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophily Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in-depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z)
- Cluster-based Graph Collaborative Filtering [55.929052969825825]
Graph Convolution Networks (GCNs) have succeeded in learning user and item representations for recommendation systems.
Most existing GCN-based methods overlook the multiple interests of users while performing high-order graph convolution.
We propose a novel GCN-based recommendation model, termed Cluster-based Graph Collaborative Filtering (ClusterGCF)
arXiv Detail & Related papers (2024-04-16T07:05:16Z)
- ES-GNN: Generalizing Graph Neural Networks Beyond Homophily with Edge Splitting [32.69196871253339]
We propose a novel Edge Splitting GNN (ES-GNN) framework to adaptively distinguish between graph edges either relevant or irrelevant to learning tasks.
We show that our ES-GNN can be regarded as a solution to a disentangled graph denoising problem.
arXiv Detail & Related papers (2022-05-27T01:29:03Z)
- Learning heterophilious edge to drop: A general framework for boosting graph neural networks [19.004710957882402]
This work aims at mitigating the negative impacts of heterophily by optimizing graph structure for the first time.
We propose a structure learning method called LHE to identify heterophilious edges to drop.
Experiments demonstrate the remarkable performance improvement of GNNs with LHE on multiple datasets across the full spectrum of homophily levels.
arXiv Detail & Related papers (2022-05-23T14:07:29Z)
- Is Heterophily A Real Nightmare For Graph Neural Networks To Do Node Classification? [44.71818395535755]
Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by using graph structure based on an inductive bias (the homophily assumption).
However, the performance advantages of GNNs over graph-agnostic NNs are not always observed in practice.
Heterophily has been considered as a main cause and numerous works have been put forward to address it.
arXiv Detail & Related papers (2021-09-12T23:57:05Z)
- Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such specialized architectures on some heterophilous graphs.
arXiv Detail & Related papers (2021-06-11T02:44:00Z)
- On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC) metric, a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
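The BGNN entry above augments the usual weighted-sum aggregation with pairwise interactions between neighbor representations. A simplified sketch of just that bilinear term (elementwise products averaged over unordered neighbor pairs; function names are illustrative, and the learned weights and linear term of the full operator are omitted):

```python
def bilinear_aggregate(neighbor_feats):
    """Average the elementwise product over all unordered pairs of
    neighbor feature vectors -- the interaction term that the
    bilinear aggregator adds on top of the weighted sum.
    """
    if len(neighbor_feats) < 2:  # no pairs: interaction term vanishes
        return [0.0] * (len(neighbor_feats[0]) if neighbor_feats else 0)
    pairs = [(a, b)
             for i, a in enumerate(neighbor_feats)
             for b in neighbor_feats[i + 1:]]
    dim = len(neighbor_feats[0])
    out = [0.0] * dim
    for a, b in pairs:
        for k in range(dim):
            out[k] += a[k] * b[k]
    return [x / len(pairs) for x in out]

# Three 2-d neighbor features -> three pairwise products, averaged.
print(bilinear_aggregate([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]))
```

The elementwise product makes the term symmetric in each neighbor pair, which is why it captures interactions that any weighted sum of individual neighbors misses.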
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.