Is Homophily a Necessity for Graph Neural Networks?
- URL: http://arxiv.org/abs/2106.06134v4
- Date: Fri, 21 Jul 2023 05:02:21 GMT
- Title: Is Homophily a Necessity for Graph Neural Networks?
- Authors: Yao Ma, Xiaorui Liu, Neil Shah, Jiliang Tang
- Abstract summary: Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed methods on some commonly used heterophilous graphs.
- Score: 50.959340355849896
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) have shown great prowess in learning
representations suitable for numerous graph-based machine learning tasks. When
applied to semi-supervised node classification, GNNs are widely believed to
work well due to the homophily assumption ("like attracts like"), and fail to
generalize to heterophilous graphs where dissimilar nodes connect. Recent works
design new architectures to overcome such heterophily-related limitations,
citing poor baseline performance and new architecture improvements on a few
heterophilous graph benchmark datasets as evidence for this notion. In our
experiments, we empirically find that standard graph convolutional networks
(GCNs) can actually achieve better performance than such carefully designed
methods on some commonly used heterophilous graphs. This motivates us to
reconsider whether homophily is truly necessary for good GNN performance. We
find that this claim is not quite true, and in fact, GCNs can achieve strong
performance on heterophilous graphs under certain conditions. Our work
carefully characterizes these conditions, and provides supporting theoretical
understanding and empirical observations. Finally, we examine existing
heterophilous graph benchmarks and reconcile how the GCN (under)performs on
them based on this understanding.
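The quantity at the heart of this debate is how homophilous a graph actually is. A common way to measure it is the edge homophily ratio: the fraction of edges whose endpoints share a label. Below is a minimal, self-contained sketch of that metric (the function name and toy graph are illustrative and not taken from the paper):

```python
import numpy as np

def edge_homophily(edge_index: np.ndarray, labels: np.ndarray) -> float:
    """Fraction of edges whose endpoints share the same label.

    edge_index: array of shape (2, num_edges) with (source, target) node indices.
    labels:     array of shape (num_nodes,) with integer class labels.
    """
    src, dst = edge_index
    return float((labels[src] == labels[dst]).mean())

# Toy example: a 4-node graph with two classes.
edge_index = np.array([[0, 1, 2, 3],
                       [1, 0, 3, 2]])
labels = np.array([0, 0, 1, 0])
print(edge_homophily(edge_index, labels))  # 0.5 -> neither strongly homophilous nor heterophilous
```

A ratio near 1 indicates a strongly homophilous graph and a ratio near 0 a heterophilous one; the paper's point is that a low ratio alone does not determine whether a GCN will perform poorly.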
Related papers
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
Homophily principle: nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where GNNs' performance relative to NNs' is not satisfactory.
arXiv Detail & Related papers (2024-07-12T18:04:32Z) - Breaking the Entanglement of Homophily and Heterophily in Semi-supervised Node Classification [25.831508778029097]
We introduce AMUD, which quantifies the relationship between node profiles and topology from a statistical perspective.
We also propose ADPA as a new directed graph learning paradigm for AMUD.
arXiv Detail & Related papers (2023-12-07T07:54:11Z) - Demystifying Structural Disparity in Graph Neural Networks: Can One Size Fit All? [61.35457647107439]
Most real-world homophilic and heterophilic graphs are comprised of a mixture of nodes in both homophilic and heterophilic structural patterns.
We provide evidence that Graph Neural Networks(GNNs) on node classification typically perform admirably on homophilic nodes.
We then propose a rigorous, non-i.i.d. PAC-Bayesian generalization bound for GNNs, revealing reasons for the performance disparity.
arXiv Detail & Related papers (2023-06-02T07:46:20Z) - GCNH: A Simple Method For Representation Learning On Heterophilous Graphs [4.051099980410583]
Graph Neural Networks (GNNs) are well-suited for learning on homophilous graphs.
Recent works have proposed extensions to standard GNN architectures to improve performance on heterophilous graphs.
We propose GCN for Heterophily (GCNH), a simple yet effective GNN architecture applicable to both heterophilous and homophilous scenarios.
arXiv Detail & Related papers (2023-04-21T11:26:24Z) - A critical look at the evaluation of GNNs under heterophily: Are we really making progress? [39.62589602648429]
It is often believed that standard Graph Neural Networks (GNNs) only work well for homophilous graphs.
We show that standard datasets used for evaluating heterophily-specific models have serious drawbacks.
We propose a set of heterophilous graphs of varying properties that we believe can serve as a better benchmark for evaluating the performance of GNNs under heterophily.
arXiv Detail & Related papers (2023-02-22T20:32:59Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z) - Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model that is capable of handling any graph without being restricted by its underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z) - Graph Neural Networks with Heterophily [40.23690407583509]
We propose a novel framework called CPGNN that generalizes GNNs for graphs with either homophily or heterophily.
We show that replacing the compatibility matrix in our framework with the identity matrix (which represents pure homophily) reduces our framework to GCN.
arXiv Detail & Related papers (2020-09-28T18:29:36Z)
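To make the compatibility-matrix idea from the CPGNN entry above concrete, here is a minimal, hypothetical sketch in plain NumPy (not the authors' implementation): one propagation step averages neighbors' class beliefs and then mixes them across classes through a compatibility matrix H, and setting H to the identity recovers plain neighborhood averaging in the spirit of a GCN layer.

```python
import numpy as np

def propagate(adj_norm: np.ndarray, beliefs: np.ndarray, H: np.ndarray) -> np.ndarray:
    """One propagation step: average neighbor class beliefs, then mix them
    across classes through the compatibility matrix H, where H[i, j] is the
    affinity between classes i and j."""
    return adj_norm @ beliefs @ H

num_nodes, num_classes = 4, 2
adj_norm = np.full((num_nodes, num_nodes), 1.0 / num_nodes)  # toy normalized adjacency
beliefs = np.eye(num_classes)[[0, 0, 1, 1]]                  # one-hot class beliefs

H_homo = np.eye(num_classes)              # identity: pure homophily, GCN-style averaging
H_hetero = np.array([[0.1, 0.9],          # off-diagonal mass: "opposites attract"
                     [0.9, 0.1]])

print(propagate(adj_norm, beliefs, H_homo))    # plain neighborhood averaging
print(propagate(adj_norm, beliefs, H_hetero))  # belief mass routed toward the compatible class
```

The design choice this illustrates: a single compatibility matrix lets the same propagation rule express homophilous, heterophilous, or mixed label structure, rather than hard-coding the homophily assumption.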
This list is automatically generated from the titles and abstracts of the papers on this site.