Graph Neural Networks for Graphs with Heterophily: A Survey
- URL: http://arxiv.org/abs/2202.07082v3
- Date: Sun, 25 Feb 2024 01:26:36 GMT
- Title: Graph Neural Networks for Graphs with Heterophily: A Survey
- Authors: Xin Zheng, Yi Wang, Yixin Liu, Ming Li, Miao Zhang, Di Jin, Philip S.
Yu, Shirui Pan
- Abstract summary: We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
- Score: 98.45621222357397
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent years have witnessed fast developments of graph neural networks (GNNs)
that have benefited myriads of graph analytic tasks and applications. In
general, most GNNs depend on the homophily assumption that nodes belonging to
the same class are more likely to be connected. However, heterophily, i.e., the
tendency of nodes with different labels to be linked, is a ubiquitous property of
many real-world graphs and significantly limits the performance of
tailor-made homophilic GNNs. Hence, GNNs for heterophilic graphs are gaining
increasing research attention to enhance graph learning with heterophily. In
this paper, we provide a comprehensive review of GNNs for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs
existing heterophilic GNN models, along with a general summary and detailed
analysis. Furthermore, we discuss the correlation between graph heterophily and
various graph research domains, aiming to facilitate the development of more
effective GNNs across a spectrum of practical applications and learning tasks
in the graph research community. In the end, we point out the potential
directions to advance and stimulate more future research and applications on
heterophilic graph learning with GNNs.
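As a concrete illustration of the homophily/heterophily distinction discussed above, the minimal sketch below computes the widely used edge homophily ratio, i.e., the fraction of edges whose endpoints share a label. The function name and the toy edge list/labels are illustrative assumptions, not taken from the survey.

```python
import numpy as np

def edge_homophily(edges, labels):
    """Fraction of edges whose two endpoint nodes share the same label.

    Values close to 1 indicate a homophilic graph; values close to 0
    indicate a heterophilic one.
    """
    edges = np.asarray(edges)        # shape: (num_edges, 2)
    labels = np.asarray(labels)      # shape: (num_nodes,)
    same_label = labels[edges[:, 0]] == labels[edges[:, 1]]
    return float(same_label.mean())

# Toy example (illustrative data only):
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
labels = [0, 0, 1, 1]
print(edge_homophily(edges, labels))  # 0.4 -> leaning heterophilic
```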
Related papers
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle states that nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where the performance of GNNs relative to standard NNs is not satisfactory.
arXiv Detail & Related papers (2024-07-12T18:04:32Z) - A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
Graph Neural Networks (GNNs) combine information from adjacent nodes by successive applications of graph convolutions.
We study the generalization gaps of GNNs on both node-level and graph-level tasks.
We show that the generalization gaps decrease with the number of nodes in the training graphs.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - Breaking the Entanglement of Homophily and Heterophily in
Semi-supervised Node Classification [25.831508778029097]
We introduce AMUD, which quantifies the relationship between node profiles and topology from a statistical perspective.
We also propose ADPA as a new directed graph learning paradigm for AMUD.
arXiv Detail & Related papers (2023-12-07T07:54:11Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that enables homogeneous GNNs to handle heterogeneous graphs adequately.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Beyond Real-world Benchmark Datasets: An Empirical Study of Node
Classification with GNNs [3.547529079746247]
Graph Neural Networks (GNNs) have achieved great success on a node classification task.
Existing evaluations of GNNs lack fine-grained analysis with respect to various graph characteristics.
We conduct extensive experiments with a synthetic graph generator that can generate graphs having controlled characteristics for fine-grained analysis.
arXiv Detail & Related papers (2022-06-18T08:03:12Z) - Discovering the Representation Bottleneck of Graph Neural Networks from
Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - Incorporating Heterophily into Graph Neural Networks for Graph Classification [6.709862924279403]
Graph Neural Networks (GNNs) often assume strong homophily for graph classification, seldom considering heterophily.
We develop a novel GNN architecture called IHGNN (short for Incorporating Heterophily into Graph Neural Networks).
We empirically validate IHGNN on various graph datasets and demonstrate that it outperforms the state-of-the-art GNNs for graph classification.
arXiv Detail & Related papers (2022-03-15T06:48:35Z) - Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed methods on some commonly used heterophilous graphs.
arXiv Detail & Related papers (2021-06-11T02:44:00Z) - Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model that can handle any graph without being restricted by its underlying homophily; a generic low-/high-pass propagation sketch is given after this list.
arXiv Detail & Related papers (2021-03-26T00:35:36Z)
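The last entry above ("Beyond Low-Pass Filters") motivates propagation schemes that are not purely low-pass. The code below is a generic, hedged sketch rather than the authors' model: it mixes a low-pass neighbor-smoothing term with a high-pass difference term via a coefficient alpha. The normalization choice, the value of alpha, and the toy 4-cycle graph are illustrative assumptions.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def propagate(A, X, alpha=0.5):
    """Mix a low-pass (neighbor-smoothing) and a high-pass (difference) filter.

    alpha = 1 recovers plain low-pass propagation as in a vanilla GCN layer;
    smaller alpha keeps more of the high-frequency signal, which tends to be
    informative on heterophilic graphs.
    """
    A_hat = normalized_adjacency(A)
    low_pass = A_hat @ X        # smooths each node's features over its neighbors
    high_pass = X - A_hat @ X   # emphasizes how a node differs from its neighbors
    return alpha * low_pass + (1.0 - alpha) * high_pass

# Toy 4-cycle graph with 2-dimensional node features (illustrative data only).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
print(propagate(A, X, alpha=0.2))
```

Setting alpha close to 1 reproduces the low-pass smoothing that most GNNs rely on, while smaller values preserve the neighborhood differences that heterophily-oriented models exploit.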