Make Heterophily Graphs Better Fit GNN: A Graph Rewiring Approach
- URL: http://arxiv.org/abs/2209.08264v1
- Date: Sat, 17 Sep 2022 06:55:21 GMT
- Title: Make Heterophily Graphs Better Fit GNN: A Graph Rewiring Approach
- Authors: Wendong Bi, Lun Du, Qiang Fu, Yanlin Wang, Shi Han, Dongmei Zhang
- Abstract summary: We propose a method named Deep Heterophily Graph Rewiring (DHGR) to rewire graphs by adding homophilic edges and pruning heterophilic edges.
To the best of our knowledge, it is the first work studying graph rewiring for heterophily graphs.
- Score: 43.41163711340362
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) are popular machine learning methods for
modeling graph data. Many GNNs perform well on homophily graphs but give
unsatisfactory performance on heterophily graphs. Recently, some researchers have
turned their attention to designing GNNs for heterophily graphs by adjusting the
message passing mechanism or enlarging its receptive field. Different from
existing works that mitigate the issues of heterophily from the model design
perspective, we study heterophily graphs from an orthogonal perspective: rewiring
the graph structure to reduce heterophily so that traditional GNNs perform
better. Through comprehensive empirical studies and analysis, we verify the
potential of rewiring methods. To fully exploit this potential, we propose a
method named Deep Heterophily Graph Rewiring (DHGR), which rewires graphs by
adding homophilic edges and pruning heterophilic edges. How to rewire is
determined by comparing the similarity of the label/feature distributions of
node neighbors. In addition, we design a scalable implementation of DHGR to
guarantee high efficiency. DHGR can easily be used as a plug-in module, i.e., a
graph pre-processing step, for any GNN, including GNNs designed for homophily
and for heterophily, to boost performance on the node classification task. To
the best of our knowledge, this is the first work studying graph rewiring for
heterophily graphs. Extensive experiments on 11 public graph datasets
demonstrate the superiority of our proposed method.
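The abstract describes the rewiring rule only at a high level, so the following is a minimal, illustrative NumPy sketch of that idea rather than the authors' DHGR implementation: the helper names (`neighbor_label_distribution`, `rewire_by_neighbor_similarity`), the cosine-similarity measure, and the `k_add`/`prune_thresh` heuristics are assumptions made for illustration.

```python
import numpy as np

def neighbor_label_distribution(adj, labels, num_classes):
    """For each node, build the class histogram of its labeled neighbors
    (labels < 0 mark unlabeled nodes) and normalize it to a distribution."""
    n = adj.shape[0]
    dist = np.zeros((n, num_classes))
    for u in range(n):
        for v in np.nonzero(adj[u])[0]:
            if labels[v] >= 0:
                dist[u, labels[v]] += 1.0
    row_sum = dist.sum(axis=1, keepdims=True)
    return np.divide(dist, row_sum, out=np.zeros_like(dist), where=row_sum > 0)

def rewire_by_neighbor_similarity(adj, labels, num_classes, k_add=2, prune_thresh=0.2):
    """Toy rewiring: prune existing edges whose endpoints have dissimilar
    neighbor label distributions (likely heterophilic) and connect each node
    to its k_add most similar non-neighbors (likely homophilic)."""
    n = adj.shape[0]
    dist = neighbor_label_distribution(adj, labels, num_classes)
    normed = dist / (np.linalg.norm(dist, axis=1, keepdims=True) + 1e-12)
    sim = normed @ normed.T                      # pairwise cosine similarity
    new_adj = adj.astype(float).copy()

    # Prune likely-heterophilic edges.
    for u, v in zip(*np.nonzero(np.triu(adj, k=1))):
        if sim[u, v] < prune_thresh:
            new_adj[u, v] = new_adj[v, u] = 0.0

    # Add likely-homophilic edges.
    for u in range(n):
        added = 0
        for v in np.argsort(-sim[u]):            # most similar first
            if v != u and adj[u, v] == 0:
                new_adj[u, v] = new_adj[v, u] = 1.0
                added += 1
                if added == k_add:
                    break
    return new_adj

# Example: a small 4-node graph with one cross-class edge.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 1],
                [0, 1, 0, 0],
                [1, 1, 0, 0]], dtype=float)
labels = np.array([0, 0, 0, 1])                  # -1 would mark unlabeled nodes
rewired = rewire_by_neighbor_similarity(adj, labels, num_classes=2, k_add=1)
```

Under this sketch, the rewired adjacency simply replaces the original graph as input to any off-the-shelf GNN, mirroring the plug-in, pre-processing usage described in the abstract; the actual DHGR additionally uses feature distributions and a scalable implementation not shown here.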
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z) - GRAPES: Learning to Sample Graphs for Scalable Graph Neural Networks [2.4175455407547015]
Graph neural networks learn to represent nodes by aggregating information from their neighbors.
Several existing methods address this by sampling a small subset of nodes, scaling GNNs to much larger graphs.
We introduce GRAPES, an adaptive sampling method that learns to identify the set of nodes crucial for training a GNN.
arXiv Detail & Related papers (2023-10-05T09:08:47Z) - GCNH: A Simple Method For Representation Learning On Heterophilous Graphs [4.051099980410583]
Graph Neural Networks (GNNs) are well-suited for learning on homophilous graphs.
Recent works have proposed extensions to standard GNN architectures to improve performance on heterophilous graphs.
We propose GCN for Heterophily (GCNH), a simple yet effective GNN architecture applicable to both heterophilous and homophilous scenarios.
arXiv Detail & Related papers (2023-04-21T11:26:24Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z) - Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than
arXiv Detail & Related papers (2021-06-11T02:44:00Z) - Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model, which is capable of handling any graphs without being restricted by their underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z) - Meta-path Free Semi-supervised Learning for Heterogeneous Networks [16.641434334366227]
Graph neural networks (GNNs) have been widely used in representation learning on graphs and achieved superior performance in tasks such as node classification.
In this paper, we propose simple and effective graph neural networks for heterogeneous graphs that do not rely on meta-paths.
arXiv Detail & Related papers (2020-10-18T06:01:58Z)