Neighborhood Homophily-based Graph Convolutional Network
- URL: http://arxiv.org/abs/2301.09851v3
- Date: Fri, 6 Oct 2023 13:45:58 GMT
- Title: Neighborhood Homophily-based Graph Convolutional Network
- Authors: Shengbo Gong, Jiajun Zhou, Chenxuan Xie, Qi Xuan
- Abstract summary: Graph neural networks (GNNs) have proven powerful in graph-oriented tasks.
Many real-world graphs are heterophilous, challenging the homophily assumption of classical GNNs.
Recent studies propose new metrics to characterize homophily, but rarely consider the correlation between the proposed metrics and models.
In this paper, we first design a new metric, Neighborhood Homophily (NH), to measure the label complexity or purity in node neighborhoods.
- Score: 4.511171093050241
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have proven powerful in graph-oriented tasks.
However, many real-world graphs are heterophilous, challenging the homophily assumption
of classical GNNs. To address this universality problem, many studies deepen networks or
concatenate intermediate representations, strategies that do not inherently change
neighbor aggregation and that introduce noise. Recent studies propose new metrics to
characterize homophily, but rarely consider the correlation between the proposed metrics
and models. In this paper, we first design a new metric, Neighborhood Homophily (NH), to
measure the label complexity or purity in node neighborhoods. Furthermore, we incorporate
the metric into the classical graph convolutional network (GCN) architecture and propose
the Neighborhood Homophily-based Graph Convolutional Network (NHGCN). In this framework,
neighbors are grouped by estimated NH values and aggregated through different channels,
and the resulting node predictions are then used in turn to estimate and update NH values.
The two processes of metric estimation and model inference are alternately optimized to
achieve better node classification. NHGCN achieves top overall performance on both
homophilous and heterophilous benchmarks, with an improvement of up to 7.4% over current
state-of-the-art (SOTA) methods.
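As a rough illustration of the alternating scheme the abstract describes, the sketch below assumes a simple label-purity reading of NH, a two-channel GCN-style layer, and a fixed grouping threshold; all names, the NH formula, and the training loop are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: NH estimated as neighborhood label purity, neighbors
# split into high-/low-NH channels, and NH re-estimated from model predictions.
# The published NHGCN defines these pieces more carefully; everything below is
# an assumption for illustration.
import torch
import torch.nn.functional as F


def estimate_nh(adj: torch.Tensor, labels: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Label purity of each node's neighborhood: share of the dominant class
    among its neighbors (one plausible reading of 'NH')."""
    onehot = F.one_hot(labels, num_classes).float()        # [N, C]
    counts = adj @ onehot                                   # neighbor label counts, [N, C]
    totals = counts.sum(dim=1).clamp(min=1.0)
    return counts.max(dim=1).values / totals                # values in [0, 1]


class TwoChannelGCN(torch.nn.Module):
    """GCN-style layer that aggregates high-NH and low-NH neighbors through
    separate weight matrices (the 'channels' mentioned in the abstract)."""

    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.w_high = torch.nn.Linear(in_dim, hid_dim)
        self.w_low = torch.nn.Linear(in_dim, hid_dim)
        self.out = torch.nn.Linear(2 * hid_dim, num_classes)

    def forward(self, x, adj, nh, threshold=0.5):
        high = (nh >= threshold).float()                    # 1 if a neighbor is high-NH
        adj_high = adj * high.unsqueeze(0)                  # keep edges into high-NH neighbors
        adj_low = adj * (1.0 - high).unsqueeze(0)
        h = torch.cat([F.relu(self.w_high(adj_high @ x)),
                       F.relu(self.w_low(adj_low @ x))], dim=1)
        return self.out(h)


def alternate(model, x, adj, y, train_mask, num_classes, rounds=5, epochs=50):
    """Alternately (1) train the model given current NH estimates and
    (2) re-estimate NH from the model's predicted labels."""
    opt = torch.optim.Adam(model.parameters(), lr=0.01)
    pred_labels = y.clone()                                 # bootstrap NH from known labels
    for _ in range(rounds):
        nh = estimate_nh(adj, pred_labels, num_classes)
        for _ in range(epochs):
            opt.zero_grad()
            logits = model(x, adj, nh)
            F.cross_entropy(logits[train_mask], y[train_mask]).backward()
            opt.step()
        with torch.no_grad():
            pred_labels = model(x, adj, nh).argmax(dim=1)
            pred_labels[train_mask] = y[train_mask]         # keep ground truth where known
    return model
```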
Related papers
- Chasing Fairness in Graphs: A GNN Architecture Perspective [73.43111851492593]
We propose Fair Message Passing (FMP), designed within a unified optimization framework for graph neural networks (GNNs).
In FMP, aggregation is first adopted to utilize neighbors' information, and then the bias mitigation step explicitly pushes demographic group node representation centers together.
Experiments on node classification tasks demonstrate that the proposed FMP outperforms several baselines in terms of fairness and accuracy on three real-world datasets.
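As a hedged illustration of "pushing demographic group representation centers together", the snippet below adds a simple penalty on the distance between per-group embedding centers after aggregation; FMP derives its bias-mitigation step within a unified optimization framework, which this plain regularizer does not reproduce.

```python
# Illustrative only: a fairness regularizer that pulls per-group centers of the
# aggregated node embeddings together. Not the actual FMP update rule.
import torch


def group_center_penalty(h: torch.Tensor, group: torch.Tensor) -> torch.Tensor:
    """Mean squared distance between the per-group centers of node embeddings h."""
    centers = [h[group == g].mean(dim=0) for g in group.unique()]
    penalty = h.new_zeros(())
    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            penalty = penalty + (centers[i] - centers[j]).pow(2).sum()
    return penalty / max(1, len(centers) * (len(centers) - 1) // 2)


# Usage sketch: total_loss = task_loss + lambda_fair * group_center_penalty(h, sensitive_attr)
```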
arXiv Detail & Related papers (2023-12-19T18:00:15Z)
- 2-hop Neighbor Class Similarity (2NCS): A graph structural metric indicative of graph neural network performance [4.051099980410583]
Graph Neural Networks (GNNs) achieve state-of-the-art performance on graph-structured data across numerous domains.
On heterophilous graphs, in which nodes of different classes are likely to be connected, GNNs perform less consistently.
We introduce 2-hop Neighbor Class Similarity (2NCS), a new quantitative graph structural property that correlates with GNN performance more strongly and consistently than alternative metrics.
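The summary does not spell out the exact 2NCS formula, so the sketch below only computes a simple proxy in the same spirit: the average share of a node's two-hop neighbors that carry its own class. This is an assumption for illustration, not necessarily the paper's definition.

```python
# A proxy for a 2-hop class-similarity statistic; the actual 2NCS definition may differ.
import numpy as np


def two_hop_class_similarity(adj: np.ndarray, labels: np.ndarray) -> float:
    two_hop = (adj @ adj > 0).astype(float)            # nodes reachable in two steps
    np.fill_diagonal(two_hop, 0.0)                     # exclude the node itself
    same = (labels[:, None] == labels[None, :]).astype(float)
    per_node = (two_hop * same).sum(1) / np.maximum(two_hop.sum(1), 1.0)
    return float(per_node.mean())
```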
arXiv Detail & Related papers (2022-12-26T16:16:51Z)
- Revisiting Heterophily For Graph Neural Networks [42.41238892727136]
Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by using graph structures based on the relational inductive bias (the homophily assumption).
Recent work has identified a non-trivial set of datasets where GNNs' performance compared to NNs is not satisfactory.
arXiv Detail & Related papers (2022-10-14T08:00:26Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
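A minimal sketch of the "one parameter per relation" idea follows: each edge's message is scaled by a learnable scalar indexed by its relation type, plus a learnable self-loop weight. The layer shape and names are assumptions; the published RE-GNN layer may differ in detail.

```python
# Sketch: GCN-like layer with one learnable importance scalar per relation type.
import torch


class RelationEmbeddingLayer(torch.nn.Module):
    def __init__(self, in_dim, out_dim, num_relations):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, out_dim)
        self.rel_weight = torch.nn.Parameter(torch.ones(num_relations))  # one scalar per relation
        self.self_weight = torch.nn.Parameter(torch.ones(1))             # self-loop importance

    def forward(self, x, edge_index, edge_type):
        # edge_index: [2, E] (source, target); edge_type: [E] relation ids
        src, dst = edge_index
        msg = self.lin(x)[src] * self.rel_weight[edge_type].unsqueeze(1)
        out = self.self_weight * self.lin(x)
        out = out.index_add(0, dst, msg)      # sum relation-scaled messages per target node
        return torch.relu(out)
```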
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- RAW-GNN: RAndom Walk Aggregation based Graph Neural Network [48.139599737263445]
We introduce a novel aggregation mechanism and develop a RAndom Walk Aggregation-based Graph Neural Network (called RAW-GNN) method.
The new method utilizes breadth-first random walk search to capture homophily information and depth-first search to collect heterophily information.
It replaces the conventional neighborhoods with path-based neighborhoods and introduces a new path-based aggregator based on Recurrent Neural Networks.
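As a hedged illustration: one way to realize breadth-first-like versus depth-first-like walks is a simple bias toward (or away from) nodes adjacent to the previous step, with each sampled path then encoded by a GRU. The sampler, bias rule, and encoder below are assumptions, not the actual RAW-GNN components.

```python
# Illustrative path sampler and RNN path encoder; not the published RAW-GNN code.
import random
import torch


def biased_walk(neighbors, start, length, stay_close):
    """stay_close=True prefers nodes adjacent to the previous node (BFS-like);
    False prefers nodes not adjacent to it (DFS-like)."""
    walk, prev = [start], None
    for _ in range(length - 1):
        cur = walk[-1]
        cand = neighbors.get(cur, [])
        if not cand:
            walk.append(cur)                      # pad on dead ends to keep a fixed length
            continue
        if prev is not None:
            near = [n for n in cand if n in neighbors.get(prev, [])]
            far = [n for n in cand if n not in neighbors.get(prev, [])]
            pool = (near if stay_close else far) or cand
        else:
            pool = cand
        prev = cur
        walk.append(random.choice(pool))
    return walk


class PathChannel(torch.nn.Module):
    """Encodes the node-feature sequence along each sampled path with a GRU."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.rnn = torch.nn.GRU(in_dim, hid_dim, batch_first=True)

    def forward(self, x, walks):
        seqs = torch.stack([x[torch.tensor(w)] for w in walks])  # [num_walks, L, in_dim]
        _, h = self.rnn(seqs)
        return h.squeeze(0)                                       # one vector per path
```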
arXiv Detail & Related papers (2022-06-28T12:19:01Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
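The summary does not give the metric's exact form; the sketch below only computes the standard von Neumann entropy of a graph (the Shannon entropy of the eigenvalues of the trace-normalized Laplacian), which is the quantity the proposed metric builds on. How the paper applies it to re-examine heterophily is not reproduced here.

```python
# Standard von Neumann graph entropy; the paper's heterophily metric builds on
# this notion but its exact construction is not shown here.
import numpy as np


def von_neumann_entropy(adj: np.ndarray) -> float:
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                                   # combinatorial Laplacian
    rho = lap / np.trace(lap)                         # density matrix with unit trace
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]                # convention: 0 * log 0 = 0
    return float(-(eigvals * np.log(eigvals)).sum())
```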
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed methods on some commonly used heterophilous graphs.
arXiv Detail & Related papers (2021-06-11T02:44:00Z)
- On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC) metric, a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Networks (GNNs) are powerful models for learning representations and making predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
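As a hedged sketch of "pairwise interactions of neighbor representations": average the element-wise products over all pairs of transformed neighbor embeddings (via the (sum² − sum of squares)/2 identity) and add that to the usual mean aggregation. Normalization and the exact combination in the published BGNN operator may differ.

```python
# Illustrative bilinear neighbor-interaction aggregator; not the exact BGNN operator.
import torch


class BilinearAggregation(torch.nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        h = self.lin(x)                                    # transform all nodes, [N, D]
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        linear = adj @ h / deg                             # standard mean aggregation
        pair_sum = ((adj @ h) ** 2 - adj @ h ** 2) / 2.0   # sum over neighbor pairs of h_i * h_j
        num_pairs = (deg * (deg - 1) / 2.0).clamp(min=1.0)
        bilinear = pair_sum / num_pairs
        return torch.relu(linear + bilinear)
```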
arXiv Detail & Related papers (2020-02-10T06:43:38Z)