Finding Global Homophily in Graph Neural Networks When Meeting
Heterophily
- URL: http://arxiv.org/abs/2205.07308v1
- Date: Sun, 15 May 2022 15:24:26 GMT
- Title: Finding Global Homophily in Graph Neural Networks When Meeting
Heterophily
- Authors: Xiang Li, Renyu Zhu, Yao Cheng, Caihua Shan, Siqiang Luo, Dongsheng
Li, Weining Qian
- Abstract summary: Some existing methods amplify a node's neighborhood with multi-hop neighbors to include more nodes with homophily.
We propose two models GloGNN and GloGNN++, which generate a node's embedding by aggregating information from global nodes in the graph.
We conduct extensive experiments to compare our models against 11 other competitors on 15 benchmark datasets in a wide range of domains, scales and graph heterophilies.
- Score: 27.93157736977356
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate graph neural networks on graphs with heterophily. Some
existing methods amplify a node's neighborhood with multi-hop neighbors to
include more nodes with homophily. However, it is a significant challenge to
set personalized neighborhood sizes for different nodes. Further, homophilous
nodes that fall outside the chosen neighborhood are ignored during information
aggregation. To address these problems, we propose two models,
GloGNN and GloGNN++, which generate a node's embedding by aggregating
information from global nodes in the graph. In each layer, both models learn a
coefficient matrix to capture the correlations between nodes, based on which
neighborhood aggregation is performed. The coefficient matrix allows signed
values and is derived from an optimization problem that has a closed-form
solution. We further accelerate neighborhood aggregation and derive a linear
time complexity. We theoretically explain the models' effectiveness by proving
that both the coefficient matrix and the generated node embedding matrix have
the desired grouping effect. We conduct extensive experiments to compare our
models against 11 other competitors on 15 benchmark datasets in a wide range of
domains, scales and graph heterophilies. Experimental results show that our
methods achieve superior performance and are also very efficient.
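The closed-form coefficient matrix and the linear-time trick can be illustrated with a deliberately simplified surrogate. The ridge-style objective and the `lam` parameter below are assumptions for illustration only; the paper's actual optimization also involves the adjacency matrix and learned parameters.

```python
import numpy as np

def global_aggregate(H, lam=1.0):
    """Aggregate each node's embedding from ALL nodes in the graph.

    Simplified surrogate of the idea: the coefficient matrix Z solves
        min_Z ||Z H - H||_F^2 + lam ||Z||_F^2,
    which has the closed form Z = H H^T (H H^T + lam I)^{-1}.
    Entries of Z may be negative, i.e. signed correlations are allowed.
    """
    n = H.shape[0]
    K = H @ H.T                                   # n x n node correlations
    Z = K @ np.linalg.inv(K + lam * np.eye(n))    # closed-form solution
    return Z @ H                                  # aggregate over all nodes

def global_aggregate_linear(H, lam=1.0):
    """Same result in time linear in the number of nodes: the Woodbury
    identity moves the inverse into the d x d feature space, so the cost
    is O(n d^2) instead of O(n^3)."""
    d = H.shape[1]
    G = H.T @ H                                   # d x d Gram matrix
    M = G - G @ np.linalg.inv(lam * np.eye(d) + G) @ G
    return (H @ M) / lam
```

Checking the two routes against each other on random data confirms the reordering is exact rather than an approximation.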
Related papers
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophilic Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in-depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
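For reference, a plain (non-kernelized) Gumbel-Softmax is shown below; this is the standard operator, not NodeFormer's kernelized variant, whose contribution is precisely avoiding the all-pair score matrix.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Differentiable relaxation of drawing one categorical sample:
    perturb logits with Gumbel(0,1) noise and apply a temperature
    softmax, so gradients can flow through the discrete choice."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(1e-10, 1.0, size=np.shape(logits))
    g = -np.log(-np.log(u))                   # Gumbel(0, 1) noise
    z = (np.asarray(logits, dtype=float) + g) / tau
    z -= z.max(axis=-1, keepdims=True)        # numerical stability
    y = np.exp(z)
    return y / y.sum(axis=-1, keepdims=True)
```

Lower temperatures `tau` push the output toward a one-hot vector while keeping it differentiable.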
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Addressing Heterophily in Node Classification with Graph Echo State Networks [11.52174067809364]
We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models are able to achieve better or comparable accuracy with respect to most fully trained deep models.
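A minimal reservoir-style sketch of the idea follows: the message-passing weights are random and fixed rather than trained. The stability scaling via `rho` is an assumption for illustration, not the paper's exact condition.

```python
import numpy as np

def reservoir_embeddings(A, X, hidden=16, iters=50, rho=0.9, seed=0):
    """Untrained message passing: node states are iterated with FIXED
    random weights, x_v <- tanh(W_in u_v + W * sum of neighbor states).
    Rescaling W keeps the update roughly contractive so it settles."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W_in = rng.uniform(-1, 1, (d, hidden))
    W = rng.uniform(-1, 1, (hidden, hidden))
    spec = np.max(np.abs(np.linalg.eigvals(W))) * \
           np.max(np.abs(np.linalg.eigvals(A)))
    W *= rho / spec                            # stability rescaling
    H = np.zeros((n, hidden))
    for _ in range(iters):
        H = np.tanh(X @ W_in + A @ H @ W)      # untrained propagation
    return H
```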
arXiv Detail & Related papers (2023-05-14T19:42:31Z)
- LSGNN: Towards General Graph Neural Network in Node Classification by Local Similarity [59.41119013018377]
We propose to use the local similarity (LocalSim) to learn node-level weighted fusion, which can also serve as a plug-and-play module.
For better fusion, we propose a novel and efficient Initial Residual Difference Connection (IRDC) to extract more informative multi-hop information.
Our proposed method, namely Local Similarity Graph Neural Network (LSGNN), can offer comparable or superior state-of-the-art performance on both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2023-05-07T09:06:11Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [5.431036185361236]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat them as directed or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- GBK-GNN: Gated Bi-Kernel Graph Neural Networks for Modeling Both Homophily and Heterophily [24.742449127169586]
Graph Neural Networks (GNNs) are widely used on a variety of graph-based machine learning tasks.
For node-level tasks, GNNs have strong power to model the homophily property of graphs.
We propose a novel GNN model based on a bi-kernel feature transformation and a selection gate.
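A dense-matrix sketch of the bi-kernel idea is given below. The gate parameterization through the assumed vector `wg` is an illustrative simplification; the paper's gate is learned differently.

```python
import numpy as np

def bikernel_layer(A, X, Ws, Wd, wg):
    """Two feature transformations: Ws for edges judged homophilous,
    Wd for edges judged heterophilous, blended per edge by a sigmoid
    gate computed from the endpoint features."""
    s = X @ wg                                    # per-node gate score
    G = 1.0 / (1.0 + np.exp(-(s[:, None] + s[None, :])))  # edge gate
    return (A * G) @ (X @ Ws) + (A * (1.0 - G)) @ (X @ Wd)
```

When the two kernels coincide (`Ws == Wd`) the gate cancels and the layer reduces to ordinary neighborhood aggregation, which makes for a quick sanity check.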
arXiv Detail & Related papers (2021-10-29T13:44:09Z)
- Graph Pointer Neural Networks [11.656981519694218]
We present Graph Pointer Neural Networks (GPNN) to tackle the challenges mentioned above.
We leverage a pointer network to select the most relevant nodes from a large amount of multi-hop neighborhoods.
The GPNN significantly improves the classification performance of state-of-the-art methods.
arXiv Detail & Related papers (2021-10-03T10:18:25Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which shows that our model can effectively improve the performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN)
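A rough sketch of the group-normalization idea: the soft assignment matrix `S` is assumed here to be a learned logit matrix, and details differ from the paper's exact DGN formulation.

```python
import numpy as np

def group_norm(H, S, gamma=1.0, eps=1e-5):
    """Softly assign nodes to groups, normalize embeddings within each
    soft group, and add the normalized parts back to the input, so that
    different groups keep distinct statistics (counteracting
    over-smoothing as depth grows)."""
    e = np.exp(S - S.max(axis=1, keepdims=True))
    P = e / e.sum(axis=1, keepdims=True)          # soft assignments
    out = H.copy()
    for g in range(P.shape[1]):
        Hg = P[:, g:g + 1] * H                    # group-weighted rows
        mu, sd = Hg.mean(axis=0), Hg.std(axis=0)
        out = out + gamma * (Hg - mu) / (sd + eps)
    return out
```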
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.