LSGNN: Towards General Graph Neural Network in Node Classification by
Local Similarity
- URL: http://arxiv.org/abs/2305.04225v2
- Date: Tue, 20 Jun 2023 11:53:04 GMT
- Title: LSGNN: Towards General Graph Neural Network in Node Classification by
Local Similarity
- Authors: Yuhan Chen, Yihong Luo, Jing Tang, Liang Yang, Siya Qiu, Chuan Wang,
Xiaochun Cao
- Abstract summary: We propose to use the local similarity (LocalSim) to learn node-level weighted fusion, which can also serve as a plug-and-play module.
For better fusion, we propose a novel and efficient Initial Residual Difference Connection (IRDC) to extract more informative multi-hop information.
Our proposed method, namely Local Similarity Graph Neural Network (LSGNN), offers performance comparable or superior to the state of the art on both homophilic and heterophilic graphs.
- Score: 59.41119013018377
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterophily has been considered an issue that hurts the performance of
Graph Neural Networks (GNNs). To address this issue, some existing work uses a
graph-level weighted fusion of the information of multi-hop neighbors to
include more nodes with homophily. However, heterophily may differ among
nodes, which requires considering the local topology. Motivated by this, we
propose to use the local similarity (LocalSim) to learn node-level weighted
fusion, which can also serve as a plug-and-play module. For better fusion, we
propose a novel and efficient Initial Residual Difference Connection (IRDC) to
extract more informative multi-hop information. Moreover, we provide a
theoretical analysis of the effectiveness of LocalSim in representing node
homophily on synthetic graphs. Extensive evaluations on real benchmark
datasets show that our proposed method, namely Local Similarity Graph Neural
Network (LSGNN), offers performance comparable or superior to the state of the
art on both homophilic and heterophilic graphs. Meanwhile, the plug-and-play
module can significantly boost the performance of existing GNNs. Our code is
provided at https://github.com/draym28/LSGNN.
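The node-level weighted fusion described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cosine-similarity form of LocalSim and the simple weight normalization are assumptions, and the actual IRDC propagation is omitted.

```python
import numpy as np

def local_sim(x, neighbor_feats, eps=1e-12):
    """Illustrative LocalSim: mean cosine similarity between a node's
    features and its neighbors' features, a proxy for local homophily."""
    sims = []
    for nf in neighbor_feats:
        sims.append(float(x @ nf) / (np.linalg.norm(x) * np.linalg.norm(nf) + eps))
    return float(np.mean(sims))

def node_level_fusion(hop_reprs, hop_weights):
    """Fuse the multi-hop representations of one node with node-specific
    weights (in LSGNN these weights would be learned from LocalSim)."""
    w = np.asarray(hop_weights, dtype=float)
    w = w / w.sum()  # normalize the per-node weights
    return sum(wi * h for wi, h in zip(w, hop_reprs))
```

A node with low LocalSim (a heterophilic neighborhood) can thus downweight its 1-hop representation in favor of other hops, while a homophilic node keeps a heavy 1-hop weight.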
Related papers
- Strong Transitivity Relations and Graph Neural Networks [1.19658449368018]
Local neighborhoods play a crucial role in embedding generation in graph-based learning.
We introduce Transitivity Graph Neural Network (TransGNN), which exploits strong transitivity relations in addition to local node similarities.
We evaluate our model over several real-world datasets and show that it considerably improves the performance of several well-known GNN models.
arXiv Detail & Related papers (2024-01-01T13:53:50Z) - NDGGNET-A Node Independent Gate based Graph Neural Networks [6.155450481110693]
For nodes with sparse connectivity, it is difficult to obtain enough information through a single GNN layer.
In this thesis, we define a novel framework that allows a normal GNN model to accommodate more layers.
Experimental results show that our proposed model can effectively increase the model depth and perform well on several datasets.
arXiv Detail & Related papers (2022-05-11T08:51:04Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with
Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Position-based Hash Embeddings For Scaling Graph Neural Networks [8.87527266373087]
Graph Neural Networks (GNNs) compute node representations by taking into account the topology of the node's ego-network and the features of the ego-network's nodes.
When the nodes do not have high-quality features, GNNs learn an embedding layer to compute node embeddings and use them as input features.
To reduce the memory associated with this embedding layer, hashing-based approaches, commonly used in applications like NLP and recommender systems, can potentially be used.
We present approaches that take advantage of the nodes' position in the graph to dramatically reduce the memory required.
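The memory-reduction idea behind hashing-based embeddings can be sketched as follows. This is a generic sketch, not the paper's position-based scheme: the shared-table size and the use of two hash functions are illustrative assumptions.

```python
import numpy as np

def hashed_embedding(node_id, table, num_hashes=2):
    """Map a node id to a few rows of a small shared table and sum them,
    so embedding memory scales with the table size, not the node count."""
    rows = [hash((node_id, k)) % table.shape[0] for k in range(num_hashes)]
    return table[rows].sum(axis=0)

rng = np.random.default_rng(0)
table = rng.standard_normal((1024, 16))  # shared table: 1024 rows, dim 16
emb = hashed_embedding(123456, table)    # works for ids far beyond 1024
```

Because distinct ids can collide in the shared table, quality depends on the table size and number of hashes; position-aware schemes like the one in this paper aim to make such collisions less harmful.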
arXiv Detail & Related papers (2021-08-31T22:42:25Z) - On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC) metric, a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z) - Node Similarity Preserving Graph Convolutional Networks [51.520749924844054]
Graph Neural Networks (GNNs) explore the graph structure and node features by aggregating and transforming information within node neighborhoods.
We propose SimP-GCN that can effectively and efficiently preserve node similarity while exploiting graph structure.
We validate the effectiveness of SimP-GCN on seven benchmark datasets including three assortative and four disassortative graphs.
arXiv Detail & Related papers (2020-11-19T04:18:01Z) - Towards Deeper Graph Neural Networks with Differentiable Group
Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, differentiable group normalization (DGN).
arXiv Detail & Related papers (2020-06-12T07:18:02Z) - Non-Local Graph Neural Networks [60.28057802327858]
We propose a simple yet effective non-local aggregation framework with an efficient attention-guided sorting for GNNs.
We perform thorough experiments to analyze disassortative graph datasets and evaluate our non-local GNNs.
arXiv Detail & Related papers (2020-05-29T14:50:27Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
The Graph Neural Network (GNN) is a powerful model for learning representations and making predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
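The pairwise neighbor interactions described for BGNN above can be sketched as follows. This is a minimal form assuming a plain mean over all unordered neighbor pairs; BGNN's actual operator also involves learned transformations.

```python
import numpy as np

def pairwise_interaction(neighbors):
    """Mean of element-wise products h_i * h_j over neighbor pairs i < j,
    computed in O(n) via sum_{i<j} h_i h_j = ((sum h)^2 - sum h^2) / 2."""
    H = np.asarray(neighbors, dtype=float)
    n = H.shape[0]
    s = H.sum(axis=0)          # sum of neighbor representations
    sq = (H ** 2).sum(axis=0)  # sum of squared representations
    return (s * s - sq) / (2 * (n * (n - 1) / 2))
```

This bilinear term would be added to the usual weighted sum of neighbor representations, letting the aggregator capture interactions between pairs of neighbors rather than treating them independently.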
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.