Towards Deeper Graph Neural Networks with Differentiable Group
Normalization
- URL: http://arxiv.org/abs/2006.06972v1
- Date: Fri, 12 Jun 2020 07:18:02 GMT
- Title: Towards Deeper Graph Neural Networks with Differentiable Group
Normalization
- Authors: Kaixiong Zhou, Xiao Huang, Yuening Li, Daochen Zha, Rui Chen, Xia Hu
- Abstract summary: Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is a key issue that limits the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, differentiable group normalization (DGN).
- Score: 61.20639338417576
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs), which learn the representation of a
node by aggregating its neighbors, have become an effective computational tool
in downstream applications. Over-smoothing is one of the key issues that limit
the performance of GNNs as the number of layers increases, because the stacked
aggregators make node representations converge to indistinguishable vectors.
Several attempts have been made to tackle the issue by bringing linked node
pairs close and keeping unlinked pairs distinct. However, they often ignore
the intrinsic community structures, which leads to sub-optimal performance.
The representations of nodes within the same community/class need to be
similar to facilitate classification, while different classes are expected to
be separated in the embedding space. To bridge the gap, we introduce two
over-smoothing metrics and a novel technique: differentiable group
normalization (DGN). It normalizes nodes within the same group independently
to increase their smoothness, and separates node distributions among different
groups to significantly alleviate the over-smoothing issue. Experiments on
real-world datasets demonstrate that DGN makes GNN models more robust to
over-smoothing and achieves better performance with deeper GNNs.
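A minimal sketch of the idea, reconstructed from the abstract alone (the soft group assignment, the per-group BatchNorm, and the hyperparameters num_groups and skip_weight are illustrative assumptions, not the authors' reference implementation):

```python
# Sketch of differentiable group normalization (DGN): nodes are softly
# assigned to groups, each group is normalized independently, and the
# normalized group embeddings are added back with a small skip weight.
import torch
import torch.nn as nn


class DifferentiableGroupNorm(nn.Module):
    def __init__(self, hidden_dim: int, num_groups: int, skip_weight: float = 0.01):
        super().__init__()
        self.assign = nn.Linear(hidden_dim, num_groups)  # soft group assignment
        self.group_norms = nn.ModuleList(
            nn.BatchNorm1d(hidden_dim) for _ in range(num_groups)
        )
        self.skip_weight = skip_weight

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: [num_nodes, hidden_dim] embeddings from a GNN layer.
        s = torch.softmax(self.assign(h), dim=-1)  # [num_nodes, num_groups]
        out = h  # keep the original embeddings as a residual
        for g, norm in enumerate(self.group_norms):
            # Weight embeddings by soft membership, normalize within the group.
            out = out + self.skip_weight * norm(s[:, g].unsqueeze(-1) * h)
        return out
```

In use, such a module would sit after each GNN layer's aggregation, with num_groups set near the expected number of communities in the graph.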
Related papers
- Learning Personalized Scoping for Graph Neural Networks under Heterophily [3.475704621679017]
Heterophilous graphs, where dissimilar nodes tend to connect, pose a challenge for graph neural networks (GNNs).
We formalize personalized scoping as a separate scope classification problem that overcomes GNN overfitting in node classification.
We propose Adaptive Scope (AS), a lightweight approach that only participates in GNN inference.
arXiv Detail & Related papers (2024-09-11T04:13:39Z)
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophily Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods; a sketch of the idea appears after this list.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Clarify Confused Nodes via Separated Learning [4.282496716373314]
Graph neural networks (GNNs) have achieved remarkable advances in graph-oriented tasks.
Real-world graphs invariably contain a certain proportion of heterophilous nodes, challenging the homophily assumption of traditional GNNs.
We propose a new metric, termed Neighborhood Confusion (NC), to facilitate a more reliable separation of nodes.
arXiv Detail & Related papers (2023-06-04T07:26:20Z)
- LSGNN: Towards General Graph Neural Network in Node Classification by Local Similarity [59.41119013018377]
We propose to use the local similarity (LocalSim) to learn node-level weighted fusion, which can also serve as a plug-and-play module.
For better fusion, we propose a novel and efficient Initial Residual Difference Connection (IRDC) to extract more informative multi-hop information.
Our proposed method, namely Local Similarity Graph Neural Network (LSGNN), offers performance comparable or superior to the state of the art on both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2023-05-07T09:06:11Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Graph Pointer Neural Networks [11.656981519694218]
We present Graph Pointer Neural Networks (GPNN) to tackle these challenges.
We leverage a pointer network to select the most relevant nodes from large multi-hop neighborhoods.
The GPNN significantly improves the classification performance of state-of-the-art methods.
arXiv Detail & Related papers (2021-10-03T10:18:25Z)
- Position-based Hash Embeddings For Scaling Graph Neural Networks [8.87527266373087]
Graph Neural Networks (GNNs) compute node representations by taking into account the topology of the node's ego-network and the features of the ego-network's nodes.
When the nodes do not have high-quality features, GNNs learn an embedding layer to compute node embeddings and use them as input features.
To reduce the memory associated with this embedding layer, hashing-based approaches, commonly used in applications like NLP and recommender systems, can potentially be used.
We present approaches that take advantage of the nodes' position in the graph to dramatically reduce the memory required.
arXiv Detail & Related papers (2021-08-31T22:42:25Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator that augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework Bilinear Graph Neural Network (BGNN); it improves GNN representation ability with bilinear interactions between neighbor nodes (see the second sketch after this list).
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
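A hedged sketch of the degree-based stratification mentioned above (the two-bucket split, the degree threshold, and the dense mean-aggregation layer are illustrative assumptions, not the paper's exact setup):

```python
# Sketch of degree-based stratification: low- and high-degree nodes are
# routed through separate, independently learned weight matrices.
import torch
import torch.nn as nn


class DegreeStratifiedLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, degree_threshold: int = 5):
        super().__init__()
        self.threshold = degree_threshold  # assumed bucket boundary
        self.w_low = nn.Linear(in_dim, out_dim)   # weights for low-degree nodes
        self.w_high = nn.Linear(in_dim, out_dim)  # weights for high-degree nodes

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Mean-aggregate neighbor features (adj: dense [n, n] adjacency).
        deg = adj.sum(dim=1)
        agg = adj @ h / deg.clamp(min=1).unsqueeze(-1)
        # Route each node through the weight matrix of its degree group.
        high = (deg >= self.threshold).unsqueeze(-1)
        return torch.where(high, self.w_high(agg), self.w_low(agg))
```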
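And a hedged sketch of the bilinear neighbor-interaction aggregator from the BGNN entry (the mixing weight alpha and the dense-adjacency formulation are illustrative assumptions):

```python
# Sketch of a BGNN-style layer: the usual weighted-sum aggregation is
# augmented with element-wise products over pairs of neighbors, computed
# via the identity sum_{i<j} x_i*x_j = ((sum_i x_i)^2 - sum_i x_i^2) / 2.
import torch
import torch.nn as nn


class BilinearGraphLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int, alpha: float = 0.5):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.alpha = alpha  # assumed mixing weight between the two terms

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: [n, in_dim]; adj: dense [n, n] adjacency with self-loops.
        x = self.linear(h)
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        mean_agg = adj @ x / deg  # standard weighted-sum aggregation
        s = adj @ x               # sum of neighbor representations
        q = adj @ (x * x)         # sum of squared neighbor representations
        num_pairs = (deg * (deg - 1) / 2).clamp(min=1)
        pair_agg = (s * s - q) / (2 * num_pairs)  # mean pairwise interaction
        return (1 - self.alpha) * mean_agg + self.alpha * pair_agg
```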
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.