A Survey on Oversmoothing in Graph Neural Networks
- URL: http://arxiv.org/abs/2303.10993v1
- Date: Mon, 20 Mar 2023 10:21:29 GMT
- Title: A Survey on Oversmoothing in Graph Neural Networks
- Authors: T. Konstantin Rusch, Michael M. Bronstein, Siddhartha Mishra
- Abstract summary: Node features of graph neural networks (GNNs) tend to become more similar as the network depth increases.
We axiomatically define over-smoothing as the exponential convergence of suitable similarity measures on the node features.
We empirically demonstrate this behavior for several over-smoothing measures on different graphs.
We extend our definition of over-smoothing to the rapidly emerging field of continuous-time GNNs.
- Score: 27.898197360068146
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Node features of graph neural networks (GNNs) tend to become more similar
as the network depth increases. This effect is known as over-smoothing,
which we axiomatically define as the exponential convergence of suitable
similarity measures on the node features. Our definition unifies previous
approaches and gives rise to new quantitative measures of over-smoothing.
Moreover, we empirically demonstrate this behavior for several over-smoothing
measures on different graphs (small-, medium-, and large-scale). We also review
several approaches for mitigating over-smoothing and empirically test their
effectiveness on real-world graph datasets. Through illustrative examples, we
demonstrate that mitigating over-smoothing is a necessary but not sufficient
condition for building deep GNNs that are expressive on a wide range of graph
learning tasks. Finally, we extend our definition of over-smoothing to the
rapidly emerging field of continuous-time GNNs.
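The survey's definition can be made concrete with a node-similarity measure; one common choice in this literature is the Dirichlet energy of the node features. Below is a minimal numpy sketch, not the authors' code: it builds a random graph, runs a toy linear GCN, and prints the energy per layer, whose exponential decay in depth is exactly what the definition calls over-smoothing. The graph model, weight scale, and normalization are illustrative assumptions.

```python
import numpy as np

def dirichlet_energy(X, A):
    """Sum of squared feature differences over the edges of the graph,
    normalized by the number of nodes. Exponential decay of this
    quantity with depth is over-smoothing in the survey's sense."""
    rows, cols = np.nonzero(A)
    diffs = X[rows] - X[cols]                     # one row per directed edge
    return (diffs ** 2).sum() / (2 * X.shape[0])  # each edge counted twice

# Toy deep linear GCN on a random undirected graph.
rng = np.random.default_rng(0)
N, d = 20, 8
A = np.triu((rng.random((N, N)) < 0.2).astype(float), 1)
A = A + A.T                                       # symmetric, no self-loops
deg = A.sum(1) + 1.0                              # degrees after adding self-loops
A_hat = (A + np.eye(N)) / np.sqrt(np.outer(deg, deg))  # sym. normalized A + I

X = rng.standard_normal((N, d))
for k in range(10):
    print(f"layer {k}: energy = {dirichlet_energy(X, A):.3e}")
    X = A_hat @ X @ (rng.standard_normal((d, d)) * 0.1)  # linear GCN step
```

With the small weight scale chosen here the propagation map is contractive, so the printed energies decay geometrically; larger weights, nonlinearities, or residual terms change the rate, which is what the mitigation methods reviewed below target.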
Related papers
- Residual connections provably mitigate oversmoothing in graph neural networks [33.548465692402765]
Graph neural networks (GNNs) have achieved remarkable empirical success in processing and representing graph-structured data.
However, a significant challenge known as "oversmoothing" persists, where expressive features become nearly indistinguishable in deep GNNs.
In this work, we analyze the oversmoothing rates of deep GNNs with and without residual connections.
arXiv Detail & Related papers (2025-01-01T07:35:36Z)
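As an illustration of the mechanism analyzed in the entry above, here is a minimal numpy sketch of a GCN layer with a residual connection; the convex-combination form and the coefficient alpha are assumptions for illustration, not necessarily the paper's exact formulation.

```python
import numpy as np

def residual_gcn_layer(X, A_hat, W, alpha=0.5):
    """GCN update with a skip connection:
    X <- (1 - alpha) * X + alpha * relu(A_hat @ X @ W).
    Retaining part of the input features is the mechanism such analyses
    credit with slowing the oversmoothing rate of deep GNNs."""
    return (1 - alpha) * X + alpha * np.maximum(A_hat @ X @ W, 0.0)
```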
- What functions can Graph Neural Networks compute on random graphs? The role of Positional Encoding [0.0]
We aim to deepen the theoretical understanding of Graph Neural Networks (GNNs) on large graphs, with a focus on their expressive power.
Recently, several works showed that, on very general random graph models, GNNs converge to certain functions as the number of nodes grows.
arXiv Detail & Related papers (2023-05-24T07:09:53Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We identify and justify two weaknesses of implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and an inability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
arXiv Detail & Related papers (2022-10-15T18:18:55Z)
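The MGNNI entry above builds on implicit GNN layers, whose node states are the fixed point of an equilibrium equation rather than the output of a fixed stack of layers. A minimal sketch of that basic idea, assuming a simple tanh update solved by naive fixed-point iteration (MGNNI's multiscale extension is not shown):

```python
import numpy as np

def implicit_gnn_layer(X, A_hat, W, U, n_iter=100, tol=1e-6):
    """Node states Z solving Z = tanh(A_hat @ Z @ W + X @ U), found by
    fixed-point iteration (assumes W is scaled so the map is a
    contraction). The equilibrium couples each node with its whole
    connected component, so the effective range is not tied to depth."""
    Z = np.zeros((X.shape[0], W.shape[0]))
    bias = X @ U                        # input injection, fixed during iteration
    for _ in range(n_iter):
        Z_next = np.tanh(A_hat @ Z @ W + bias)
        if np.linalg.norm(Z_next - Z) < tol:
            break
        Z = Z_next
    return Z
```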
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to equip homogeneous GNNs with adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
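A sketch of the "one parameter per relation" idea from the RE-GNN entry above; the exact aggregation form and nonlinearity here are assumptions for illustration, not the paper's verbatim layer.

```python
import numpy as np

def re_gnn_layer(X, rel_adjs, rel_scalars, self_scalar, W):
    """Aggregate a heterogeneous graph by scaling each relation-specific
    adjacency matrix with a single scalar (and the self-loop with its
    own scalar), then apply one shared weight matrix, so a homogeneous
    GNN can weight edge types without per-relation weight matrices."""
    H = self_scalar * X
    for A_r, s_r in zip(rel_adjs, rel_scalars):
        H = H + s_r * (A_r @ X)
    return np.maximum(H @ W, 0.0)       # shared transform + ReLU
```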
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
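The entry above compares two graph construction methods common in scientific domains. A small numpy sketch of both, assuming 3-D point coordinates as input (as for molecular data):

```python
import numpy as np

def knn_graph(coords, k):
    """Symmetric adjacency of a K-nearest-neighbor graph built from
    point coordinates of shape (N, 3)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]            # k closest per node
    A = np.zeros_like(d)
    A[np.repeat(np.arange(len(coords)), k), nn.ravel()] = 1.0
    return np.maximum(A, A.T)                    # symmetrize

def fc_graph(n):
    """Fully-connected graph: every pair of distinct nodes is an edge."""
    return np.ones((n, n)) - np.eye(n)
```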
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning remains limited by the representational capacity of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
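For the hyperbolic GNN review above, the key geometric primitive is the distance in a model of hyperbolic space. A sketch of the geodesic distance in the Poincare ball model, one standard choice; specific hyperbolic GNN architectures differ in how they use it:

```python
import numpy as np

def poincare_distance(x, y, eps=1e-9):
    """Geodesic distance between two points strictly inside the unit
    ball, in the Poincare ball model of hyperbolic space. Distances
    grow rapidly near the boundary, which is what lets hyperbolic
    embeddings fit tree-like, power-law graphs with low distortion."""
    sq_diff = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + 2.0 * sq_diff / max(denom, eps))
```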
- Meta Graph Attention on Heterogeneous Graph with Node-Edge Co-evolution [44.90253939019069]
We present Coevolved Meta Graph Neural Network (CoMGNN), which applies meta graph attention to heterogeneous graphs with co-evolution of edges and node states.
We also propose a spatiotemporal variant, ST-CoMGNN, for modeling spatiotemporal patterns on nodes and edges.
arXiv Detail & Related papers (2020-10-09T13:19:39Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Stacking many such layers, however, tends to degrade performance; several recent studies attribute this deterioration to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
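A sketch of the adaptive multi-hop combination idea behind DAGNN, as described in the entry above; the sigmoid retention scores and the exact combination rule here are a simplified reading of the paper, not its verbatim method.

```python
import numpy as np

def dagnn_propagate(H0, A_hat, s, K=10):
    """Propagate transformed features H0 for K hops without further
    transformations, score every hop per node with a learnable vector s
    of shape (d, 1), and combine hops adaptively. Receptive fields grow
    with K while the number of nonlinear transformations stays fixed,
    which sidesteps the usual depth/over-smoothing trade-off."""
    hops, H = [H0], H0
    for _ in range(K):
        H = A_hat @ H                            # one more propagation hop
        hops.append(H)
    stack = np.stack(hops, axis=1)               # (N, K + 1, d)
    scores = 1.0 / (1.0 + np.exp(-(stack @ s)))  # sigmoid retention scores
    return (scores * stack).sum(axis=1)          # adaptive combination, (N, d)
```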
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information and is not responsible for any consequences of its use.