Oversquashing in GNNs through the lens of information contraction and graph expansion
- URL: http://arxiv.org/abs/2208.03471v1
- Date: Sat, 6 Aug 2022 08:44:39 GMT
- Title: Oversquashing in GNNs through the lens of information contraction and graph expansion
- Authors: Pradeep Kr. Banerjee, Kedar Karhadkar, Yu Guang Wang, Uri Alon, Guido Montúfar
- Abstract summary: We present a framework for analyzing oversquashing based on information contraction.
We propose a graph rewiring algorithm aimed at alleviating oversquashing.
- Score: 6.8222473597904845
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The quality of signal propagation in message-passing graph neural networks (GNNs) strongly influences their expressivity, as has been observed in recent works. In particular, for prediction tasks relying on long-range interactions,
recursive aggregation of node features can lead to an undesired phenomenon
called "oversquashing". We present a framework for analyzing oversquashing
based on information contraction. Our analysis is guided by a model of reliable
computation due to von Neumann that lends a new insight into oversquashing as
signal quenching in noisy computation graphs. Building on this, we propose a
graph rewiring algorithm aimed at alleviating oversquashing. Our algorithm
employs a random local edge flip primitive motivated by an expander graph
construction. We compare the spectral expansion properties of our algorithm
with those of an existing curvature-based non-local rewiring strategy. Synthetic
experiments show that while our algorithm in general has a slower rate of
expansion, it is overall computationally cheaper, preserves the node degrees
exactly and never disconnects the graph.
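The degree- and connectivity-preserving flip primitive described above can be illustrated with a short sketch. The following is a minimal Python/networkx rendering of a Mahlmann-Schindelhauer-style local edge flip, the expander construction the abstract alludes to; the paper's exact RLEF primitive and its acceptance conditions may differ.

```python
import random
import networkx as nx

def local_edge_flip(G: nx.Graph, rng: random.Random) -> bool:
    """Attempt one degree-preserving local edge flip on G (in place).

    Sketch of the flip primitive from the Mahlmann-Schindelhauer
    expander construction: pick an edge (u, v), a neighbor x of u and
    a neighbor y of v, and replace edges (u, x), (v, y) with (u, y),
    (v, x). All four node degrees are unchanged, and connectivity is
    preserved because the pivot edge (u, v) stays in place.
    Returns True if a flip was applied, False if it was rejected.
    """
    u, v = rng.choice(list(G.edges()))
    x_candidates = [x for x in G.neighbors(u) if x != v]
    y_candidates = [y for y in G.neighbors(v) if y != u]
    if not x_candidates or not y_candidates:
        return False
    x, y = rng.choice(x_candidates), rng.choice(y_candidates)
    # Reject flips that would create self-loops or parallel edges.
    if x == y or G.has_edge(u, y) or G.has_edge(v, x):
        return False
    G.remove_edges_from([(u, x), (v, y)])
    G.add_edges_from([(u, y), (v, x)])
    return True

# Usage: apply many flips; degrees and connectivity are invariant.
rng = random.Random(0)
G = nx.random_regular_graph(4, 50, seed=0)
for _ in range(1000):
    local_edge_flip(G, rng)
assert nx.is_connected(G)
```

Any path that used a removed edge (u, x) can be rerouted as u-v-x through the retained pivot edge, which is why the flip can never disconnect the graph.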
Related papers
- Over-Squashing in Riemannian Graph Neural Networks [1.6317061277457001]
Most graph neural networks (GNNs) are prone to the phenomenon of over-squashing.
Recent works have shown that the topology of the graph has the greatest impact on over-squashing.
We explore whether over-squashing can be mitigated through the embedding space of the GNN.
arXiv Detail & Related papers (2023-11-27T15:51:07Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
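For context, here is a minimal numpy sketch of the standard Gumbel-Softmax relaxation the summary refers to; NodeFormer's kernelized operator builds on this trick to avoid materializing the full all-pair attention matrix, which this sketch does not attempt.

```python
import numpy as np

def gumbel_softmax(logits: np.ndarray, tau: float = 0.5, rng=None) -> np.ndarray:
    """Differentiable relaxation of sampling from Categorical(softmax(logits)).

    Adds i.i.d. Gumbel(0, 1) noise to the logits and applies a
    temperature-scaled softmax; as tau -> 0 samples approach one-hot.
    """
    rng = rng or np.random.default_rng()
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = y - y.max(axis=-1, keepdims=True)      # numerical stability
    expy = np.exp(y)
    return expy / expy.sum(axis=-1, keepdims=True)

# Example: relaxed edge weights from one node to 5 candidate neighbors.
weights = gumbel_softmax(np.array([2.0, 0.5, 0.1, -1.0, 0.3]), tau=0.3)
```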
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC, which considers continuous random Gaussian noise in the graph Fourier domain.
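As a rough illustration of "noise in the graph Fourier domain", the sketch below transforms a vertex signal with the eigenbasis of the combinatorial Laplacian and perturbs it there; the paper's precise operator and normalization may differ.

```python
import numpy as np

def graph_fourier_basis(A: np.ndarray):
    """Eigendecomposition of the combinatorial Laplacian L = D - A.

    Columns of U are the graph Fourier modes; the eigenvalues play
    the role of frequencies.
    """
    L = np.diag(A.sum(axis=1)) - A
    eigvals, U = np.linalg.eigh(L)
    return eigvals, U

# A signal on the vertices (e.g., one user's ratings on an item graph).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([5.0, 3.0, 4.0, 1.0])

_, U = graph_fourier_basis(A)
x_hat = U.T @ x                                  # forward GFT
x_hat_noisy = x_hat + 0.1 * np.random.default_rng(0).normal(size=x_hat.shape)
x_noisy = U @ x_hat_noisy                        # inverse GFT
```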
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
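FoSR scores candidate edges with a first-order approximation of the spectral-gap change; the brute-force sketch below instead recomputes the gap of the normalized adjacency exactly for every candidate edge, which is slower but makes the objective explicit.

```python
import numpy as np

def spectral_gap(A: np.ndarray) -> float:
    """Spectral gap lambda_1 - lambda_2 of D^{-1/2} A D^{-1/2}."""
    A = np.asarray(A, dtype=float)
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    nz = d > 0
    d_inv_sqrt[nz] = d[nz] ** -0.5
    An = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals = np.sort(np.linalg.eigvalsh(An))[::-1]
    return eigvals[0] - eigvals[1]

def add_best_edge(A: np.ndarray) -> np.ndarray:
    """Greedily add the non-edge that most increases the spectral gap."""
    n = A.shape[0]
    best, best_gap = None, spectral_gap(A)
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j] == 0:
                A[i, j] = A[j, i] = 1          # tentatively add edge
                gap = spectral_gap(A)
                if gap > best_gap:
                    best, best_gap = (i, j), gap
                A[i, j] = A[j, i] = 0          # undo
    if best is not None:
        i, j = best
        A[i, j] = A[j, i] = 1
    return A
```

Recomputing an eigendecomposition per candidate makes this illustration expensive; FoSR's first-order update is what makes the rewiring computationally efficient.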
arXiv Detail & Related papers (2022-10-21T07:58:03Z)
- Expander Graph Propagation [0.0]
We propose an elegant approach based on propagating information over expander graphs.
We show that EGP is able to address all of the above concerns, while requiring minimal effort to set up.
We believe our analysis paves the way to a novel class of scalable methods to counter oversquashing in GNNs.
arXiv Detail & Related papers (2022-10-06T15:36:37Z)
- Understanding over-squashing and bottlenecks on graphs via curvature [17.359098638324546]
Over-squashing is a phenomenon where information from a number of $k$-hop neighbors that grows rapidly with $k$ must be compressed into fixed-size node representations.
We introduce a new edge-based curvature and prove that negatively curved edges are responsible for over-squashing.
We also propose and experimentally test a curvature-based rewiring method to alleviate the over-squashing.
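The paper's rewiring is driven by a new "Balanced Forman" curvature; the sketch below substitutes a simpler augmented-Forman-style proxy (degrees and triangle counts) merely to show how negatively curved, bottleneck-like edges can be identified as rewiring targets.

```python
import networkx as nx

def forman_curvature(G: nx.Graph, u, v) -> int:
    """Augmented Forman-style curvature of edge (u, v):
    4 - deg(u) - deg(v) + 3 * (#triangles through the edge).
    Very negative values indicate bottleneck-like edges.
    """
    triangles = len(set(G.neighbors(u)) & set(G.neighbors(v)))
    return 4 - G.degree(u) - G.degree(v) + 3 * triangles

def most_negative_edges(G: nx.Graph, k: int = 5):
    """Return the k most negatively curved edges (rewiring candidates)."""
    scored = sorted(G.edges(), key=lambda e: forman_curvature(G, *e))
    return scored[:k]

# Usage: a barbell graph's bridge is the canonical negatively curved edge.
G = nx.barbell_graph(10, 1)
print(most_negative_edges(G, k=3))
```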
arXiv Detail & Related papers (2021-11-29T13:27:56Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
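A rough PyTorch sketch of the decouple-then-gate idea: propagate features for K hops without intervening nonlinearities, then let a learned scalar gate weigh each receptive-field size per node. The gating form below follows the DAGNN paper only loosely.

```python
import torch
import torch.nn as nn

class AdaptiveHopAggregation(nn.Module):
    """Sketch of DAGNN-style aggregation: propagate K hops, then let a
    learned gate decide how much each receptive-field size contributes.
    """
    def __init__(self, hidden_dim: int, num_hops: int):
        super().__init__()
        self.num_hops = num_hops
        self.gate = nn.Linear(hidden_dim, 1)   # scores each hop's features

    def forward(self, H: torch.Tensor, A_norm: torch.Tensor) -> torch.Tensor:
        # H: (num_nodes, hidden_dim); A_norm: dense normalized adjacency.
        hops = [H]
        for _ in range(self.num_hops):
            hops.append(A_norm @ hops[-1])      # one more hop of smoothing
        stacked = torch.stack(hops, dim=1)      # (N, K+1, d)
        scores = torch.sigmoid(self.gate(stacked))   # (N, K+1, 1)
        return (scores * stacked).sum(dim=1)    # adaptive combination
```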
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
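A hedged sketch of the effective-resistance machinery behind such sparsifiers: resistances come from the Laplacian pseudoinverse, and redundant low-resistance edges are pruned. Proper spectral sparsifiers (Spielman-Srivastava style) sample edges with probability proportional to resistance rather than thresholding deterministically, and FastGAT's exact recipe may differ.

```python
import numpy as np

def effective_resistances(A: np.ndarray) -> dict:
    """R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v) for every edge (u, v)."""
    L = np.diag(A.sum(axis=1)) - A
    L_pinv = np.linalg.pinv(L)
    res = {}
    n = A.shape[0]
    for u in range(n):
        for v in range(u + 1, n):
            if A[u, v] != 0:
                res[(u, v)] = L_pinv[u, u] + L_pinv[v, v] - 2 * L_pinv[u, v]
    return res

def prune_low_resistance(A: np.ndarray, keep_fraction: float = 0.8) -> np.ndarray:
    """Keep the highest-resistance edges (the ones a sparsifier is most
    likely to retain); redundant, low-resistance edges are dropped."""
    res = effective_resistances(A)
    k = int(len(res) * keep_fraction)
    keep = sorted(res, key=res.get, reverse=True)[:k]
    B = np.zeros_like(A)
    for u, v in keep:
        B[u, v] = B[v, u] = A[u, v]
    return B
```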
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
- Structural Temporal Graph Neural Networks for Anomaly Detection in Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
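A minimal networkx sketch of the h-hop enclosing-subgraph extraction step, with a simple distance-pair labeling standing in for the paper's node labeling function, which may differ.

```python
import networkx as nx

def enclosing_subgraph(G: nx.Graph, u, v, h: int = 2):
    """Extract the h-hop subgraph enclosing target edge (u, v) and label
    each node by its shortest-path distances to the two endpoints."""
    nodes = set(nx.single_source_shortest_path_length(G, u, cutoff=h)) | \
            set(nx.single_source_shortest_path_length(G, v, cutoff=h))
    S = G.subgraph(nodes).copy()
    du = nx.single_source_shortest_path_length(S, u)
    dv = nx.single_source_shortest_path_length(S, v)
    # Nodes unreachable from an endpoint inside S get the sentinel h + 1.
    labels = {n: (du.get(n, h + 1), dv.get(n, h + 1)) for n in S}
    return S, labels
```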
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
- Hierarchical Representation Learning in Graph Neural Networks with Node Decimation Pooling [31.812988573924674]
In graph neural networks (GNNs), pooling operators compute local summaries of input graphs to capture their global properties.
We propose the Node Decimation Pooling (NDP), a pooling operator for GNNs that generates coarser graphs while preserving the overall graph topology.
NDP is more efficient than state-of-the-art graph pooling operators while reaching competitive performance on a significant variety of graph classification tasks.
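NDP coarsens the graph with Kron reduction after a spectral (MAXCUT-style) selection of which nodes to keep; the sketch below shows the Kron-reduction step itself, with a placeholder "keep every other node" selection standing in for the paper's decimation rule.

```python
import numpy as np

def kron_reduction(L: np.ndarray, keep: np.ndarray) -> np.ndarray:
    """Kron reduction of a graph Laplacian onto the `keep` nodes:
    L_red = L[keep,keep] - L[keep,drop] @ inv(L[drop,drop]) @ L[drop,keep].
    The result is again a Laplacian, and it preserves effective
    resistances among the kept nodes, so the coarse graph reflects the
    original topology.
    """
    n = L.shape[0]
    drop = np.setdiff1d(np.arange(n), keep)
    L_kk = L[np.ix_(keep, keep)]
    L_kd = L[np.ix_(keep, drop)]
    L_dd = L[np.ix_(drop, drop)]
    return L_kk - L_kd @ np.linalg.solve(L_dd, L_kd.T)

# Placeholder selection on a 5-cycle: keep nodes 0, 2, 4.
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
L_coarse = kron_reduction(L, keep=np.arange(0, 5, 2))
```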
arXiv Detail & Related papers (2019-10-24T21:42:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.