Understanding over-squashing and bottlenecks on graphs via curvature
- URL: http://arxiv.org/abs/2111.14522v1
- Date: Mon, 29 Nov 2021 13:27:56 GMT
- Title: Understanding over-squashing and bottlenecks on graphs via curvature
- Authors: Jake Topping, Francesco Di Giovanni, Benjamin Paul Chamberlain,
Xiaowen Dong, Michael M. Bronstein
- Abstract summary: Over-squashing is the distortion of information propagated from distant nodes, heuristically attributed to graph bottlenecks where the number of $k$-hop neighbors grows rapidly with $k$.
We introduce a new edge-based curvature and prove that negatively curved edges are responsible for over-squashing.
We also propose and experimentally test a curvature-based rewiring method to alleviate the over-squashing.
- Score: 17.359098638324546
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most graph neural networks (GNNs) use the message passing paradigm, in which
node features are propagated on the input graph. Recent works pointed to the
distortion of information flowing from distant nodes as a factor limiting the
efficiency of message passing for tasks relying on long-distance interactions.
This phenomenon, referred to as 'over-squashing', has been heuristically
attributed to graph bottlenecks where the number of $k$-hop neighbors grows
rapidly with $k$. We provide a precise description of the over-squashing
phenomenon in GNNs and analyze how it arises from bottlenecks in the graph. For
this purpose, we introduce a new edge-based combinatorial curvature and prove
that negatively curved edges are responsible for the over-squashing issue. We
also propose and experimentally test a curvature-based graph rewiring method to
alleviate the over-squashing.
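The paper's balanced Forman curvature also accounts for 4-cycles; as a rough illustration of the core idea that bridge-like edges are negatively curved, here is a minimal sketch using the simpler triangle-only Forman-style curvature (the adjacency-dict representation and function name are assumptions for illustration, not the paper's definition):

```python
# Illustrative sketch only: the paper's "balanced Forman curvature" also
# counts 4-cycles; this simpler triangle-based Forman-style curvature
# already shows why bridge-like edges come out negatively curved.
def forman_curvature(adj, u, v):
    """Triangle-based Forman-style curvature of edge (u, v)."""
    triangles = len(adj[u] & adj[v])  # common neighbours close a triangle
    return 4 - len(adj[u]) - len(adj[v]) + 3 * triangles

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2, 3).
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
print(forman_curvature(adj, 0, 1))  # triangle edge: 3 (positively curved)
print(forman_curvature(adj, 2, 3))  # bridge edge: -2 (negatively curved)
```

A curvature-based rewiring pass could then target the most negatively curved edges (the bridge (2, 3) in this toy graph) when adding supporting edges, which is the spirit of the method the abstract describes.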
Related papers
- The Effectiveness of Curvature-Based Rewiring and the Role of Hyperparameters in GNNs Revisited [0.7373617024876725]
Message passing is the dominant paradigm in Graph Neural Networks (GNNs).
Recent efforts have focused on graph rewiring techniques, which decouple the input graph, originating from the data, from the computational graph on which message passing is performed.
While over-squashing has been demonstrated on synthetic datasets, in this work we reevaluate the performance gains that curvature-based rewiring brings to real-world datasets.
arXiv Detail & Related papers (2024-07-12T16:03:58Z)
- Over-Squashing in Riemannian Graph Neural Networks [1.6317061277457001]
Most graph neural networks (GNNs) are prone to the phenomenon of over-squashing.
Recent works have shown that the topology of the graph has the greatest impact on over-squashing.
We explore whether over-squashing can be mitigated through the embedding space of the GNN.
arXiv Detail & Related papers (2023-11-27T15:51:07Z)
- BOURNE: Bootstrapped Self-supervised Learning Framework for Unified Graph Anomaly Detection [50.26074811655596]
We propose a novel unified graph anomaly detection framework based on bootstrapped self-supervised learning (named BOURNE).
By swapping the context embeddings between nodes and edges, we enable the mutual detection of node and edge anomalies.
BOURNE can eliminate the need for negative sampling, thereby enhancing its efficiency in handling large graphs.
arXiv Detail & Related papers (2023-07-28T00:44:57Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Rethinking Explaining Graph Neural Networks via Non-parametric Subgraph
Matching [68.35685422301613]
We propose a novel non-parametric subgraph matching framework, dubbed MatchExplainer, to explore explanatory subgraphs.
It couples the target graph with other counterpart instances and identifies the most crucial joint substructure by minimizing a node-correspondence-based distance.
Experiments on synthetic and real-world datasets show the effectiveness of MatchExplainer, which outperforms all state-of-the-art parametric baselines by significant margins.
arXiv Detail & Related papers (2023-01-07T05:14:45Z) - Revisiting Over-smoothing and Over-squashing Using Ollivier-Ricci
Curvature [11.592567773739411]
Our study reveals the key connection between the local graph geometry and the occurrence of over-smoothing and over-squashing.
We propose the Batch Ollivier-Ricci Flow, a novel rewiring algorithm capable of simultaneously addressing both over-smoothing and over-squashing.
arXiv Detail & Related papers (2022-11-28T21:21:31Z) - FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
arXiv Detail & Related papers (2022-10-21T07:58:03Z) - Oversquashing in GNNs through the lens of information contraction and
graph expansion [6.8222473597904845]
We present a framework for analyzing oversquashing based on information contraction.
We propose a graph rewiring algorithm aimed at alleviating oversquashing.
arXiv Detail & Related papers (2022-08-06T08:44:39Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
arXiv Detail & Related papers (2020-10-20T05:54:31Z) - Fast Graph Attention Networks Using Effective Resistance Based Graph
Sparsification [70.50751397870972]
FastGAT is a method that makes attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z) - Structural Temporal Graph Neural Networks for Anomaly Detection in
Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize gated recurrent units (GRUs) to capture temporal information for anomaly detection.
arXiv Detail & Related papers (2020-05-15T09:17:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.