Understanding Oversquashing in GNNs through the Lens of Effective
Resistance
- URL: http://arxiv.org/abs/2302.06835v2
- Date: Tue, 6 Jun 2023 03:57:37 GMT
- Title: Understanding Oversquashing in GNNs through the Lens of Effective
Resistance
- Authors: Mitchell Black and Zhengchao Wan and Amir Nayyeri and Yusu Wang
- Abstract summary: We develop an algorithm to identify edges to be added to an input graph to minimize the total effective resistance, thereby alleviating oversquashing.
We provide empirical evidence of the effectiveness of our total effective resistance based rewiring strategies for improving the performance of GNNs.
- Score: 9.640594614636047
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Message passing graph neural networks (GNNs) are a popular learning
architecture for graph-structured data. However, one problem GNNs experience
is oversquashing, where a GNN has difficulty sending information between
distant nodes. Understanding and mitigating oversquashing has recently received
significant attention from the research community. In this paper, we continue
this line of work by analyzing oversquashing through the lens of the effective
resistance between nodes in the input graph. Effective resistance intuitively
captures the "strength" of connection between two nodes by paths in the
graph, and has a rich literature spanning many areas of graph theory. We
propose to use total effective resistance as a bound of the total amount of
oversquashing in a graph and provide theoretical justification for its use. We
further develop an algorithm to identify edges to be added to an input graph to
minimize the total effective resistance, thereby alleviating oversquashing. We
provide empirical evidence of the effectiveness of our total effective
resistance based rewiring strategies for improving the performance of GNNs.
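The quantities in the abstract can be sketched concretely. Below is a minimal NumPy sketch (not the authors' implementation; function names are illustrative): the effective resistance between nodes u and v is (e_u - e_v)^T L^+ (e_u - e_v), where L^+ is the pseudoinverse of the graph Laplacian, and the total effective resistance equals n * trace(L^+). The greedy routine mirrors the idea of adding the non-edge that most reduces total effective resistance, using brute-force recomputation for clarity.

```python
import numpy as np

def laplacian(adj):
    """Combinatorial Laplacian L = D - A of a weighted adjacency matrix."""
    return np.diag(adj.sum(axis=1)) - adj

def effective_resistance(adj, u, v):
    """R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v)."""
    Lp = np.linalg.pinv(laplacian(adj))
    e = np.zeros(len(adj))
    e[u], e[v] = 1.0, -1.0
    return e @ Lp @ e

def total_effective_resistance(adj):
    """Sum of R(u, v) over all node pairs, equal to n * trace(L^+)."""
    n = len(adj)
    return n * np.trace(np.linalg.pinv(laplacian(adj)))

def best_edge_to_add(adj):
    """Greedily pick the non-edge whose addition most reduces
    total effective resistance (brute force, for illustration only)."""
    n = len(adj)
    best, best_tot = None, np.inf
    for i in range(n):
        for j in range(i + 1, n):
            if adj[i, j] == 0:
                trial = adj.copy()
                trial[i, j] = trial[j, i] = 1.0
                tot = total_effective_resistance(trial)
                if tot < best_tot:
                    best, best_tot = (i, j), tot
    return best
```

On the path graph 0-1-2, for example, the two unit edges act as resistors in series, so R(0, 2) = 2, the total effective resistance is 1 + 1 + 2 = 4, and the only candidate edge to add is (0, 2).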
Related papers
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z) - Fast and Effective GNN Training with Linearized Random Spanning Trees [20.73637495151938]
We present a new effective and scalable framework for training GNNs in node classification tasks.
Our approach progressively refines the GNN weights on an extensive sequence of random spanning trees.
The sparse nature of these path graphs substantially lightens the computational burden of GNN training.
arXiv Detail & Related papers (2023-06-07T23:12:42Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Resisting Graph Adversarial Attack via Cooperative Homophilous
Augmentation [60.50994154879244]
Recent studies show that Graph Neural Networks are vulnerable and easily fooled by small perturbations.
In this work, we focus on the emerging but critical attack, namely, Graph Injection Attack.
We propose a general defense framework CHAGNN against GIA through cooperative homophilous augmentation of graph data and model.
arXiv Detail & Related papers (2022-11-15T11:44:31Z) - Anti-Symmetric DGN: a stable architecture for Deep Graph Networks [12.71306369339218]
We present Anti-Symmetric Deep Graph Networks (A-DGNs), a framework for stable and non-dissipative DGN design.
A-DGN yields improved performance and enables effective learning even when dozens of layers are used.
arXiv Detail & Related papers (2022-10-18T12:04:55Z) - Tackling Oversmoothing of GNNs with Contrastive Learning [35.88575306925201]
Graph neural networks (GNNs) combine the relational structure of graph data with representation learning capability.
Oversmoothing makes the final representations of nodes indiscriminative, thus deteriorating the node classification and link prediction performance.
We propose the Topology-guided Graph Contrastive Layer, named TGCL, which is the first de-oversmoothing method maintaining all three mentioned metrics.
arXiv Detail & Related papers (2021-10-26T15:56:16Z) - Uniting Heterogeneity, Inductiveness, and Efficiency for Graph
Representation Learning [68.97378785686723]
Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs.
The majority of GNNs are designed only for homogeneous graphs, leading to inferior adaptivity to the more informative heterogeneous graphs.
We propose a novel inductive, meta path-free message passing scheme that packs up heterogeneous node features with their associated edges from both low- and high-order neighbor nodes.
arXiv Detail & Related papers (2021-04-04T23:31:39Z) - Information Obfuscation of Graph Neural Networks [96.8421624921384]
We study the problem of protecting sensitive attributes by information obfuscation when learning with graph structured data.
We propose a framework to locally filter out pre-determined sensitive attributes via adversarial training with the total variation and the Wasserstein distance.
arXiv Detail & Related papers (2020-09-28T17:55:04Z) - Fast Graph Attention Networks Using Effective Resistance Based Graph
Sparsification [70.50751397870972]
FastGAT is a method to make attention based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
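The FastGAT entry above uses spectral sparsification, which is itself closely tied to effective resistance: in the Spielman-Srivastava scheme, edges are sampled with probability proportional to their weight times their effective resistance. Below is a simplified sketch of that sampling idea (a heuristic illustration, not FastGAT's actual pruning algorithm; it samples without replacement and omits the reweighting step of the full scheme, and all names are illustrative).

```python
import numpy as np

def sparsify_by_effective_resistance(adj, keep_frac=0.5, seed=0):
    """Keep a fraction of edges, sampled with probability proportional
    to weight * effective resistance (simplified, no reweighting)."""
    rng = np.random.default_rng(seed)
    n = len(adj)
    L = np.diag(adj.sum(axis=1)) - adj
    Lp = np.linalg.pinv(L)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if adj[i, j] > 0]
    scores = []
    for i, j in edges:
        e = np.zeros(n)
        e[i], e[j] = 1.0, -1.0
        # weight times effective resistance of this edge
        scores.append(adj[i, j] * (e @ Lp @ e))
    probs = np.array(scores) / sum(scores)
    k = max(1, int(keep_frac * len(edges)))
    chosen = rng.choice(len(edges), size=k, replace=False, p=probs)
    sparse = np.zeros_like(adj)
    for idx in chosen:
        i, j = edges[idx]
        sparse[i, j] = sparse[j, i] = adj[i, j]
    return sparse
```

High-resistance edges (e.g. bridges) are sampled preferentially, since removing them would disconnect or badly distort the graph; in a complete graph all edges have equal resistance, so sampling is uniform.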
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.