Boosting the Cycle Counting Power of Graph Neural Networks with
I$^2$-GNNs
- URL: http://arxiv.org/abs/2210.13978v3
- Date: Mon, 8 May 2023 20:45:45 GMT
- Title: Boosting the Cycle Counting Power of Graph Neural Networks with
I$^2$-GNNs
- Authors: Yinan Huang, Xingang Peng, Jianzhu Ma, Muhan Zhang
- Abstract summary: We propose I$^2$-GNNs to extend Subgraph MPNNs by assigning different identifiers for the root node and its neighbors in each subgraph.
I$^2$-GNNs' discriminative power is shown to be strictly stronger than Subgraph MPNNs and partially stronger than the 3-WL test.
To the best of our knowledge, it is the first linear-time GNN model that can count 6-cycles with theoretical guarantees.
- Score: 9.806910643086043
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Message Passing Neural Networks (MPNNs) are a widely used class of Graph
Neural Networks (GNNs). The limited representational power of MPNNs inspires
the study of provably powerful GNN architectures. However, knowing one model is
more powerful than another gives little insight about what functions they can
or cannot express. It is still unclear whether these models are able to
approximate specific functions such as counting certain graph substructures,
which is essential for applications in biology, chemistry and social network
analysis. Motivated by this, we propose to study the counting power of Subgraph
MPNNs, a recent and popular class of powerful GNN models that extract rooted
subgraphs for each node, assign the root node a unique identifier and encode
the root node's representation within its rooted subgraph. Specifically, we
prove that Subgraph MPNNs fail to count more-than-4-cycles at node level,
implying that node representations cannot correctly encode the surrounding
substructures like ring systems with more than four atoms. To overcome this
limitation, we propose I$^2$-GNNs to extend Subgraph MPNNs by assigning
different identifiers for the root node and its neighbors in each subgraph.
I$^2$-GNNs' discriminative power is shown to be strictly stronger than Subgraph
MPNNs and partially stronger than the 3-WL test. More importantly, I$^2$-GNNs
are proven capable of counting all 3, 4, 5 and 6-cycles, covering common
substructures like benzene rings in organic chemistry, while still keeping
linear complexity. To the best of our knowledge, it is the first linear-time
GNN model that can count 6-cycles with theoretical guarantees. We validate its
counting power in cycle counting tasks and demonstrate its competitive
performance in molecular prediction benchmarks.
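To make the mechanism concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of the subgraph extraction and node-marking step the abstract describes, using networkx. The hop count `k`, the function name, and the integer identifiers 0/1/2 are illustrative assumptions.

```python
# Illustrative sketch only: rooted-subgraph extraction with root/neighbor marking,
# as described in the abstract. Not the authors' code.
import networkx as nx

def i2_style_subgraphs(G: nx.Graph, k: int = 3):
    """For each root node, extract its k-hop rooted subgraph and, for every
    neighbor of the root, emit a copy in which the root and that neighbor
    carry two distinct identifiers (all remaining nodes share a default id)."""
    marked = []
    for root in G.nodes:
        sub = nx.ego_graph(G, root, radius=k)      # k-hop rooted subgraph
        for nbr in G.neighbors(root):
            labels = {v: 0 for v in sub.nodes}     # default identifier
            labels[root] = 1                       # identifier for the root
            labels[nbr] = 2                        # distinct identifier for the neighbor
            marked.append((sub, labels))
    return marked
```

Each (subgraph, labels) pair would then be processed by an ordinary MPNN. A plain Subgraph MPNN corresponds to dropping the inner loop and marking only the root, which is exactly the variant the abstract proves cannot count cycles longer than 4 at the node level.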
Related papers
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs).
S$^2$GNNs combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z) - Distance-Restricted Folklore Weisfeiler-Leman GNNs with Provable Cycle
Counting Power [27.929167547127207]
The ability of graph neural networks (GNNs) to count certain graph substructures, especially cycles, is important.
We propose a novel class of GNNs -- $d$-Distance-Restricted FWL(2) GNNs.
Our model is the most efficient GNN model to date that can count up to 6-cycles.
arXiv Detail & Related papers (2023-09-10T06:13:29Z) - An Efficient Subgraph GNN with Provable Substructure Counting Power [32.44487589958533]
We investigate how the ability to count substructures enhances the representation power of graph neural networks (GNNs).
Recent advances have seen the adoption of subgraph GNNs, which partition an input graph into numerous subgraphs, subsequently applying GNNs to each to augment the graph's overall representation.
Despite their ability to identify various substructures, subgraph GNNs are hindered by significant computational and memory costs.
arXiv Detail & Related papers (2023-03-19T05:35:59Z) - Nested Graph Neural Networks [20.28732862748783]
The success of graph neural networks (GNNs) in graph classification is closely related to the Weisfeiler-Lehman (1-WL) algorithm.
We propose Nested Graph Neural Networks (NGNNs) to represent a graph with rooted subgraphs instead of rooted subtrees.
arXiv Detail & Related papers (2021-10-25T18:22:24Z) - Ego-GNNs: Exploiting Ego Structures in Graph Neural Networks [12.97622530614215]
We show that Ego-GNNs are capable of recognizing closed triangles, which is essential given the prominence of transitivity in real-world graphs.
arXiv Detail & Related papers (2021-07-22T23:42:23Z) - Theoretically Improving Graph Neural Networks via Anonymous Walk Graph
Kernels [25.736529232578178]
Graph neural networks (GNNs) have achieved tremendous success in graph mining.
MPGNNs, as the prevailing type of GNNs, have been theoretically shown unable to distinguish, detect or count many graph substructures.
We propose GSKN, a GNN model with a theoretically stronger ability to distinguish graph structures.
arXiv Detail & Related papers (2021-04-07T08:50:34Z) - Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks (ID-GNNs) with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z) - Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs' ability to preserve graph structures.
arXiv Detail & Related papers (2020-06-08T02:47:38Z) - Can Graph Neural Networks Count Substructures? [53.256112515435355]
We study the power of graph neural networks (GNNs) via their ability to count attributed graph substructures.
We distinguish between two types of substructure counting: induced-subgraph-count and subgraph-count, and establish both positive and negative answers for popular GNN architectures.
arXiv Detail & Related papers (2020-02-10T18:53:30Z)
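Several of the abstracts above make provable claims about node-level cycle counting (3- up to 6-cycles). When reproducing such experiments, ground-truth counts for the short cycles follow from standard closed-form identities on powers of the adjacency matrix; the helper below is a hedged, standalone sketch based on those identities, not code from any of the papers listed.

```python
# Ground-truth node-level 3- and 4-cycle counts from adjacency-matrix powers.
# Standard graph-theory identities; not taken from any paper above.
import numpy as np
import networkx as nx

def node_cycle_counts(G: nx.Graph):
    A = nx.to_numpy_array(G)
    deg = A.sum(axis=1)
    A3 = A @ A @ A
    A4 = A3 @ A
    # 3-cycles through node i: closed 3-walks / 2 (each triangle is traversed in two directions)
    tri = np.diag(A3) / 2.0
    # 4-cycles through node i: subtract backtracking closed 4-walks, then halve:
    # i->j->i->k->i contributes deg_i^2, and i->j->k->j->i contributes sum_j A_ij * (deg_j - 1)
    quad = (np.diag(A4) - deg ** 2 - A @ (deg - 1)) / 2.0
    return tri, quad

tri, quad = node_cycle_counts(nx.cycle_graph(4))
# tri  -> [0., 0., 0., 0.]   (no triangles in C4)
# quad -> [1., 1., 1., 1.]   (every node lies on exactly one 4-cycle)
```

Analogous, longer closed-form expressions exist for 5- and 6-cycles, which is what makes node-level counting of those substructures a well-defined target for the models discussed above.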