Nested Graph Neural Networks
- URL: http://arxiv.org/abs/2110.13197v1
- Date: Mon, 25 Oct 2021 18:22:24 GMT
- Title: Nested Graph Neural Networks
- Authors: Muhan Zhang, Pan Li
- Abstract summary: The success of graph neural networks (GNNs) in graph classification is closely related to the Weisfeiler-Lehman (1-WL) algorithm.
We propose Nested Graph Neural Networks (NGNNs) to represent a graph with rooted subgraphs instead of rooted subtrees.
- Score: 20.28732862748783
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The success of graph neural networks (GNNs) in graph classification is closely
related to the Weisfeiler-Lehman (1-WL) algorithm. By iteratively aggregating
neighboring node features to a center node, both 1-WL and GNNs obtain a node
representation that encodes a rooted subtree around the center node. These
rooted subtree representations are then pooled into a single representation for
the whole graph. However, rooted subtrees have limited expressiveness for
representing non-tree graphs. To address this, we propose Nested
Graph Neural Networks (NGNNs). NGNN represents a graph with rooted subgraphs
instead of rooted subtrees, so that two graphs sharing many identical subgraphs
(rather than subtrees) tend to have similar representations. The key is to make
each node representation encode a subgraph around it rather than a subtree. To
achieve this, NGNN extracts a local subgraph around each node and applies a
base GNN to each subgraph to learn a subgraph representation. The whole-graph
representation is then obtained by pooling these subgraph representations. We
provide a rigorous theoretical analysis showing that NGNN is strictly more
powerful than 1-WL. In particular, we prove that NGNN can discriminate almost
all r-regular graphs, on which 1-WL always fails. Moreover, unlike other more
powerful GNNs, NGNN introduces only a constant-factor higher time complexity
than standard GNNs. NGNN is a plug-and-play framework that can be combined with
various base GNNs. We test NGNN with different base GNNs on several benchmark
datasets. NGNN uniformly improves their performance and achieves highly
competitive results on all datasets.
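The pipeline described in the abstract (extract a rooted subgraph per node, encode it with a base GNN, then pool twice) is straightforward to prototype. Below is a minimal sketch using PyTorch Geometric; the two-hop radius, the GCN base model, and mean pooling are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal NGNN-style sketch (not the authors' code): each node's representation
# is obtained by running a base GNN on its rooted k-hop subgraph and pooling
# that subgraph; the graph representation pools all node representations.
import torch
from torch_geometric.nn import GCNConv
from torch_geometric.utils import k_hop_subgraph

class NestedGNN(torch.nn.Module):
    def __init__(self, in_dim, hid_dim, num_hops=2):
        super().__init__()
        self.num_hops = num_hops
        self.conv1 = GCNConv(in_dim, hid_dim)
        self.conv2 = GCNConv(hid_dim, hid_dim)

    def encode_subgraph(self, x, edge_index):
        h = self.conv1(x, edge_index).relu()
        h = self.conv2(h, edge_index)
        return h.mean(dim=0)  # pool subgraph nodes -> one node representation

    def forward(self, x, edge_index):
        node_reps = []
        for v in range(x.size(0)):
            # Rooted subgraph around node v, with nodes relabeled to 0..k-1.
            subset, sub_edge_index, _, _ = k_hop_subgraph(
                v, self.num_hops, edge_index,
                relabel_nodes=True, num_nodes=x.size(0))
            node_reps.append(self.encode_subgraph(x[subset], sub_edge_index))
        # Pool node (= subgraph) representations into a whole-graph representation.
        return torch.stack(node_reps).mean(dim=0)
```

For example, `NestedGNN(in_dim=8, hid_dim=16)(torch.randn(10, 8), torch.randint(0, 10, (2, 30)))` returns a 16-dimensional graph embedding. Since each rooted subgraph has bounded size, the loop adds roughly a constant-factor overhead over running the base GNN once, consistent with the complexity claim in the abstract.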
Related papers
- Boosting the Cycle Counting Power of Graph Neural Networks with I$^2$-GNNs [9.806910643086043]
We propose I$^2$-GNNs to extend Subgraph MPNNs by assigning distinct identifiers to the root node and its neighbors in each subgraph (a sketch of this marking follows the entry).
I$^2$-GNNs' discriminative power is shown to be strictly stronger than that of Subgraph MPNNs and partially stronger than the 3-WL test.
To the best of our knowledge, it is the first linear-time GNN model that can count 6-cycles with theoretical guarantees.
arXiv Detail & Related papers (2022-10-22T09:40:00Z)
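The identifier scheme described above can be illustrated with a small helper. This is a hedged sketch of one plausible reading (one-hot marks appended to the subgraph's node features); the function name and the exact labeling are assumptions, not the paper's code.

```python
# Hypothetical I^2-GNN-style marking: inside a rooted subgraph, append a
# 2-dim one-hot flag distinguishing the root and a designated neighbor.
import torch

def mark_root_and_neighbor(x, root, neighbor):
    """x: [n, d] subgraph node features; root/neighbor are local indices.

    Returns [n, d + 2] features with identifier flags appended."""
    flags = torch.zeros(x.size(0), 2)
    flags[root, 0] = 1.0       # distinct identifier for the root node
    flags[neighbor, 1] = 1.0   # distinct identifier for the marked neighbor
    return torch.cat([x, flags], dim=1)
```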
- Geodesic Graph Neural Network for Efficient Graph Representation Learning [34.047527874184134]
We propose an efficient GNN framework called Geodesic GNN (GDGNN), which injects conditional relationships between nodes into the model without labeling (a sketch follows the entry).
Conditioned on the geodesic representations, GDGNN is able to generate node, link, and graph representations that carry much richer structural information than plain GNNs.
arXiv Detail & Related papers (2022-10-06T02:02:35Z)
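One plausible instance of "conditioning on geodesic representations" is to pool node embeddings along a shortest path between two nodes. The sketch below is my reading of the summary above, not GDGNN's actual architecture; `geodesic_link_rep` is a hypothetical helper.

```python
# Hedged sketch: represent a link (u, v) by pooling the GNN embeddings of the
# nodes on a shortest (geodesic) path between u and v.
import networkx as nx
import torch

def geodesic_link_rep(graph: nx.Graph, node_emb: torch.Tensor, u: int, v: int):
    """node_emb: [num_nodes, d], rows aligned with node ids 0..num_nodes-1."""
    path = nx.shortest_path(graph, u, v)   # one geodesic between u and v
    return node_emb[torch.tensor(path)].mean(dim=0)
```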
- From Stars to Subgraphs: Uplifting Any GNN with Local Structure Awareness [23.279464786779787]
We introduce a general framework to uplift any MPNN to be more expressive.
Our framework is strictly more powerful than 1- and 2-WL, and no less powerful than 3-WL.
Our method sets new state-of-the-art performance by large margins for several well-known graph ML tasks.
arXiv Detail & Related papers (2021-10-07T19:08:08Z)
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message-passing Graph Neural Networks, named Identity-aware GNNs (ID-GNNs), with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
- The Surprising Power of Graph Neural Networks with Random Node Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI); see the sketch after this entry.
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
arXiv Detail & Related papers (2020-10-02T19:53:05Z)
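Random node initialization itself is a one-line idea: resample random features and concatenate them to the node inputs on every forward pass, so structurally identical nodes become distinguishable. A minimal sketch follows; the dimension `rni_dim` is an illustrative hyperparameter, not a value from the paper.

```python
# Minimal RNI sketch: append freshly sampled random features to node inputs.
import torch

def add_random_node_init(x: torch.Tensor, rni_dim: int = 8) -> torch.Tensor:
    """x: [num_nodes, d] -> [num_nodes, d + rni_dim], resampled each call."""
    noise = torch.randn(x.size(0), rni_dim)  # fresh randomness per forward pass
    return torch.cat([x, noise], dim=1)
```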
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient because they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of structural features for graph representation learning (a sketch follows this entry).
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
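Shortest-path distance is a canonical instance of a distance encoding; the sketch below appends each node's hop distance to a target node set as an extra feature. This is a hedged illustration (the cap `max_dist` and the min-over-targets choice are assumptions), not the paper's implementation, which covers a broader class of distance measures.

```python
# Hedged DE sketch: append shortest-path-distance features to node features.
import networkx as nx
import torch

def spd_encoding(graph, x, targets, max_dist=10.0):
    """graph: networkx graph with nodes labeled 0..n-1; x: [n, d] features;
    targets: node set to measure distances from. Returns [n, d + 1]."""
    dists = []
    for v in graph.nodes:
        reachable = [nx.shortest_path_length(graph, v, t)
                     for t in targets if nx.has_path(graph, v, t)]
        # Cap the distance of unreachable or far-away nodes at max_dist.
        dists.append(min(min(reachable, default=max_dist), max_dist))
    return torch.cat([x, torch.tensor(dists, dtype=x.dtype).unsqueeze(1)], dim=1)
```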
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Graphon Neural Networks and the Transferability of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) rely on graph convolutions to extract local features from network data.
We introduce graphon NNs as limit objects of GNNs and prove a bound on the difference between the output of a GNN and its limit graphon-NN.
This result establishes a tradeoff between discriminability and transferability of GNNs.
arXiv Detail & Related papers (2020-06-05T16:41:08Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework the Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes (the aggregator is sketched below).
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
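The pairwise-interaction aggregator described in the BGNN entry can be computed without an explicit double loop over neighbor pairs, using the identity sum_{i<j} h_i ⊙ h_j = ((sum_i h_i)^2 − sum_i h_i^2) / 2 applied element-wise. The sketch below is a hedged illustration of that trick; the pair-count normalization and the absence of learned weights are simplifying assumptions, not the paper's exact layer.

```python
# Hedged sketch of a bilinear neighbor-interaction aggregator.
import torch

def bilinear_aggregate(neighbor_h: torch.Tensor) -> torch.Tensor:
    """Average of element-wise products over all neighbor pairs.

    neighbor_h: [k, d] (already transformed) neighbor representations, k >= 2."""
    s = neighbor_h.sum(dim=0)            # sum_i h_i
    sq = (neighbor_h ** 2).sum(dim=0)    # sum_i h_i * h_i
    k = neighbor_h.size(0)
    # sum over pairs i < j of h_i * h_j, averaged over the k*(k-1)/2 pairs
    return 0.5 * (s * s - sq) / (k * (k - 1) / 2)
```

A BGNN-style layer would combine this term with the usual weighted-sum aggregation of neighbor features.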
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.