Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting
- URL: http://arxiv.org/abs/2006.09252v3
- Date: Mon, 5 Jul 2021 13:22:05 GMT
- Title: Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting
- Authors: Giorgos Bouritsas, Fabrizio Frasca, Stefanos Zafeiriou, Michael M.
Bronstein
- Abstract summary: "Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
- Score: 63.04999833264299
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While Graph Neural Networks (GNNs) have achieved remarkable results in a
variety of applications, recent studies exposed important shortcomings in their
ability to capture the structure of the underlying graph. It has been shown
that the expressive power of standard GNNs is bounded by the Weisfeiler-Leman
(WL) graph isomorphism test, from which they inherit proven limitations such as
the inability to detect and count graph substructures. On the other hand, there
is significant empirical evidence, e.g. in network science and bioinformatics,
that substructures are often intimately related to downstream tasks. To this
end, we propose "Graph Substructure Networks" (GSN), a topologically-aware
message passing scheme based on substructure encoding. We theoretically analyse
the expressive power of our architecture, showing that it is strictly more
expressive than the WL test, and provide sufficient conditions for
universality. Importantly, we do not attempt to adhere to the WL hierarchy;
this allows us to retain multiple attractive properties of standard GNNs such
as locality and linear network complexity, while being able to disambiguate
even hard instances of graph isomorphism. We perform an extensive experimental
evaluation on graph classification and regression tasks and obtain
state-of-the-art results in diverse real-world settings including molecular
graphs and social networks. The code is publicly available at
https://github.com/gbouritsas/graph-substructure-networks.
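For intuition, here is a minimal sketch of the GSN idea, not the authors' released implementation: precompute subgraph-isomorphism counts for each node (triangle counts here, via networkx) and concatenate them to the input features inside an otherwise standard sum-aggregation message-passing step. The layer, the choice of substructure, and the feature dimensions below are illustrative assumptions.

```python
# Minimal GSN-style sketch (illustrative, not the authors' code): inject
# per-node substructure counts into message passing as extra features.
import networkx as nx
import numpy as np

def triangle_count_features(G: nx.Graph) -> np.ndarray:
    """Per-node triangle participation counts: one simple vertex-level structural feature."""
    counts = nx.triangles(G)  # {node: number of triangles containing the node}
    return np.array([[counts[v]] for v in sorted(G.nodes())], dtype=float)

def gsn_like_layer(G: nx.Graph, x: np.ndarray, struct: np.ndarray) -> np.ndarray:
    """One sum-aggregation message-passing step over [features || structural counts]."""
    h = np.concatenate([x, struct], axis=1)               # structural identity breaks WL ties
    A = nx.to_numpy_array(G, nodelist=sorted(G.nodes()))
    return np.maximum(A @ h + h, 0.0)                     # sum over neighbours + self, then ReLU

G = nx.karate_club_graph()
x = np.ones((G.number_of_nodes(), 4))                     # dummy input features
print(gsn_like_layer(G, x, triangle_count_features(G)).shape)  # (34, 5)
```

Because the counts are computed once as a preprocessing step, the layer keeps the locality and linear message-passing complexity of a standard GNN while distinguishing nodes that 1-WL colours identically.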
Related papers
- SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
arXiv Detail & Related papers (2024-04-21T13:11:59Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive node embedding representation.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Uplifting the Expressive Power of Graph Neural Networks through Graph Partitioning [3.236774847052122]
We study the expressive power of graph neural networks through the lens of graph partitioning.
We introduce a novel GNN architecture, namely Graph Partitioning Neural Networks (GPNNs).
arXiv Detail & Related papers (2023-12-14T06:08:35Z)
- Generalization Limits of Graph Neural Networks in Identity Effects Learning [12.302336258860116]
Graph Neural Networks (GNNs) have emerged as a powerful tool for data-driven learning on various graph domains.
We establish new generalization properties and fundamental limits of GNNs in the context of learning so-called identity effects.
Our study is motivated by the need to understand the capabilities of GNNs when performing simple cognitive tasks.
arXiv Detail & Related papers (2023-06-30T20:56:38Z)
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Theoretically Improving Graph Neural Networks via Anonymous Walk Graph Kernels [25.736529232578178]
Graph neural networks (GNNs) have achieved tremendous success in graph mining.
Message-passing GNNs (MPGNNs), the prevailing type of GNN, have been theoretically shown to be unable to distinguish, detect, or count many graph substructures.
We propose GSKN, a GNN model with a theoretically stronger ability to distinguish graph structures.
arXiv Detail & Related papers (2021-04-07T08:50:34Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structure-aware features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
- Can Graph Neural Networks Count Substructures? [53.256112515435355]
We study the power of graph neural networks (GNNs) via their ability to count attributed graph substructures.
We distinguish between two types of substructure counting: induced-subgraph-count and subgraph-count, and establish both positive and negative answers for popular GNN architectures.
arXiv Detail & Related papers (2020-02-10T18:53:30Z)
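To make the distinction concrete, consider the 3-vertex path inside a triangle: it occurs three times as a subgraph (pick any vertex as the midpoint) but zero times as an induced subgraph, since every vertex triple of the triangle also carries the closing edge. A small networkx check of this (my illustration, not code from the paper):

```python
# Toy check of subgraph-count vs induced-subgraph-count on a triangle.
import networkx as nx
from networkx.algorithms import isomorphism

G = nx.complete_graph(3)   # the triangle
H = nx.path_graph(3)       # the 3-vertex path

# Each copy of H is found once per automorphism of H (|Aut(P3)| = 2), so divide it out.
aut = sum(1 for _ in isomorphism.GraphMatcher(H, H).isomorphisms_iter())

# Subgraph count: the edges of H must appear in G (extra edges in G are allowed).
subgraph = sum(1 for _ in isomorphism.GraphMatcher(G, H).subgraph_monomorphisms_iter()) // aut

# Induced count: the non-edges of H must be absent in G as well.
induced = sum(1 for _ in isomorphism.GraphMatcher(G, H).subgraph_isomorphisms_iter()) // aut

print(subgraph, induced)  # 3 0
```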
This list is automatically generated from the titles and abstracts of the papers in this site.