The expressive power of pooling in Graph Neural Networks
- URL: http://arxiv.org/abs/2304.01575v3
- Date: Thu, 12 Oct 2023 19:55:24 GMT
- Title: The expressive power of pooling in Graph Neural Networks
- Authors: Filippo Maria Bianchi, Veronica Lachi
- Abstract summary: We study how graph pooling affects the expressiveness of a Graph Neural Network (GNN).
We derive sufficient conditions for a pooling operator to fully preserve the expressive power of the MP layers before it.
We introduce an experimental setup to verify empirically the expressive power of a GNN equipped with pooling layers.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In Graph Neural Networks (GNNs), hierarchical pooling operators generate
local summaries of the data by coarsening the graph structure and the vertex
features. While considerable attention has been devoted to analyzing the
expressive power of message-passing (MP) layers in GNNs, a study on how graph
pooling affects the expressiveness of a GNN is still lacking. Additionally,
despite the recent advances in the design of pooling operators, there is not a
principled criterion to compare them. In this work, we derive sufficient
conditions for a pooling operator to fully preserve the expressive power of the
MP layers before it. These conditions serve as a universal and theoretically
grounded criterion for choosing among existing pooling operators or designing
new ones. Based on our theoretical findings, we analyze several existing
pooling operators and identify those that fail to satisfy the expressiveness
conditions. Finally, we introduce an experimental setup to verify empirically
the expressive power of a GNN equipped with pooling layers, in terms of its
capability to perform a graph isomorphism test.
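The abstract's notion of hierarchical pooling coarsening both the graph structure and the vertex features can be illustrated with a generic cluster-based pooling sketch. This is not the paper's specific operator; the assignment matrix `S`, the function name, and the toy graph are all illustrative assumptions:

```python
import numpy as np

def coarsen(A, X, S):
    """Pool an N-node graph down to K super-nodes.

    A: (N, N) adjacency matrix
    X: (N, F) node features
    S: (N, K) cluster-assignment matrix (each row assigns a node to clusters)
    """
    X_pool = S.T @ X      # (K, F) summarized features per super-node
    A_pool = S.T @ A @ S  # (K, K) coarsened connectivity
    return A_pool, X_pool

# Toy example: a 4-node path graph pooled into 2 clusters.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.eye(4)                      # one-hot node features
S = np.array([[1, 0],              # nodes 0 and 1 -> cluster 0
              [1, 0],
              [0, 1],              # nodes 2 and 3 -> cluster 1
              [0, 1]], dtype=float)
A_pool, X_pool = coarsen(A, X, S)
print(A_pool.shape, X_pool.shape)  # (2, 2) (2, 4)
```

In this sketch, diagonal entries of `A_pool` count (twice) the edges inside a cluster and off-diagonal entries count edges between clusters, which is the sense in which the coarsened graph summarizes the original structure.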
Related papers
- Graph Classification with GNNs: Optimisation, Representation and Inductive Bias [0.6445605125467572]
We argue that such equivalence ignores the accompanying optimization issues and does not provide a holistic view of the GNN learning process.
We prove theoretically that the message-passing layers have a tendency to search for either discriminative subgraphs or a collection of discriminative nodes dispersed across the graph.
arXiv Detail & Related papers (2024-08-17T18:15:44Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - Higher-order Clustering and Pooling for Graph Neural Networks [77.47617360812023]
Graph Neural Networks achieve state-of-the-art performance on a plethora of graph classification tasks.
HoscPool is a clustering-based graph pooling operator that captures higher-order information hierarchically.
We evaluate HoscPool on graph classification tasks and its clustering component on graphs with ground-truth community structure.
arXiv Detail & Related papers (2022-09-02T09:17:10Z) - Representation Power of Graph Neural Networks: Improved Expressivity via
Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z) - Breaking the Expressive Bottlenecks of Graph Neural Networks [26.000304641965947]
Recently, the Weisfeiler-Lehman (WL) graph isomorphism test has been used to measure the expressiveness of graph neural networks (GNNs).
In this paper, we improve the expressiveness by exploring powerful aggregators.
arXiv Detail & Related papers (2020-12-14T02:36:46Z) - Counting Substructures with Higher-Order Graph Neural Networks:
Possibility and Impossibility Results [58.277290855841976]
We study the tradeoffs between computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z) - Rethinking pooling in graph neural networks [12.168949038217889]
We study the interplay between convolutional layers and the subsequent pooling ones.
Contrary to common belief, local pooling is not responsible for the success of GNNs on relevant and widely used benchmarks.
arXiv Detail & Related papers (2020-10-22T03:48:56Z) - The Surprising Power of Graph Neural Networks with Random Node
Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI).
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
arXiv Detail & Related papers (2020-10-02T19:53:05Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z) - SimPool: Towards Topology Based Graph Pooling with Structural Similarity
Features [0.0]
This paper proposes two main contributions. The first is a differentiable module that computes structural similarity features from the adjacency matrix.
The second is the integration of these features into a revisited DiffPool layer (arXiv:1806.08804), yielding a pooling layer referred to as SimPool.
Experimental results demonstrate that, as part of an end-to-end Graph Neural Network architecture, SimPool computes node cluster assignments that better reflect graph locality.
arXiv Detail & Related papers (2020-06-03T12:51:57Z)
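Several of the papers above measure expressiveness against the Weisfeiler-Leman test. A minimal 1-WL color-refinement sketch (an illustrative toy, not any paper's implementation; graphs are assumed to be given as adjacency lists):

```python
from collections import Counter

def wl_colors(adj, rounds=None):
    """Run 1-WL color refinement; return the multiset of stable node colors.

    Two graphs with different outputs are certainly non-isomorphic;
    equal outputs are inconclusive (the known limit of the 1-WL test).
    """
    n = len(adj)
    colors = [0] * n  # uniform initial coloring
    for _ in range(rounds or n):
        # New signature = own color plus sorted multiset of neighbor colors.
        signatures = [(colors[v], tuple(sorted(colors[u] for u in adj[v])))
                      for v in range(n)]
        # Relabel signatures with small integers for a canonical palette.
        palette = {s: i for i, s in enumerate(sorted(set(signatures)))}
        new_colors = [palette[s] for s in signatures]
        if new_colors == colors:  # refinement has stabilized
            break
        colors = new_colors
    return Counter(colors)

# A triangle and a 3-node path get different color histograms,
# so 1-WL distinguishes them.
triangle = [[1, 2], [0, 2], [0, 1]]
path = [[1], [0, 2], [1]]
print(wl_colors(triangle) != wl_colors(path))  # True
```

This is the yardstick behind claims such as "strictly more expressive than the WL test": an architecture clears it when it distinguishes some pair of graphs on which the refinement above stabilizes to identical histograms.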
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.