From Relational Pooling to Subgraph GNNs: A Universal Framework for More
Expressive Graph Neural Networks
- URL: http://arxiv.org/abs/2305.04963v1
- Date: Mon, 8 May 2023 18:00:50 GMT
- Title: From Relational Pooling to Subgraph GNNs: A Universal Framework for More
Expressive Graph Neural Networks
- Authors: Cai Zhou, Xiyuan Wang, Muhan Zhang
- Abstract summary: We show how to assign labels to nodes to improve the expressive power of message passing neural networks.
We show, both theoretically and experimentally, that our method is universally compatible and capable of improving the expressivity of any base GNN model.
Our $k,l$-GNNs achieve superior performance on many synthetic and real-world datasets.
- Score: 8.121462458089141
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Relational pooling (RP) is a framework for building more expressive
and permutation-invariant graph neural networks. However, there is limited
understanding of the exact expressivity gains RP provides and of its
connection with the Weisfeiler-Lehman (WL) hierarchy. Starting from RP, we
propose to explicitly assign labels to nodes as additional features to improve
the expressive power of message passing neural networks. The method is then
extended to higher-dimensional WL, leading to a novel $k,l$-WL algorithm, a
more general framework than $k$-WL. Theoretically, we analyze the expressivity
of $k,l$-WL with respect to $k$ and $l$ and unify it with a large class of
subgraph GNNs. Complexity reduction methods are also systematically discussed
to build powerful and practical $k,l$-GNN instances. We theoretically and
experimentally show that our method is universally compatible and capable of
improving the expressivity of any base GNN model. Our $k,l$-GNNs achieve
superior performance on many synthetic and real-world datasets, which verifies
the effectiveness of our framework.
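As a concrete illustration of the node-labeling idea behind RP, below is a minimal sketch that appends one-hot node identifiers as extra features and averages a base model over permuted labelings. The function names and the averaging scheme are illustrative assumptions, not the authors' implementation.

```python
import itertools
import numpy as np

def rp_readout(adj, feats, model, num_perms=None):
    """Average a base model over node labelings (full enumeration is n!,
    so sampling a few permutations is the practical choice)."""
    n = adj.shape[0]
    if num_perms is None:
        perms = itertools.permutations(range(n))       # exact but factorial
    else:
        perms = (tuple(np.random.permutation(n)) for _ in range(num_perms))
    outputs = []
    for perm in perms:
        labels = np.eye(n)[list(perm)]                 # one-hot node identifiers
        outputs.append(model(adj, np.hstack([feats, labels])))
    return np.mean(outputs, axis=0)                    # invariant in expectation

# Example: a permutation-sensitive "model" stub that just sums its inputs.
print(rp_readout(np.ones((3, 3)), np.ones((3, 2)),
                 lambda A, X: X.sum(), num_perms=2))
```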
Related papers
- Improving the Expressiveness of $K$-hop Message-Passing GNNs by Injecting Contextualized Substructure Information [17.56609419806051]
We propose a substructure encoding function to uplift the expressive power of any $K$-hop message-passing GNN.
Our method is provably more powerful than previous works on $K$-hop graph neural networks and 1-WL subgraph GNNs.
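For context, a toy sketch of plain $K$-hop aggregation follows; it sums neighbor features per exact hop distance and omits the paper's contextualized substructure encoding, purely as an assumption-laden illustration.

```python
import numpy as np

def k_hop_aggregate(adj, feats, K=2):
    """Concatenate, per node, the summed features of its exact k-hop
    neighbors for k = 1..K (a plain K-hop aggregation, no substructures)."""
    n = adj.shape[0]
    A = (adj > 0).astype(int)
    power = np.eye(n, dtype=int)
    dist = np.full((n, n), -1)
    np.fill_diagonal(dist, 0)
    for k in range(1, K + 1):
        power = power @ A                    # number of length-k walks
        newly = (power > 0) & (dist < 0)     # pairs first connected at hop k
        dist[newly] = k
    hops = [(dist == k).astype(float) @ feats for k in range(1, K + 1)]
    return np.concatenate(hops, axis=1)

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])   # a path on 3 nodes
print(k_hop_aggregate(A, np.eye(3)))
```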
arXiv Detail & Related papers (2024-06-27T15:10:56Z) - MAG-GNN: Reinforcement Learning Boosted Graph Neural Network [68.60884768323739]
A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity and achieved great success.
This gain in expressivity, however, sacrifices the efficiency of GNNs, since all possible subgraphs must be enumerated.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem.
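A rough sketch of why sampling helps: score a few rooted subgraphs instead of all of them. The uniform sampling below stands in for MAG-GNN's learned RL policy and is only an assumption for illustration.

```python
import random
import networkx as nx

def sampled_rooted_subgraphs(G, num_samples=4, radius=1, seed=0):
    """Return (root, ego-subgraph) pairs for a few sampled root nodes,
    instead of one rooted subgraph per node as full enumeration would."""
    rng = random.Random(seed)
    roots = rng.sample(list(G.nodes), min(num_samples, G.number_of_nodes()))
    return [(r, nx.ego_graph(G, r, radius=radius)) for r in roots]

G = nx.cycle_graph(6)
for root, sub in sampled_rooted_subgraphs(G):
    print(root, sorted(sub.nodes))   # each sample: root plus its 1-hop ball
```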
arXiv Detail & Related papers (2023-10-29T20:32:21Z) - Equivariant Polynomials for Graph Neural Networks [38.15983687193912]
Graph Neural Networks (GNNs) are inherently limited in their expressive power.
This paper introduces an alternative power hierarchy based on the ability of GNNs to compute equivariant polynomials of a certain degree.
These enhanced GNNs demonstrate state-of-the-art results in experiments across multiple graph learning benchmarks.
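For intuition, here is a degree-2 permutation-equivariant polynomial of the adjacency matrix, under my reading of the hierarchy; the specific polynomial is an illustrative choice, not one from the paper.

```python
import numpy as np

# diag(A @ A) is a degree-2 polynomial in A that is permutation-equivariant:
# relabeling the nodes permutes the output vector in exactly the same way.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
print(np.diag(A @ A))   # [2. 1. 1.] -> closed 2-walks, i.e. node degrees

P = np.eye(3)[[1, 2, 0]]                      # a permutation matrix
print(np.diag(P @ A @ P.T @ (P @ A @ P.T)))   # same values, permuted
```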
arXiv Detail & Related papers (2023-02-22T18:53:38Z) - A Complete Expressiveness Hierarchy for Subgraph GNNs via Subgraph
Weisfeiler-Lehman Tests [30.558563815430418]
Subgraph GNNs have emerged as an important direction for developing expressive graph neural networks (GNNs).
This paper conducts a systematic study of general node-based subgraph GNNs through the lens of Subgraph Weisfeiler-Lehman tests (SWL).
We prove that any node-based subgraph GNN falls into one of six SWL equivalence classes, among which $\mathsf{SSWL}$ achieves the maximal expressive power.
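A minimal sketch of the node-based subgraph construction that SWL formalizes: one marked copy of the graph per node. The marking scheme here is an assumed simplification.

```python
import numpy as np

def node_marked_bag(adj, feats):
    """Yield one (adjacency, features-with-mark) copy per node, where the
    extra column marks the root node of that copy."""
    n = adj.shape[0]
    for v in range(n):
        mark = np.zeros((n, 1))
        mark[v] = 1.0                     # distinguish the root node v
        yield adj, np.hstack([feats, mark])

# A base GNN would run on every copy; pooling over the bag then gives the
# final graph representation.
```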
arXiv Detail & Related papers (2023-02-14T14:42:54Z) - Rethinking the Expressive Power of GNNs via Graph Biconnectivity [45.4674360883544]
We introduce a novel class of expressivity metrics via graph biconnectivity and highlight their importance in both theory and practice.
We introduce a principled and more efficient approach, called the Generalized Distance Weisfeiler-Lehman (GD-WL) test.
Practically, we show GD-WL can be implemented by a Transformer-like architecture that preserves expressiveness and enjoys full parallelizability.
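Only the distance ingredient is sketched below: an all-pairs shortest-path matrix that a Transformer-like model could consume as a relative attention bias. The helper name and usage are assumptions.

```python
import numpy as np
import networkx as nx

def distance_bias(G):
    """Dense matrix of shortest-path distances (inf where disconnected),
    usable as a relative attention bias after an embedding lookup."""
    n = G.number_of_nodes()
    D = np.full((n, n), np.inf)
    for u, lengths in nx.all_pairs_shortest_path_length(G):
        for v, d in lengths.items():
            D[u, v] = d
    return D

print(distance_bias(nx.path_graph(4)))
```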
arXiv Detail & Related papers (2023-01-23T15:58:59Z) - A Practical, Progressively-Expressive GNN [27.267362661067285]
Message passing neural networks (MPNNs) have become a dominant flavor of graph neural networks (GNNs) in recent years.
MPNNs come with notable limitations; they are at most as powerful as the 1-dimensional Weisfeiler-Leman (1-WL) test in distinguishing graphs in a graph isomorphism testing framework.
We propose the $(k,c)(\le)$-SETWL hierarchy with greatly reduced complexity from $k$-WL, achieved by moving from $k$-tuples of nodes to sets of $\le k$ nodes defined over $\le c$ connected components in the induced original graph.
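The complexity reduction can be made concrete by enumerating the objects the hierarchy colors. The sketch below lists node sets of size at most $k$ inducing at most $c$ connected components, as an illustrative reading of the construction.

```python
from itertools import combinations
import networkx as nx

def kc_sets(G, k=3, c=1):
    """Enumerate node sets of size <= k whose induced subgraph has <= c
    connected components (far fewer objects than ordered k-tuples)."""
    for size in range(1, k + 1):
        for S in combinations(G.nodes, size):
            if nx.number_connected_components(G.subgraph(S)) <= c:
                yield S

G = nx.path_graph(4)
print(list(kc_sets(G, k=2, c=1)))   # singletons plus actual edges only
```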
arXiv Detail & Related papers (2022-10-18T01:27:21Z) - Counting Substructures with Higher-Order Graph Neural Networks:
Possibility and Impossibility Results [58.277290855841976]
We study the tradeoffs between the computational cost and the expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
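A worked instance of subgraph counting beyond 1-WL: triangles from powers of the adjacency matrix. This classical identity is only an example of the counting tasks discussed, not the paper's model.

```python
import numpy as np

# Edges: 0-1, 0-2, 1-2, 2-3 -> exactly one triangle, on nodes {0, 1, 2}.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
triangles = np.trace(A @ A @ A) / 6   # each triangle yields 6 closed 3-walks
print(triangles)                      # 1.0
```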
arXiv Detail & Related papers (2020-12-06T03:42:54Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are inefficient as they cannot utilize the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structure-related features for graph representation learning.
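A minimal sketch of distance-based features, assuming DE reduces (for a given target node set) to shortest-path distances; the function name is illustrative.

```python
import numpy as np
import networkx as nx

def distance_encoding(G, target_set):
    """One extra feature per node: its shortest-path distance to the
    nearest node of target_set, breaking symmetries that fool plain MPNNs."""
    enc = np.full(G.number_of_nodes(), np.inf)
    for s in target_set:
        for v, d in nx.single_source_shortest_path_length(G, s).items():
            enc[v] = min(enc[v], d)
    return enc

print(distance_encoding(nx.cycle_graph(6), {0, 3}))   # [0. 1. 1. 0. 1. 1.]
```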
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
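A toy version of substructure-aware inputs: per-node triangle counts appended to the features of an otherwise standard MPNN. The choice of triangles and the feature layout are assumptions for illustration.

```python
import numpy as np
import networkx as nx

G = nx.complete_graph(4)
tri = nx.triangles(G)                          # {node: #triangles through it}
counts = np.array([tri[v] for v in G.nodes], dtype=float)
feats = np.ones((G.number_of_nodes(), 1))      # plain input features
feats_gsn = np.hstack([feats, counts[:, None]])
print(feats_gsn)   # every node of K4 lies in 3 triangles
```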
arXiv Detail & Related papers (2020-06-16T15:30:31Z) - Optimization and Generalization Analysis of Transduction through
Gradient Boosting and Application to Multi-scale Graph Neural Networks [60.22494363676747]
It is known that current graph neural networks (GNNs) are difficult to make deep due to the problem known as over-smoothing.
Multi-scale GNNs are a promising approach for mitigating the over-smoothing problem.
We derive the optimization and generalization guarantees of transductive learning algorithms that include multi-scale GNNs.
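A minimal sketch of the multi-scale idea, assuming representations from every propagation depth are concatenated; this is a generic jumping-knowledge-style combination, not the paper's boosting procedure.

```python
import numpy as np

def multi_scale(adj, feats, depth=3):
    """Concatenate node representations from every propagation depth so
    deep smoothing cannot erase the shallower, more local scales."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    h, scales = feats, [feats]
    for _ in range(depth):
        h = (adj @ h) / deg                    # mean-over-neighbors step
        scales.append(h)
    return np.concatenate(scales, axis=1)
```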
arXiv Detail & Related papers (2020-06-15T17:06:17Z)