Beyond Weisfeiler-Lehman: A Quantitative Framework for GNN Expressiveness
- URL: http://arxiv.org/abs/2401.08514v1
- Date: Tue, 16 Jan 2024 17:23:23 GMT
- Title: Beyond Weisfeiler-Lehman: A Quantitative Framework for GNN Expressiveness
- Authors: Bohang Zhang, Jingchu Gai, Yiheng Du, Qiwei Ye, Di He, Liwei Wang
- Abstract summary: Homomorphism expressivity measures the ability of GNN models to count graphs under homomorphism.
By examining four classes of prominent GNNs as case studies, we derive simple, unified, and elegant descriptions of their homomorphism expressivity.
Our results provide novel insights into a series of previous work, unify the landscape of different subareas in the community, and settle several open questions.
- Score: 35.409017863665575
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Designing expressive Graph Neural Networks (GNNs) is a fundamental topic in
the graph learning community. So far, GNN expressiveness has been primarily
assessed via the Weisfeiler-Lehman (WL) hierarchy. However, such an
expressivity measure has notable limitations: it is inherently coarse,
qualitative, and may not well reflect practical requirements (e.g., the ability
to encode substructures). In this paper, we introduce a unified framework for
quantitatively studying the expressiveness of GNN architectures, addressing all
the above limitations. Specifically, we identify a fundamental expressivity
measure termed homomorphism expressivity, which quantifies the ability of GNN
models to count graphs under homomorphism. Homomorphism expressivity offers a
complete and practical assessment tool: the completeness enables direct
expressivity comparisons between GNN models, while the practicality allows for
understanding concrete GNN abilities such as subgraph counting. By examining
four classes of prominent GNNs as case studies, we derive simple, unified, and
elegant descriptions of their homomorphism expressivity for both invariant and
equivariant settings. Our results provide novel insights into a series of
previous work, unify the landscape of different subareas in the community, and
settle several open questions. Empirically, extensive experiments on both
synthetic and real-world tasks verify our theory, showing that the practical
performance of GNN models aligns well with the proposed metric.
Related papers
- Towards Bridging Generalization and Expressivity of Graph Neural Networks [11.560730203511111]
We study the relationship between expressivity and generalization in graph neural networks (GNNs).
We introduce a novel framework that connects GNN generalization to the variance in graph structures they can capture.
We uncover a trade-off between intra-class concentration and inter-class separation, both of which are crucial for effective generalization.
arXiv Detail & Related papers (2024-10-14T00:31:16Z)
- Fine-Grained Expressive Power of Weisfeiler-Leman: A Homomorphism Counting Perspective [24.729126775414922]
We provide a theoretical framework to determine the homomorphism counting power of an arbitrary class of graph neural networks (GNNs).
As the considered design space is large enough to accommodate almost all known powerful GNNs, our result greatly extends all existing works, and may find its application in the automation of GNN model design.
arXiv Detail & Related papers (2024-10-04T15:36:48Z)
- On the Expressive Power of Graph Neural Networks [0.0]
By extending deep learning to graph-structured data, GNNs can solve a diverse set of tasks in fields including social science, chemistry, and medicine.
arXiv Detail & Related papers (2024-01-03T08:54:56Z)
- Generalization Limits of Graph Neural Networks in Identity Effects Learning [12.302336258860116]
Graph Neural Networks (GNNs) have emerged as a powerful tool for data-driven learning on various graph domains.
We establish new generalization properties and fundamental limits of GNNs in the context of learning so-called identity effects.
Our study is motivated by the need to understand the capabilities of GNNs when performing simple cognitive tasks.
arXiv Detail & Related papers (2023-06-30T20:56:38Z)
- A Complete Expressiveness Hierarchy for Subgraph GNNs via Subgraph Weisfeiler-Lehman Tests [30.558563815430418]
Subgraph neural networks (subgraph GNNs) have emerged as an important direction for developing expressive graph neural networks.
This paper conducts a systematic study of general node-based subgraph GNNs through the lens of Subgraph Weisfeiler-Lehman Tests (SWL).
We prove that any node-based subgraph GNN falls into one of six SWL equivalence classes, among which $\mathsf{SSWL}$ achieves the maximal expressive power; a sketch of the node-marking idea appears after this entry.
arXiv Detail & Related papers (2023-02-14T14:42:54Z)
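As a rough illustration of the node-based subgraph GNN idea referenced above (our own simplification, not the paper's SWL formalism): each node v induces a copy of the graph with v marked, each copy is processed by ordinary message passing, and the copies are pooled into one invariant representation.

```python
import numpy as np

def node_marked_copies(A):
    """One feature matrix per node v: a constant channel plus a v-indicator."""
    n = A.shape[0]
    copies = []
    for v in range(n):
        X = np.zeros((n, 2))
        X[:, 0] = 1.0   # shared base feature
        X[v, 1] = 1.0   # mark the root of this copy
        copies.append(X)
    return copies

def mp_layer(A, X):
    """One sum-aggregation message-passing step: h_v <- h_v + sum_u A[v,u] * h_u."""
    return X + A @ X

def subgraph_gnn(A, num_layers=2):
    reps = node_marked_copies(A)
    for _ in range(num_layers):
        reps = [mp_layer(A, X) for X in reps]
    pooled = [X.sum(axis=0) for X in reps]  # pool nodes within each copy
    return np.sum(pooled, axis=0)           # invariant readout over the n copies
```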
- Graph Neural Networks are Inherently Good Generalizers: Insights by Bridging GNNs and MLPs [71.93227401463199]
This paper pinpoints the major source of GNNs' performance gain to their intrinsic capability, by introducing an intermediate model class dubbed P(ropagational)MLP.
We observe that PMLPs consistently perform on par with (or even exceed) their GNN counterparts, while being much more efficient in training; a minimal sketch of the idea follows this entry.
arXiv Detail & Related papers (2022-12-18T08:17:32Z)
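A minimal sketch of the PMLP idea as described above, under our own assumptions (a generic normalized adjacency A_norm as the propagation operator, ReLU activations): the MLP is trained with edges ignored, and the same weights are reused at inference with parameter-free propagation inserted between layers.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def mlp_forward(X, weights):
    """Plain MLP on node features; graph edges are ignored entirely."""
    H = X
    for W in weights[:-1]:
        H = relu(H @ W)
    return H @ weights[-1]

def pmlp_forward(X, weights, A_norm):
    """Same trained weights, with parameter-free propagation inserted per layer."""
    H = X
    for W in weights[:-1]:
        H = relu((A_norm @ H) @ W)
    return (A_norm @ H) @ weights[-1]

# Toy usage: a 4-node path graph with self-loops and row-normalized adjacency.
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float) + np.eye(4)
A_norm = A / A.sum(axis=1, keepdims=True)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 5))
weights = [rng.normal(size=(5, 8)), rng.normal(size=(8, 2))]
print(mlp_forward(X, weights).shape, pmlp_forward(X, weights, A_norm).shape)  # (4, 2) (4, 2)
```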
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph; the algebraic fact behind this is sketched after this entry.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
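The algebraic fact behind the closed-path claim can be checked directly: the diagonal of A^k counts, for each node, the closed walks of length k that start and end there. This worked example is ours, not the paper's.

```python
import numpy as np

# One triangle (nodes 0, 1, 2) plus a pendant node 3 attached to node 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

closed_len3 = np.diag(np.linalg.matrix_power(A, 3))
print(closed_len3)              # [2. 2. 2. 0.] closed walks of length 3 per node
print(closed_len3.sum() / 6.0)  # 1.0 triangles in total (each counted 6 times)
```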
- The Surprising Power of Graph Neural Networks with Random Node Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI); see the sketch after this entry.
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
arXiv Detail & Related papers (2020-10-02T19:53:05Z)
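A minimal sketch of random node initialization as described above (our illustration; the dimension and distribution are placeholder choices): append fresh random channels to each node's features and leave the GNN architecture itself unchanged.

```python
import numpy as np

def add_random_features(X, dim=4, seed=None):
    """Append fresh random channels; the GNN consuming X is left unchanged."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(X.shape[0], dim))
    return np.concatenate([X, R], axis=1)

X = np.ones((5, 3))             # 5 nodes with identical original features
X_rni = add_random_features(X)  # now 5 x 7; structurally symmetric nodes now differ
```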
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
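A minimal sketch of the substructure-encoding idea behind GSN (our simplification: triangles only, counted algebraically; GSN itself supports general substructures via subgraph isomorphism counting):

```python
import numpy as np

def triangle_counts(A):
    """Node v lies on diag(A^3)[v] / 2 triangles (each closed walk has 2 directions)."""
    return np.diag(np.linalg.matrix_power(A, 3)) / 2.0

def gsn_features(A, X):
    """Append per-node substructure counts for any message-passing GNN to consume."""
    counts = triangle_counts(A).reshape(-1, 1)
    return np.concatenate([X, counts], axis=1)
```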
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs' ability to preserve graph structures; a minimal sketch follows this entry.
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
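As we read the Eigen-GNN idea, the plug-in concatenates leading eigenvectors of a graph matrix to the node features before any GNN layers. A minimal sketch using the adjacency matrix (an assumption on our part; the paper's exact choice of matrix and normalization may differ):

```python
import numpy as np

def eigen_features(A, d):
    """d leading eigenvectors of a symmetric graph matrix, by eigenvalue magnitude."""
    vals, vecs = np.linalg.eigh(A)
    order = np.argsort(-np.abs(vals))[:d]
    return vecs[:, order]

def eigen_gnn_input(A, X, d=8):
    d = min(d, A.shape[0])  # cannot take more eigenvectors than nodes
    return np.concatenate([X, eigen_features(A, d)], axis=1)
```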
- Node Masking: Making Graph Neural Networks Generalize and Scale Better [71.51292866945471]
Graph Neural Networks (GNNs) have received a lot of interest in recent times.
In this paper, we utilize some theoretical tools to better visualize the operations performed by state-of-the-art spatial GNNs.
We introduce a simple concept, Node Masking, that allows them to generalize and scale better; a minimal sketch follows this entry.
arXiv Detail & Related papers (2020-01-17T06:26:40Z)
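The abstract does not spell out the masking mechanism, so the following is only one plausible reading (a loud assumption on our part, not the paper's method): drop a random subset of nodes from message passing at each training step so the model cannot over-rely on any particular node.

```python
import numpy as np

def mask_nodes(A, p=0.1, seed=None):
    """Remove a random subset of nodes from propagation for one training step."""
    rng = np.random.default_rng(seed)
    keep = (rng.random(A.shape[0]) >= p).astype(float)
    M = np.diag(keep)
    return M @ A @ M  # masked nodes neither send nor receive messages
```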