On the Expressive Power of Geometric Graph Neural Networks
- URL: http://arxiv.org/abs/2301.09308v3
- Date: Sun, 3 Mar 2024 20:32:24 GMT
- Title: On the Expressive Power of Geometric Graph Neural Networks
- Authors: Chaitanya K. Joshi, Cristian Bodnar, Simon V. Mathis, Taco Cohen, Pietro Liò
- Abstract summary: We propose a geometric version of the WL test (GWL) for discriminating geometric graphs while respecting the underlying physical symmetries.
We unpack how key design choices influence geometric GNN expressivity.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The expressive power of Graph Neural Networks (GNNs) has been studied
extensively through the Weisfeiler-Leman (WL) graph isomorphism test. However,
standard GNNs and the WL framework are inapplicable for geometric graphs
embedded in Euclidean space, such as biomolecules, materials, and other
physical systems. In this work, we propose a geometric version of the WL test
(GWL) for discriminating geometric graphs while respecting the underlying
physical symmetries: permutations, rotations, reflections, and translations. We
use GWL to characterise the expressive power of geometric GNNs that are
invariant or equivariant to physical symmetries in terms of distinguishing
geometric graphs. GWL unpacks how key design choices influence geometric GNN
expressivity: (1) Invariant layers have limited expressivity as they cannot
distinguish one-hop identical geometric graphs; (2) Equivariant layers
distinguish a larger class of graphs by propagating geometric information
beyond local neighbourhoods; (3) Higher order tensors and scalarisation enable
maximally powerful geometric GNNs; and (4) GWL's discrimination-based
perspective is equivalent to universal approximation. Synthetic experiments
supplementing our results are available at
https://github.com/chaitjo/geometric-gnn-dojo
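To make design choices (1) and (2) concrete, here is a minimal sketch, not taken from the paper, of one round of invariant versus equivariant geometric message passing (assuming numpy; `invariant_layer`, `equivariant_layer`, and the toy update rules are ours for illustration). The invariant layer only sees distances inside a 1-hop neighbourhood, while the equivariant layer also carries vector features that rotate with the input:

```python
import numpy as np

def invariant_layer(pos, adj, h):
    """E(3)-invariant message passing: each node aggregates scalar
    messages built only from distances to its 1-hop neighbours."""
    out = h.copy()
    for i in range(len(h)):
        for j in range(len(h)):
            if adj[i, j]:
                out[i] += np.tanh(np.linalg.norm(pos[i] - pos[j]))  # toy scalar message
    return out

def equivariant_layer(pos, adj, h, v):
    """E(3)-equivariant message passing: nodes also carry vector features v.
    Scalars stay invariant; vectors rotate with the input coordinates."""
    h_out, v_out = h.copy(), v.copy()
    for i in range(len(h)):
        for j in range(len(h)):
            if adj[i, j]:
                rel = pos[i] - pos[j]                 # rotates with the input
                d = np.linalg.norm(rel)               # invariant scalar
                h_out[i] += np.tanh(d) + v[i] @ v[j]  # invariant update
                v_out[i] += np.tanh(d) * rel + v[j]   # equivariant update
    return h_out, v_out

# Sanity check: rotating the coordinates leaves the invariant features unchanged.
rng = np.random.default_rng(0)
pos, h = rng.normal(size=(4, 3)), np.zeros(4)
adj = np.ones((4, 4)) - np.eye(4)
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
assert np.allclose(invariant_layer(pos, adj, h), invariant_layer(pos @ q.T, adj, h))
```

Because the equivariant layer's vector features mix relative positions from neighbours, geometric information outside a node's immediate neighbourhood accumulates over stacked layers; the invariant layer has no such channel, which is the intuition behind point (1).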
Related papers
- Generalization of Geometric Graph Neural Networks
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a generalization gap between the optimal empirical risk and the optimal statistical risk of these GNNs.
The most important observation is that the generalization capability can be realized with a single large graph, rather than being limited by the size of the graph as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- On the Expressive Power of Sparse Geometric MPNNs
We study the expressive power of message-passing neural networks for geometric graphs.
We show that generic pairs of non-isomorphic geometric graphs can be separated by message-passing networks.
arXiv Detail & Related papers (2024-07-02T07:48:22Z)
- A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications
This paper presents a survey of data structures, models, and applications related to geometric GNNs.
We provide a unified view of existing models from the geometric message passing perspective.
We also summarize the applications as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
arXiv Detail & Related papers (2024-03-01T12:13:04Z)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems
Recent advances in computational modelling represent atomic systems as geometric graphs, with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- Is Distance Matrix Enough for Geometric Deep Learning?
We show that Vanilla DisGNN is geometrically incomplete.
We then propose $k$-DisGNNs, which can effectively exploit the rich geometry contained in the distance matrix.
Our $k$-DisGNNs achieve many new state-of-the-art results on MD17.
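As a quick illustration of the invariance that motivates distance-based models (a sketch of the general idea under our own naming, assuming numpy; not code from the paper): the pairwise distance matrix is unchanged by any rotation, reflection, or translation of the point cloud, although, as the entry notes, message passing over it alone is not geometrically complete.

```python
import numpy as np

def distance_matrix(pos):
    """Pairwise Euclidean distances: an E(3)-invariant of the point cloud."""
    diff = pos[:, None, :] - pos[None, :, :]
    return np.linalg.norm(diff, axis=-1)

rng = np.random.default_rng(0)
pos = rng.normal(size=(5, 3))

# Apply a random orthogonal transform (rotation/reflection) plus a translation.
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
moved = pos @ q.T + rng.normal(size=3)

assert np.allclose(distance_matrix(pos), distance_matrix(moved))
```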
arXiv Detail & Related papers (2023-02-11T16:54:20Z)
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
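The closed-path statistic referenced here has a classical linear-algebra reading: the diagonal of the $k$-th power of the adjacency matrix counts the closed walks of length $k$ starting and ending at each node. A minimal illustration (our toy example, assuming numpy):

```python
import numpy as np

# Adjacency matrix of a 4-cycle.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

# diag(A^k) counts closed walks of length k from each node; these are
# permutation-equivariant node statistics of the kind the entry describes.
for k in range(2, 5):
    print(k, np.diag(np.linalg.matrix_power(A, k)))
```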
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Geometrically Equivariant Graph Neural Networks: A Survey
We analyze and classify existing methods into three groups regarding how the message passing and aggregation in GNNs are represented.
We also summarize the benchmarks as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
arXiv Detail & Related papers (2022-02-15T07:12:21Z)
- MGNN: Graph Neural Networks Inspired by Distance Geometry Problem
Graph Neural Networks (GNNs) have emerged as a prominent research topic in the field of machine learning.
In this paper, we propose a GNN model inspired by the congruent-insensitivity property of the classifiers in the classification phase of GNNs.
We extensively evaluate the effectiveness of our model through experiments conducted on both synthetic and real-world datasets.
arXiv Detail & Related papers (2022-01-31T04:15:42Z)
- Graph Neural Networks with Learnable Structural and Positional Representations
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, as in Transformers.
We observe a performance increase for molecular datasets, from 2.87% up to 64.14% when considering learnable PE for both GNN classes.
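A common way to instantiate such PE, sketched below under the assumption of Laplacian-eigenvector initialisation (one standard choice in this line of work; `laplacian_pe` is our illustrative helper, not the paper's API), is to concatenate the first few non-trivial eigenvectors of the normalised graph Laplacian with the node features at the input layer:

```python
import numpy as np

def laplacian_pe(A, k):
    """First k non-trivial eigenvectors of the normalised graph Laplacian.
    Assumes an undirected graph with no isolated nodes; eigenvector signs
    are arbitrary, one motivation for making PE learnable."""
    deg = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    L = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L)           # eigenvalues in ascending order
    return vecs[:, 1:k + 1]               # skip the trivial (eigenvalue-0) mode

# Inject PE at the input layer by concatenating with node features h.
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])
h = np.ones((4, 2))                       # toy node features
x = np.concatenate([h, laplacian_pe(A, 2)], axis=1)
print(x.shape)  # (4, 4)
```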
arXiv Detail & Related papers (2021-10-15T05:59:15Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)