Weisfeiler and Lehman Go Topological: Message Passing Simplicial
Networks
- URL: http://arxiv.org/abs/2103.03212v1
- Date: Thu, 4 Mar 2021 18:35:24 GMT
- Title: Weisfeiler and Lehman Go Topological: Message Passing Simplicial
Networks
- Authors: Cristian Bodnar, Fabrizio Frasca, Yu Guang Wang, Nina Otter, Guido
Montúfar, Pietro Liò, Michael Bronstein
- Abstract summary: We propose Message Passing Simplicial Networks (MPSNs),
models that perform message passing on simplicial complexes (SCs).
We empirically support our theoretical claims by showing that MPSNs can distinguish challenging strongly regular graphs for which GNNs fail.
- Score: 3.0017241250121383
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The pairwise interaction paradigm of graph machine learning has predominantly
governed the modelling of relational systems. However, graphs alone cannot
capture the multi-level interactions present in many complex systems and the
expressive power of such schemes was proven to be limited. To overcome these
limitations, we propose Message Passing Simplicial Networks (MPSNs), a class of
models that perform message passing on simplicial complexes (SCs) - topological
objects generalising graphs to higher dimensions. To theoretically analyse the
expressivity of our model we introduce a Simplicial Weisfeiler-Lehman (SWL)
colouring procedure for distinguishing non-isomorphic SCs. We relate the power
of SWL to the problem of distinguishing non-isomorphic graphs and show that SWL
and MPSNs are strictly more powerful than the WL test and not less powerful
than the 3-WL test. We deepen the analysis by comparing our model with
traditional graph neural networks with ReLU activations in terms of the number
of linear regions of the functions they can represent. We empirically support
our theoretical claims by showing that MPSNs can distinguish challenging
strongly regular graphs for which GNNs fail and, when equipped with orientation
equivariant layers, they can improve classification accuracy in oriented SCs
compared to a GNN baseline. Additionally, we implement a library for message
passing on simplicial complexes that we envision to release in due course.
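The SWL procedure described above generalises Weisfeiler-Lehman colour refinement from graph vertices to simplices of every dimension, aggregating colours over face and coface relations. The following is a minimal, illustrative sketch of one such refinement scheme, not the paper's actual algorithm or released library; the initialisation by dimension, the choice of aggregated neighbourhoods, and the hashing are simplifying assumptions.

```python
from itertools import combinations

def faces(simplex):
    """All codimension-1 faces of a simplex, as frozensets of vertices."""
    return [frozenset(c) for c in combinations(sorted(simplex), len(simplex) - 1)]

def swl_refine(complex_, rounds=3):
    """Sketch of an SWL-style colour refinement on a simplicial complex.

    Each simplex's colour is repeatedly re-hashed from its own colour plus
    the sorted multisets of colours of its faces and cofaces.  (The SWL test
    in the paper also uses lower/upper adjacencies; they are omitted here
    for brevity.)
    """
    simplices = [frozenset(s) for s in complex_]
    # Cofaces: simplices containing s with exactly one extra vertex.
    cofaces = {s: [t for t in simplices if s < t and len(t) == len(s) + 1]
               for s in simplices}
    # Initial colour: the simplex's dimension (an assumed, simple choice).
    colour = {s: len(s) - 1 for s in simplices}
    for _ in range(rounds):
        new = {}
        for s in simplices:
            face_cols = tuple(sorted(colour[f] for f in faces(s) if f in colour))
            cof_cols = tuple(sorted(colour[t] for t in cofaces[s]))
            new[s] = hash((colour[s], face_cols, cof_cols))
        colour = new
    return colour
```

Under this sketch, a filled triangle and a hollow triangle are already distinguished after one round: the edges of the filled triangle see the 2-simplex as a coface, while the hollow triangle's edges see none, so their colour multisets diverge.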
Related papers
- Higher-order Graph Convolutional Network with Flower-Petals Laplacians
on Simplicial Complexes [5.838832985156104]
We present a higher-order Flower-Petals (FP) model, incorporating FP Laplacians into simplicial complexes (SCs).
We also introduce a Higher-order Graph Convolutional Network (HiGCN) grounded in FP Laplacians, capable of discerning intrinsic features across varying topological scales.
HiGCN accomplishes state-of-the-art performance on a range of graph tasks and provides a scalable and flexible solution to explore higher-order interactions in graphs.
arXiv Detail & Related papers (2023-09-22T16:11:17Z) - Equivariant Hypergraph Neural Networks [15.096429849049953]
Recent approaches for hypergraph learning extend graph neural networks based on message passing, which is simple yet fundamentally limited in modeling long-range dependencies and expressive power.
We present Equivariant Hypergraph Neural Network (EHNN), the first attempt to realize maximally expressive equivariant layers for general hypergraph learning.
We demonstrate their capability in a range of hypergraph learning problems, including synthetic k-edge identification, semi-supervised classification, and visual keypoint matching, and report improved performances over strong message passing baselines.
arXiv Detail & Related papers (2022-08-22T16:28:38Z) - Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z) - Representation Power of Graph Neural Networks: Improved Expressivity via
Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z) - Deep Architecture Connectivity Matters for Its Convergence: A
Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z) - Weisfeiler and Lehman Go Cellular: CW Networks [3.0017241250121383]
Graph Neural Networks (GNNs) are limited in their expressive power, struggle with long-range interactions and lack a principled way to model higher-order structures.
We extend recent theoretical results on SCs to regular Cell Complexes, topological objects that flexibly subsume SCs and graphs.
We show that this generalisation provides a powerful set of "graph lifting" transformations, each leading to a unique message passing procedure.
arXiv Detail & Related papers (2021-06-23T17:59:16Z) - Breaking the Expressive Bottlenecks of Graph Neural Networks [26.000304641965947]
Recently, the Weisfeiler-Lehman (WL) graph isomorphism test was used to measure the expressiveness of graph neural networks (GNNs).
In this paper, we improve the expressiveness by exploring powerful aggregators.
arXiv Detail & Related papers (2020-12-14T02:36:46Z) - Counting Substructures with Higher-Order Graph Neural Networks:
Possibility and Impossibility Results [58.277290855841976]
We study tradeoffs of computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.