Weisfeiler and Lehman Go Cellular: CW Networks
- URL: http://arxiv.org/abs/2106.12575v1
- Date: Wed, 23 Jun 2021 17:59:16 GMT
- Title: Weisfeiler and Lehman Go Cellular: CW Networks
- Authors: Cristian Bodnar, Fabrizio Frasca, Nina Otter, Yu Guang Wang, Pietro
Liò, Guido Montúfar, Michael Bronstein
- Abstract summary: Graph Neural Networks (GNNs) are limited in their expressive power, struggle with long-range interactions and lack a principled way to model higher-order structures.
We extend recent theoretical results on Simplicial Complexes (SCs) to regular Cell Complexes, topological objects that flexibly subsume SCs and graphs.
We show that this generalisation provides a powerful set of "graph lifting" transformations, each leading to a unique message passing procedure.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) are limited in their expressive power, struggle
with long-range interactions and lack a principled way to model higher-order
structures. These problems can be attributed to the strong coupling between the
computational graph and the input graph structure. The recently proposed
Message Passing Simplicial Networks naturally decouple these elements by
performing message passing on the clique complex of the graph. Nevertheless,
these models are severely constrained by the rigid combinatorial structure of
Simplicial Complexes (SCs). In this work, we extend recent theoretical results
on SCs to regular Cell Complexes, topological objects that flexibly subsume SCs
and graphs. We show that this generalisation provides a powerful set of graph
"lifting" transformations, each leading to a unique hierarchical message
passing procedure. The resulting methods, which we collectively call CW
Networks (CWNs), are strictly more powerful than the WL test and, in certain
cases, not less powerful than the 3-WL test. In particular, we demonstrate the
effectiveness of one such scheme, based on rings, when applied to molecular
graph problems. The proposed architecture benefits from provably larger
expressivity than commonly used GNNs, principled modelling of higher-order
signals and from compressing the distances between nodes. We demonstrate that
our model achieves state-of-the-art results on a variety of molecular datasets.
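To make the expressivity gain concrete: a hexagon and two disjoint triangles receive identical colourings under the 1-WL test, but a ring-based lifting immediately separates them. The sketch below is our own minimal illustration, not the authors' implementation; it assumes networkx and substitutes a cycle basis for the induced (chordless) cycles the paper uses.

```python
import networkx as nx
from collections import Counter

def wl_histogram(G, rounds=3):
    """1-WL colour refinement: hash each node's colour together with the
    multiset of its neighbours' colours, then compare colour histograms.
    Identical histograms mean 1-WL cannot tell the graphs apart."""
    colours = {v: 0 for v in G}
    for _ in range(rounds):
        sigs = {v: (colours[v], tuple(sorted(colours[u] for u in G[v])))
                for v in G}
        table = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colours = {v: table[sigs[v]] for v in G}
    return Counter(colours.values())

def ring_lift(G, max_ring=6):
    """Ring-based lifting: keep vertices as 0-cells and edges as 1-cells,
    and attach each ring as a 2-cell whose boundary is its edge set.
    (A cycle basis stands in for the paper's induced cycles here.)"""
    rings = [tuple(c) for c in nx.cycle_basis(G) if len(c) <= max_ring]
    boundary = {r: [tuple(sorted((r[i], r[(i + 1) % len(r)])))
                    for i in range(len(r))] for r in rings}
    return rings, boundary

hexagon = nx.cycle_graph(6)
two_triangles = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))

# 1-WL assigns both graphs the same colour histogram...
print(wl_histogram(hexagon) == wl_histogram(two_triangles))  # True

# ...but the lifting exposes the difference: one 6-ring vs two 3-rings.
print([len(r) for r in ring_lift(hexagon)[0]])        # [6]
print([len(r) for r in ring_lift(two_triangles)[0]])  # [3, 3]
```

Hierarchical message passing on the lifted complex then lets each ring exchange messages with its boundary edges, which is also how the architecture compresses distances: two nodes on a common ring are at most two hops apart via the ring's 2-cell.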
Related papers
- A Practical, Progressively-Expressive GNN
Message passing neural networks (MPNNs) have become a dominant flavor of graph neural networks (GNNs) in recent years.
MPNNs come with notable limitations; they are at most as powerful as the 1-dimensional Weisfeiler-Leman (1-WL) test in distinguishing graphs in a graph isomorphism testing frame-work.
We propose the (k, c)(=)-SETWL hierarchy with greatly reduced complexity from k-WL, achieved by moving from k-tuples of nodes to sets with =k nodes defined over =c connected components in
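As a hedged illustration of the objects this hierarchy is built on, the brute-force sketch below (our own code, assuming networkx; the paper's contribution is a practical hierarchy over these sets, not this enumeration) lists node sets with ≤ k nodes whose induced subgraph has ≤ c connected components.

```python
from itertools import combinations
import networkx as nx

def kc_sets(G, k, c):
    """Enumerate the objects underlying (k, c)(<=)-SETWL: node sets of
    size <= k inducing subgraphs with <= c connected components."""
    for size in range(1, k + 1):
        for S in combinations(G.nodes, size):
            if nx.number_connected_components(G.subgraph(S)) <= c:
                yield S

G = nx.path_graph(4)  # 0-1-2-3
# With c = 1 only connected sets survive: singletons plus the 3 edges.
print(sorted(kc_sets(G, k=2, c=1)))
```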
arXiv Detail & Related papers (2022-10-18T01:27:21Z)
- Cell Attention Networks
We introduce Cell Attention Networks (CANs), a neural architecture operating on data defined over the vertices of a graph.
CANs exploit the lower and upper neighborhoods, as encoded in the cell complex, to design two independent masked self-attention mechanisms.
The experimental results show that CAN is a low-complexity strategy that compares favorably with state-of-the-art results on graph-based learning tasks.
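A hedged sketch of the two neighbourhoods those attention mechanisms mask over, taking edges (1-cells) as the example; the function and the ring-tuple format are our own illustrative choices, assuming networkx.

```python
import networkx as nx

def edge_neighbourhoods(G, rings):
    """Lower/upper adjacency of edges in a cell complex, the supports of
    CAN's two masked self-attention mechanisms. `rings` lists 2-cells as
    node tuples in cycle order (an illustrative format)."""
    edges = [tuple(sorted(e)) for e in G.edges]
    # Lower neighbours: edges sharing a boundary vertex (a 0-cell).
    lower = {e: [f for f in edges if f != e and set(e) & set(f)]
             for e in edges}
    # Upper neighbours: edges on a common ring (sharing a 2-cell co-face).
    ring_edges = [{tuple(sorted((r[i], r[(i + 1) % len(r)])))
                   for i in range(len(r))} for r in rings]
    upper = {e: sorted({f for re in ring_edges if e in re
                        for f in re if f != e})
             for e in edges}
    return lower, upper

G = nx.cycle_graph(3)  # a triangle, lifted with one 3-ring
lower, upper = edge_neighbourhoods(G, rings=[(0, 1, 2)])
print(lower[(0, 1)])  # [(0, 2), (1, 2)]: share a vertex with (0, 1)
print(upper[(0, 1)])  # [(0, 2), (1, 2)]: lie on the same ring
```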
arXiv Detail & Related papers (2022-09-16T21:57:39Z)
- Simple and Efficient Heterogeneous Graph Neural Network
Heterogeneous graph neural networks (HGNNs) have a powerful capability to embed the rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Reasoning Graph Networks for Kinship Verification: from Star-shaped to Hierarchical
We investigate the problem of facial kinship verification by learning hierarchical reasoning graph networks.
We develop a Star-shaped Reasoning Graph Network (S-RGN) and a Hierarchical Reasoning Graph Network (H-RGN) to exploit more powerful and flexible reasoning capacity.
arXiv Detail & Related papers (2021-09-06T03:16:56Z)
- Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks
We propose Message Passing Simplicial Networks (MPSNs), models that perform message passing on simplicial complexes (SCs).
We empirically support our theoretical claims by showing that MPSNs can distinguish challenging strongly regular graphs for which GNNs fail.
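For concreteness, a minimal sketch (our own, assuming networkx) of the clique complex MPSNs message-pass over, where every (d+1)-clique of the graph becomes a d-simplex:

```python
import networkx as nx

def clique_complex(G, max_dim=2):
    """Lift a graph to its clique (flag) complex up to dimension
    max_dim: vertices are 0-simplices, edges 1-simplices, triangles
    2-simplices, and so on. Illustrative layout, not the paper's API."""
    simplices = {d: set() for d in range(max_dim + 1)}
    for clique in nx.enumerate_all_cliques(G):  # yielded by increasing size
        d = len(clique) - 1
        if d > max_dim:
            break
        simplices[d].add(tuple(sorted(clique)))
    return simplices

G = nx.complete_graph(4)  # K4: 4 vertices, 6 edges, 4 triangles
print({d: len(s) for d, s in clique_complex(G).items()})  # {0: 4, 1: 6, 2: 4}
```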
arXiv Detail & Related papers (2021-03-04T18:35:24Z)
- Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results
We study tradeoffs between the computational cost and the expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of features for graph representation learning.
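A minimal sketch of the DE idea (our own illustrative code assuming networkx; the paper also admits more general measures, e.g. random-walk based ones, so shortest-path distance is just one instance): annotate each node with its distances to a target node set, breaking symmetries plain message passing cannot.

```python
import networkx as nx

def distance_encoding(G, target_set):
    """Attach, to every node, its shortest-path distance to each node in
    `target_set`. These extra features are one simple Distance Encoding."""
    feats = {v: [] for v in G}
    for t in sorted(target_set):
        d = nx.single_source_shortest_path_length(G, t)
        for v in G:
            feats[v].append(d.get(v, float("inf")))
    return feats

# On a 6-cycle every node looks the same to a vanilla GNN, yet nodes 1
# and 5 have different distance profiles to the target pair {0, 2}.
G = nx.cycle_graph(6)
feats = distance_encoding(G, {0, 2})
print(feats[1], feats[5])  # [1, 1] vs [1, 3]: the symmetry is broken
```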
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
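A hedged sketch of the substructure encoding behind GSN, hard-coding a single pattern (the triangle) for brevity; GSN itself counts arbitrary patterns via subgraph isomorphism (our own illustrative code, assuming networkx).

```python
import networkx as nx
from itertools import combinations

def triangle_counts(G):
    """Per-node substructure counts for one pattern: how many triangles
    each node lies on. GSN appends such counts to node (or edge)
    features before message passing."""
    counts = dict.fromkeys(G.nodes, 0)
    for u, v, w in combinations(G.nodes, 3):
        if G.has_edge(u, v) and G.has_edge(v, w) and G.has_edge(u, w):
            for x in (u, v, w):
                counts[x] += 1
    return counts

G = nx.complete_graph(4)   # every vertex lies on C(3, 2) = 3 triangles
print(triangle_counts(G))  # {0: 3, 1: 3, 2: 3, 3: 3}
```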
arXiv Detail & Related papers (2020-06-16T15:30:31Z)