Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results
- URL: http://arxiv.org/abs/2012.03174v1
- Date: Sun, 6 Dec 2020 03:42:54 GMT
- Title: Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results
- Authors: Behrooz Tahmasebi, Stefanie Jegelka
- Abstract summary: We study tradeoffs of computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
- Score: 58.277290855841976
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: While message passing based Graph Neural Networks (GNNs) have become
increasingly popular architectures for learning with graphs, recent works have
revealed important shortcomings in their expressive power. In response, several
higher-order GNNs have been proposed, which substantially increase the
expressive power, but at a large computational cost. Motivated by this gap, we
introduce and analyze a new recursive pooling technique of local neighborhoods
that allows different tradeoffs of computational cost and expressive power.
First, we show that this model can count subgraphs of size $k$, and thereby
overcomes a known limitation of low-order GNNs. Second, we prove that, in
several cases, the proposed algorithm can greatly reduce computational
complexity compared to the existing higher-order $k$-GNN and Local Relational
Pooling (LRP) networks. We also provide a (near) matching information-theoretic
lower bound for graph representations that can provably count subgraphs, and
discuss time complexity lower bounds as well.
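The efficiency argument rests on a simple observation: subgraphs of size $k$ can be counted by scanning local neighborhoods rather than enumerating all $k$-tuples of nodes. The sketch below illustrates this for the $k=3$ case (triangles); it is a minimal hand-rolled example of locality-based counting, not the paper's recursive pooling algorithm, and all names are illustrative.

```python
from itertools import combinations

def count_triangles_local(adj):
    """Count triangles by scanning each node's neighborhood.

    adj: dict mapping node -> set of neighbors (undirected graph).
    Cost is O(n * d^2) for maximum degree d, versus O(n^3) for
    enumerating all node triples -- the kind of locality-based
    saving the paper formalizes for general size-k subgraphs.
    """
    count = 0
    for v, nbrs in adj.items():
        for u, w in combinations(nbrs, 2):
            if w in adj[u]:  # edge u-w closes the triangle v-u-w
                count += 1
    return count // 3  # each triangle is counted once per vertex

# Toy example: a 4-cycle 0-1-2-3 with chord 0-2 has two triangles.
adj = {
    0: {1, 2, 3},
    1: {0, 2},
    2: {0, 1, 3},
    3: {0, 2},
}
# count_triangles_local(adj) -> 2
```

On sparse graphs the degree-bounded inner loop is far cheaper than enumerating node tuples, which is the gap the paper's complexity comparison against $k$-GNNs exploits.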
Related papers
- SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
arXiv Detail & Related papers (2024-04-21T13:11:59Z) - MAG-GNN: Reinforcement Learning Boosted Graph Neural Network [68.60884768323739]
A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity and achieved great success.
This expressivity, however, sacrifices efficiency, since all possible subgraphs must be enumerated.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem.
arXiv Detail & Related papers (2023-10-29T20:32:21Z) - Factor Graph Neural Networks [20.211455592922736]
Graph Neural Networks (GNNs) can learn powerful representations in an end-to-end fashion with great success in many real-world applications.
We propose Factor Graph Neural Networks (FGNNs) to effectively capture higher-order relations for inference and learning.
arXiv Detail & Related papers (2023-08-02T00:32:02Z) - A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training manner, named EnGCN, to address the existing issues.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z) - Decoupling the Depth and Scope of Graph Neural Networks [21.273445344654597]
State-of-the-art Graph Neural Networks (GNNs) have limited scalability with respect to the graph and model sizes.
We propose a design principle to decouple the depth and scope of GNNs.
Our design achieves significant accuracy improvement with orders of magnitude reduction in computation and hardware cost.
arXiv Detail & Related papers (2022-01-19T20:52:42Z) - Weisfeiler and Lehman Go Cellular: CW Networks [3.0017241250121383]
Graph Neural Networks (GNNs) are limited in their expressive power, struggle with long-range interactions and lack a principled way to model higher-order structures.
We extend recent theoretical results on SCs to regular Cell Complexes, topological objects that flexibly subsume SCs and graphs.
We show that this generalisation provides a powerful set of "graph lifting" transformations, each leading to a unique message passing procedure.
arXiv Detail & Related papers (2021-06-23T17:59:16Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
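Substructure encoding of the kind GSN describes amounts to precomputing, for each node, how many copies of a chosen pattern it participates in, and feeding those counts in as structural features. The sketch below shows this for the triangle pattern; it is a hedged illustration of the general idea, not the GSN implementation.

```python
from itertools import combinations

def node_triangle_counts(adj):
    """Per-node triangle participation counts, usable as structural
    node features in the spirit of substructure encoding.

    adj: dict mapping node -> set of neighbors (undirected graph).
    """
    counts = {v: 0 for v in adj}
    for v, nbrs in adj.items():
        for u, w in combinations(nbrs, 2):
            if w in adj[u]:  # v, u, w form a triangle
                counts[v] += 1
    return counts

# Toy graph: triangle {0, 1, 2} with a pendant node 3 attached to 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
# node_triangle_counts(adj) -> {0: 1, 1: 1, 2: 1, 3: 0}
```

Concatenating such counts onto the initial node features lets an ordinary message passing network distinguish graphs that the plain Weisfeiler-Leman test cannot, which is the source of the expressivity gain claimed above.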
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.