Ordered Subgraph Aggregation Networks
- URL: http://arxiv.org/abs/2206.11168v1
- Date: Wed, 22 Jun 2022 15:19:34 GMT
- Title: Ordered Subgraph Aggregation Networks
- Authors: Chendi Qian, Gaurav Rattan, Floris Geerts, Christopher Morris, Mathias Niepert
- Abstract summary: Subgraph-enhanced graph neural networks (GNNs) have emerged, provably boosting the expressive power of standard (message-passing) GNNs.
Here, we introduce a theoretical framework and extend the known expressivity results of subgraph-enhanced GNNs.
We show that increasing subgraph size always increases the expressive power and develop a better understanding of their limitations.
- Score: 19.18478955240166
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Numerous subgraph-enhanced graph neural networks (GNNs) have emerged
recently, provably boosting the expressive power of standard (message-passing)
GNNs. However, there is a limited understanding of how these approaches relate
to each other and to the Weisfeiler--Leman hierarchy. Moreover, current
approaches either use all subgraphs of a given size, sample them uniformly at
random, or use hand-crafted heuristics instead of learning to select subgraphs
in a data-driven manner. Here, we offer a unified way to study such
architectures by introducing a theoretical framework and extending the known
expressivity results of subgraph-enhanced GNNs. Concretely, we show that
increasing subgraph size always increases the expressive power and develop a
better understanding of their limitations by relating them to the established
$k\text{-}\mathsf{WL}$ hierarchy. In addition, we explore different approaches
for learning to sample subgraphs using recent methods for backpropagating
through complex discrete probability distributions. Empirically, we study the
predictive performance of different subgraph-enhanced GNNs, showing that our
data-driven architectures increase prediction accuracy on standard benchmark
datasets compared to non-data-driven subgraph-enhanced graph neural networks
while reducing computation time.
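To make the data-driven sampling idea concrete, here is a minimal sketch (in PyTorch, with all names hypothetical) of a bag-of-subgraphs GNN whose node-deletion subgraphs are chosen by a learned scorer. It uses Gumbel-perturbed top-$k$ selection with a straight-through estimator, which is one of several ways to backpropagate through a discrete subgraph distribution; this is an illustration of the general recipe, not the paper's implementation.
```python
import torch

def sample_subgraph_mask(scores, k, tau=1.0, hard=True):
    """Differentiably select ~k nodes from learned per-node scores.

    Hypothetical helper: Gumbel-perturb the scores, relax with a
    softmax, and (optionally) return a hard top-k mask whose gradient
    flows through the soft relaxation (straight-through estimator).
    """
    gumbel = -torch.log(-torch.log(torch.rand_like(scores)))
    soft = torch.softmax((scores + gumbel) / tau, dim=-1)
    if not hard:
        return soft
    topk = torch.topk(soft, k, dim=-1).indices
    hard_mask = torch.zeros_like(soft).scatter_(-1, topk, 1.0)
    return hard_mask + soft - soft.detach()  # hard forward, soft backward

class SampledSubgraphGNN(torch.nn.Module):
    """Minimal bag-of-subgraphs GNN with a learned node-deletion sampler."""

    def __init__(self, dim, num_subgraphs=8, k=3):
        super().__init__()
        self.score = torch.nn.Linear(dim, 1)   # per-node deletion scores
        self.conv = torch.nn.Linear(dim, dim)  # one message-passing layer
        self.m, self.k = num_subgraphs, k

    def forward(self, A, X):
        # A: dense (n, n) adjacency, X: (n, dim) node features.
        scores = self.score(X).squeeze(-1)
        outs = []
        for _ in range(self.m):
            drop = sample_subgraph_mask(scores, self.k)  # nodes to delete
            keep = 1.0 - drop
            A_sub = A * keep[:, None] * keep[None, :]    # masked adjacency
            H = torch.relu(A_sub @ self.conv(X * keep[:, None]))
            outs.append(H.mean(dim=0))                   # subgraph readout
        return torch.stack(outs).mean(dim=0)             # aggregate the bag
```
Sampling a small bag of node-deleted subgraphs instead of enumerating all of them is what yields the reported savings in computation time; the learned scorer replaces uniform sampling or hand-crafted heuristics.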
Related papers
- Enhanced Expressivity in Graph Neural Networks with Lanczos-Based Linear Constraints [7.605749412696919]
Graph Neural Networks (GNNs) excel in handling graph-structured data but often underperform in link prediction tasks.
We present a novel method to enhance the expressivity of GNNs by embedding induced subgraphs into the graph Laplacian matrix's eigenbasis.
Our method achieves 20x and 10x speedups while requiring only 5% and 10% of the data from the PubMed and OGBL-Vessel datasets, respectively.
arXiv Detail & Related papers (2024-08-22T12:22:00Z)
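To make the Lanczos connection above concrete, the sketch below builds a truncated Laplacian eigenbasis with SciPy's Lanczos-based eigsh solver and projects a subgraph's indicator vector onto it. The embed_subgraph helper is a hypothetical simplification, shown only to illustrate embedding induced subgraphs into the eigenbasis; the paper's actual construction may differ.
```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh  # Lanczos-based sparse eigensolver

def laplacian_eigenbasis(A, k):
    """Smallest-k eigenpairs of the combinatorial Laplacian L = D - A.

    eigsh runs (implicitly restarted) Lanczos iteration, so only k
    eigenvectors are ever formed; this is the efficiency argument
    behind Lanczos-based spectral methods.
    """
    deg = np.asarray(A.sum(axis=1)).ravel()
    L = sp.diags(deg) - A
    vals, vecs = eigsh(L, k=k, which="SM")  # smallest eigenvalues of PSD L
    return vals, vecs

def embed_subgraph(nodes, vecs):
    """Hypothetical subgraph embedding: spectral coordinates of the
    subgraph's indicator vector in the truncated eigenbasis."""
    ind = np.zeros(vecs.shape[0])
    ind[list(nodes)] = 1.0
    return vecs.T @ ind  # shape (k,)
```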
- A Flexible, Equivariant Framework for Subgraph GNNs via Graph Products and Graph Coarsening [18.688057947275112]
Subgraph Graph Neural Networks (Subgraph GNNs) enhance the expressivity of message-passing GNNs by representing graphs as sets of subgraphs.
Previous approaches suggested processing only subsets of subgraphs, selected either randomly or via learnable sampling.
This paper introduces a new Subgraph GNN framework to address these issues.
arXiv Detail & Related papers (2024-06-13T16:29:06Z)
- Improving the interpretability of GNN predictions through conformal-based graph sparsification [9.550589670316523]
Graph Neural Networks (GNNs) have achieved state-of-the-art performance in solving graph classification tasks.
We propose a GNN training approach that finds the most predictive subgraph by removing edges and/or nodes.
We rely on reinforcement learning to solve the resulting bi-level optimization with a reward function based on conformal predictions.
arXiv Detail & Related papers (2024-04-18T17:34:47Z)
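The conformal component of the entry above can be sketched with standard split conformal prediction: calibrate a nonconformity threshold on held-out data, form prediction sets, and reward subgraphs that keep those sets small while still covering the true label. The sparsity_reward shape is a hypothetical stand-in; the paper's exact reward is not reproduced here.
```python
import numpy as np

def conformal_quantile(cal_probs, cal_labels, alpha=0.1):
    """Split conformal calibration with nonconformity 1 - p(true class)."""
    n = len(cal_labels)
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, q_level)

def prediction_set(probs, q_hat):
    """All labels whose nonconformity stays below the threshold."""
    return np.where(1.0 - probs <= q_hat)[0]

def sparsity_reward(probs, true_label, q_hat, num_classes):
    """Hypothetical reward: cover the truth, keep the set small."""
    pset = prediction_set(probs, q_hat)
    return float(true_label in pset) - len(pset) / num_classes
```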
- MAG-GNN: Reinforcement Learning Boosted Graph Neural Network [68.60884768323739]
A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity and achieved great success.
This effectiveness, however, comes at the cost of efficiency, since all possible subgraphs must be enumerated.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem.
arXiv Detail & Related papers (2023-10-29T20:32:21Z)
- From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z)
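A minimal instance of the energy-minimization view in the entry above, assuming a quadratic, Laplacian-regularized energy as an illustrative stand-in for the paper's parameterized family: gradient descent on $E(Y) = \|Y - X\|_F^2 + \lambda \, \mathrm{tr}(Y^\top L Y)$ produces smoothed node embeddings, and the minimizer has the closed form $(I + \lambda L)^{-1} X$.
```python
import numpy as np

def energy_embeddings(L, X, lam=1.0, steps=200, lr=0.05):
    """Node embeddings as minimizers of a graph-regularized energy.

    Illustrative energy (not the paper's exact family):
        E(Y) = ||Y - X||_F^2 + lam * tr(Y^T L Y)
    where L is a (hyper)graph Laplacian, e.g. from a clique expansion.
    The step size must satisfy lr < 1 / (1 + lam * lambda_max(L)) for
    the descent to converge.
    """
    Y = X.copy()
    for _ in range(steps):
        grad = 2.0 * (Y - X) + 2.0 * lam * (L @ Y)
        Y -= lr * grad
    return Y
```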
- Combining Stochastic Explainers and Subgraph Neural Networks can Increase Expressivity and Interpretability [12.526174412246107]
Subgraph-enhanced graph neural networks (SGNNs) can increase the power of the standard message-passing framework.
We introduce a novel framework that jointly predicts the class of the graph and a set of explanatory sparse subgraphs.
arXiv Detail & Related papers (2023-04-14T14:21:20Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN) that explore the implicit semantics by learning latent semantic paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results [58.277290855841976]
We study the trade-offs between computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$ and thereby overcome a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
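The fixed-scale core that LEGS makes learnable can be sketched as dyadic diffusion wavelets over a lazy random walk: $\Psi_k = P^{2^{k-1}} - P^{2^k}$. The sketch below computes the scattering features $|\Psi_k X|$ with fixed dyadic scales; LEGS replaces these fixed scales with learnable selections, which is omitted here.
```python
import numpy as np

def diffusion_wavelet_features(A, X, num_scales=4):
    """Fixed-scale geometric scattering features (the non-learned core).

    P = (I + A D^{-1}) / 2 is the lazy random-walk operator; the dyadic
    wavelet Psi_k = P^(2^(k-1)) - P^(2^k) captures band-pass structure
    at scale 2^k, and the absolute value is the scattering nonlinearity.
    """
    n = A.shape[0]
    deg = A.sum(axis=0).clip(min=1.0)      # column sums = degrees
    P = 0.5 * (np.eye(n) + A / deg[None, :])
    feats = []
    P_pow = P.copy()                       # holds P^(2^(k-1))
    prev = P_pow @ X                       # P^1 X
    for _ in range(num_scales):
        P_pow = P_pow @ P_pow              # square: P^(2^(k-1)) -> P^(2^k)
        curr = P_pow @ X
        feats.append(np.abs(prev - curr))  # |Psi_k X|
        prev = curr
    return np.concatenate(feats, axis=1)   # (n, num_scales * d)
```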
- Subgraph Neural Networks [14.222887950206662]
We introduce SubGNN, a subgraph neural network to learn disentangled subgraph representations.
SubGNN performs exceptionally well on challenging biomedical datasets.
arXiv Detail & Related papers (2020-06-18T13:54:30Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
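A minimal instance of GSN-style substructure encoding, using triangles as the simplest substructure (GSN supports counts of general small subgraphs): the diagonal of $A^3$ counts closed walks of length 3, and each triangle through a node contributes two such walks.
```python
import numpy as np

def add_triangle_counts(A, X):
    """Augment node features with per-node triangle counts.

    For a simple undirected graph, diag(A^3)[i] equals twice the
    number of triangles containing node i (one closed 3-walk per
    orientation). GSN feeds such substructure counts into message
    passing, which is what makes it strictly more expressive than
    the Weisfeiler-Leman test.
    """
    tri = np.diag(A @ A @ A) / 2.0
    return np.concatenate([X, tri[:, None]], axis=1)
```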