Understanding and Extending Subgraph GNNs by Rethinking Their Symmetries
- URL: http://arxiv.org/abs/2206.11140v1
- Date: Wed, 22 Jun 2022 14:35:47 GMT
- Title: Understanding and Extending Subgraph GNNs by Rethinking Their Symmetries
- Authors: Fabrizio Frasca, Beatrice Bevilacqua, Michael M. Bronstein, Haggai
Maron
- Abstract summary: Subgraph GNNs are a recent class of expressive Graph Neural Networks (GNNs) which model graphs as collections of subgraphs.
We study the most prominent form of subgraph methods, which employs node-based subgraph selection policies.
We propose a general family of message-passing layers for subgraph methods that generalises all previous node-based Subgraph GNNs.
- Score: 33.07812045457703
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Subgraph GNNs are a recent class of expressive Graph Neural Networks (GNNs)
which model graphs as collections of subgraphs. So far, the design space of
possible Subgraph GNN architectures as well as their basic theoretical
properties are still largely unexplored. In this paper, we study the most
prominent form of subgraph methods, which employs node-based subgraph selection
policies such as ego-networks or node marking and deletion. We address two
central questions: (1) What is the upper bound of the expressive power of these
methods? and (2) What is the family of equivariant message-passing layers on
these sets of subgraphs? Our first step in answering these questions is a
novel symmetry analysis which shows that modelling the symmetries of node-based
subgraph collections requires a significantly smaller symmetry group than the
one adopted in previous works. This analysis is then used to establish a link
between Subgraph GNNs and Invariant Graph Networks (IGNs). We answer the
questions above by first bounding the expressive power of subgraph methods by
3-WL, and then proposing a general family of message-passing layers for
subgraph methods that generalises all previous node-based Subgraph GNNs.
Finally, we design a novel Subgraph GNN dubbed SUN, which theoretically unifies
previous architectures while providing better empirical performance on multiple
benchmarks.
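To make the selection policies concrete, here is a minimal Python sketch (illustrative only, not the authors' code) of the three node-based policies named in the abstract: ego-networks, node marking, and node deletion. Each policy maps a graph to a bag of subgraphs, one per node; the function names and the use of networkx are assumptions made for this example.

```python
# Hedged sketch of node-based subgraph selection policies (not the paper's implementation).
import networkx as nx


def ego_network_policy(G, radius=1):
    """One subgraph per node: its k-hop ego-network."""
    return [nx.ego_graph(G, v, radius=radius) for v in G.nodes]


def node_marking_policy(G):
    """One copy of G per node, with that node distinguished by a binary 'mark' feature."""
    bag = []
    for v in G.nodes:
        H = G.copy()
        for u in H.nodes:
            H.nodes[u]["mark"] = 1 if u == v else 0
        bag.append(H)
    return bag


def node_deletion_policy(G):
    """One subgraph per node, obtained by deleting that node from G."""
    bag = []
    for v in G.nodes:
        H = G.copy()
        H.remove_node(v)
        bag.append(H)
    return bag


if __name__ == "__main__":
    G = nx.cycle_graph(5)
    # Every node-based policy yields a bag with one subgraph per node.
    print(len(ego_network_policy(G)), len(node_marking_policy(G)), len(node_deletion_policy(G)))
```

The equivariant layers studied in the paper operate on such bags, where subgraph i is associated with node i of the original graph. The second sketch below, again hedged and heavily simplified, shows the kinds of aggregations such a layer can combine: neighbours within each subgraph, the same node across subgraphs, each subgraph's root node, and a per-subgraph readout. The name sun_like_layer and the dictionary of MLPs are assumptions for illustration; the exact parameterisation of the paper's layer family and of SUN differs.

```python
# Hedged numpy sketch of a generalised message-passing update on a bag of
# node-based subgraphs (a simplification, not the paper's SUN layer).
import numpy as np


def sun_like_layer(X, A, mlps):
    """X: (n, n, d), X[i, v] = features of node v in subgraph i (one subgraph per node).
    A: (n, n, n), A[i] = adjacency matrix of subgraph i.
    mlps: dict of callables mapping arrays of shape (..., d) to (..., d)."""
    n = X.shape[0]
    local = np.einsum("iuv,ivd->iud", A, X)        # neighbours within each subgraph
    across = X.mean(axis=0, keepdims=True)         # same node aggregated across subgraphs
    root = X[np.arange(n), np.arange(n)]           # root node of each subgraph
    pooled = X.mean(axis=1, keepdims=True)         # readout over each subgraph
    return mlps["out"](
        mlps["self"](X)
        + mlps["local"](local)
        + mlps["across"](np.broadcast_to(across, X.shape))
        + mlps["root"](np.broadcast_to(root[:, None, :], X.shape))
        + mlps["pool"](np.broadcast_to(pooled, X.shape))
    )


if __name__ == "__main__":
    n, d = 4, 3
    rng = np.random.default_rng(0)
    X = rng.normal(size=(n, n, d))
    A = rng.integers(0, 2, size=(n, n, n)).astype(float)  # random stand-in adjacencies
    identity = lambda t: t
    mlps = {k: identity for k in ["self", "local", "across", "root", "pool", "out"]}
    print(sun_like_layer(X, A, mlps).shape)  # (4, 4, 3)
```

Because the same parameters are shared across all subgraphs, the update is equivariant to the joint permutation of nodes and subgraphs, which is the smaller symmetry group identified by the paper's analysis.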
Related papers
- Graph Neural Networks on Discriminative Graphs of Words [19.817473565906777]
In this work, we explore a new Discriminative Graph of Words Graph Neural Network (DGoW-GNN) approach to classify text.
We propose a new model for the graph-based classification of text, which combines a GNN and a sequence model.
We evaluate our approach on seven benchmark datasets and find that it is outperformed by several state-of-the-art baseline models.
arXiv Detail & Related papers (2024-10-27T15:14:06Z) - Global Graph Counterfactual Explanation: A Subgraph Mapping Approach [54.42907350881448]
Graph Neural Networks (GNNs) have been widely deployed in various real-world applications.
Counterfactual explanation aims to find minimum perturbations on input graphs that change the GNN predictions.
We propose GlobalGCE, a novel global-level graph counterfactual explanation method.
arXiv Detail & Related papers (2024-10-25T21:39:05Z) - Hyperedge Modeling in Hypergraph Neural Networks by using Densest Overlapping Subgraphs [0.0]
One of the most important problems in graph clustering is finding the densest overlapping subgraphs (DOS).
In this paper, we propose a solution to the DOS problem via the agglomerative DOSAGE algorithm, a novel approach to enhance the process of generating the densest overlapping subgraphs.
Experiments on standard benchmarks show that the DOSAGE algorithm significantly outperforms the HGNNs and six other methods on the node classification task.
arXiv Detail & Related papers (2024-09-16T14:56:10Z) - A Flexible, Equivariant Framework for Subgraph GNNs via Graph Products and Graph Coarsening [18.688057947275112]
Subgraph Graph Neural Networks (Subgraph GNNs) enhance the expressivity of message-passing GNNs by representing graphs as sets of subgraphs.
Previous approaches suggested processing only subsets of subgraphs, selected either randomly or via learnable sampling.
This paper introduces a new Subgraph GNNs framework to address these issues.
arXiv Detail & Related papers (2024-06-13T16:29:06Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization and existing GNN architectures in common use.
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Stochastic Subgraph Neighborhood Pooling for Subgraph Classification [2.1270496914042996]
Stochastic Subgraph Neighborhood Pooling (SSNP) jointly aggregates the subgraph and its neighborhood information without any computationally expensive operations such as labeling tricks.
Our experiments demonstrate that our models outperform current state-of-the-art methods (with a margin of up to 2%) while being up to 3X faster in training.
arXiv Detail & Related papers (2023-04-17T18:49:18Z) - Bring Your Own View: Graph Neural Networks for Link Prediction with
Personalized Subgraph Selection [57.34881616131377]
We introduce a Personalized Subgraph Selector (PS2) as a plug-and-play framework to automatically, personally, and inductively identify optimal subgraphs for different edges.
PS2 is instantiated as a bi-level optimization problem that can be solved efficiently in a differentiable manner.
We suggest a brand-new angle towards GNNLP training: first identify the optimal subgraphs for edges, and then focus on training the inference model using the sampled subgraphs.
arXiv Detail & Related papers (2022-12-23T17:30:19Z) - On Explainability of Graph Neural Networks via Subgraph Explorations [48.56936527708657]
We propose a novel method, known as SubgraphX, to explain graph neural networks (GNNs).
Our work represents the first attempt to explain GNNs via identifying subgraphs explicitly.
arXiv Detail & Related papers (2021-02-09T22:12:26Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structure-related features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)