Subgraph Neural Networks
- URL: http://arxiv.org/abs/2006.10538v3
- Date: Fri, 6 Nov 2020 20:56:21 GMT
- Title: Subgraph Neural Networks
- Authors: Emily Alsentzer, Samuel G. Finlayson, Michelle M. Li, Marinka Zitnik
- Abstract summary: We introduce SubGNN, a subgraph neural network to learn disentangled subgraph representations.
SubGNN performs exceptionally well on challenging biomedical datasets.
- Score: 14.222887950206662
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning methods for graphs achieve remarkable performance on many
node-level and graph-level prediction tasks. However, despite the proliferation
and success of these methods, prevailing Graph Neural Networks (GNNs) neglect
subgraphs, rendering subgraph prediction tasks challenging to tackle in
many impactful applications. Further, subgraph prediction tasks present several
unique challenges: subgraphs can have non-trivial internal topology, but also
carry a notion of position and external connectivity information relative to
the underlying graph in which they exist. Here, we introduce SubGNN, a subgraph
neural network to learn disentangled subgraph representations. We propose a
novel subgraph routing mechanism that propagates neural messages between the
subgraph's components and randomly sampled anchor patches from the underlying
graph, yielding highly accurate subgraph representations. SubGNN specifies
three channels, each designed to capture a distinct aspect of subgraph
topology, and we provide empirical evidence that the channels encode their
intended properties. We design a series of new synthetic and real-world
subgraph datasets. Empirical results for subgraph classification on eight
datasets show that SubGNN achieves considerable performance gains,
outperforming strong baseline methods, including node-level and graph-level
GNNs, by 19.8% over the strongest baseline. SubGNN performs exceptionally well
on challenging biomedical datasets where subgraphs have complex topology and
even comprise multiple disconnected components.
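To make the routing idea concrete, below is a minimal sketch of anchor-patch message passing in PyTorch/NetworkX. All names are hypothetical and this is not the authors' implementation: SubGNN actually runs three separate channels (position, neighborhood, structure), each with its own anchor sampling and similarity function, while this sketch collapses them into a single similarity-weighted propagation step.
```python
# Minimal sketch of anchor-patch message passing (hypothetical names, not the
# authors' implementation). Messages flow between one subgraph component and
# randomly sampled anchor patches from the underlying graph.
import torch
import networkx as nx

def sample_anchor_patches(g, num_patches, patch_size):
    """Sample small connected node sets ("anchor patches") from g."""
    nodes = list(g.nodes)
    patches = []
    for _ in range(num_patches):
        start = nodes[torch.randint(len(nodes), (1,)).item()]
        patches.append(list(nx.bfs_tree(g, start))[:patch_size])
    return patches

def anchor_message_passing(node_emb, component, patches):
    """Update one subgraph component from its anchor patches.

    node_emb:  (num_nodes, d) embeddings of the underlying graph's nodes
    component: node ids of one connected component of the subgraph
    patches:   anchor patches, each pooled into a single message
    """
    comp_emb = node_emb[component].mean(dim=0)        # pool the component
    msgs = torch.stack([node_emb[p].mean(dim=0) for p in patches])
    weights = torch.softmax(msgs @ comp_emb, dim=0)   # patch relevance
    return comp_emb + weights @ msgs                  # propagated update

g = nx.karate_club_graph()
emb = torch.randn(g.number_of_nodes(), 16)
patches = sample_anchor_patches(g, num_patches=4, patch_size=5)
rep = anchor_message_passing(emb, component=[0, 1, 2], patches=patches)
```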
Related papers
- A Flexible, Equivariant Framework for Subgraph GNNs via Graph Products and Graph Coarsening [18.688057947275112]
Subgraph Graph Neural Networks (Subgraph GNNs) enhance the expressivity of message-passing GNNs by representing graphs as sets of subgraphs.
Previous approaches suggested processing only subsets of subgraphs, selected either randomly or via learnable sampling.
This paper introduces a new Subgraph GNN framework to address these issues.
arXiv Detail & Related papers (2024-06-13T16:29:06Z)
- SPGNN: Recognizing Salient Subgraph Patterns via Enhanced Graph Convolution and Pooling [25.555741218526464]
Graph neural networks (GNNs) have revolutionized the field of machine learning on non-Euclidean data such as graphs and networks.
We propose a concatenation-based graph convolution mechanism that injectively updates node representations.
We also design a novel graph pooling module, called WL-SortPool, to learn important subgraph patterns in a deep-learning manner.
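As a hedged sketch of what a concatenation-based, injectivity-preserving update can look like (the layer below is illustrative, not SPGNN's exact formulation): keeping a node's own state and the summed neighbor states in separate slots of a concatenation, rather than adding them, lets the MLP distinguish neighborhoods that a summed update would conflate.
```python
# Illustrative concatenation-based graph convolution (hypothetical layer, not
# SPGNN's exact formulation). Self and neighbor information stay separable
# before the MLP, which supports an injective node update.
import torch
import torch.nn as nn

class ConcatConv(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(),
                                 nn.Linear(dim, dim))

    def forward(self, x, adj):
        # x: (num_nodes, dim) features; adj: (num_nodes, num_nodes) 0/1 matrix
        neigh = adj @ x                        # sum-aggregate neighbor features
        return self.mlp(torch.cat([x, neigh], dim=-1))

x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.T) > 0).float().fill_diagonal_(0.0)  # undirected, no loops
out = ConcatConv(8)(x, adj)                    # (5, 8) updated node states
```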
arXiv Detail & Related papers (2024-04-21T13:11:59Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
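The summary only names the design, so the following is a loose sketch of one way to decouple the two views (hypothetical; DGNN's actual architecture may differ): separate encoders for attributes and structure, tied together by a consistency penalty.
```python
# Hedged sketch of a decoupled attribute/structure encoder (hypothetical; not
# DGNN's actual architecture). One encoder embeds node attributes, another
# embeds adjacency rows, and a consistency term keeps the two views aligned.
import torch
import torch.nn as nn

class DecoupledEncoder(nn.Module):
    def __init__(self, feat_dim, num_nodes, dim):
        super().__init__()
        self.attr_enc = nn.Linear(feat_dim, dim)     # attribute view
        self.struct_enc = nn.Linear(num_nodes, dim)  # structure view (adj rows)

    def forward(self, x, adj):
        z_attr, z_struct = self.attr_enc(x), self.struct_enc(adj)
        consistency = ((z_attr - z_struct) ** 2).mean()  # structural consistency
        return torch.cat([z_attr, z_struct], dim=-1), consistency

x, adj = torch.randn(5, 8), (torch.rand(5, 5) > 0.5).float()
z, consistency_loss = DecoupledEncoder(8, 5, 16)(x, adj)
```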
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- MAG-GNN: Reinforcement Learning Boosted Graph Neural Network [68.60884768323739]
A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity, and achieved great success.
This effectiveness, however, comes at the cost of efficiency, since these methods enumerate all possible subgraphs.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem.
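As a rough illustration of the selection idea (hypothetical and much simplified relative to the paper's RL formulation), a policy network can score candidate subgraph roots and sample only k of them instead of enumerating every rooted subgraph:
```python
# Sketch of learned subgraph selection (hypothetical, simplified relative to
# MAG-GNN's actual RL formulation): a policy scores candidate roots and only a
# small sampled set of rooted subgraphs is processed by the base GNN.
import torch
import torch.nn as nn

class SubgraphPolicy(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.scorer = nn.Linear(dim, 1)       # one score per candidate root

    def forward(self, node_emb, k):
        logits = self.scorer(node_emb).squeeze(-1)
        dist = torch.distributions.Categorical(logits=logits)
        roots = dist.sample((k,))             # k sampled roots
        return roots, dist.log_prob(roots).sum()  # log-prob for REINFORCE

policy = SubgraphPolicy(16)
roots, log_prob = policy(torch.randn(10, 16), k=3)
# Downstream (not shown): run a base GNN on the k root-marked graph copies and
# feed the task reward back through log_prob as a policy gradient.
```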
arXiv Detail & Related papers (2023-10-29T20:32:21Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
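The decomposition idea is easiest to see for purely linear propagation, where contributions split exactly; the sketch below illustrates only that special case (DEGREE itself also handles nonlinear layers):
```python
# Sketch of decomposition-based attribution, shown only for a purely linear
# propagation. Splitting the input as x = x_S + x_rest and propagating both
# parts separately tracks how much of the output node set S generates; the
# split is exact because every operation here is linear.
import torch

def linear_propagate(x, adj, steps=2):
    for _ in range(steps):
        x = adj @ x                     # one linear message-passing step
    return x.sum()                      # scalar "prediction" for the sketch

n, d = 6, 4
x = torch.randn(n, d)
adj = torch.rand(n, n)
S = [0, 1]                              # target component to explain
mask = torch.zeros(n, 1)
mask[S] = 1.0
contrib_S = linear_propagate(x * mask, adj)          # part generated by S
contrib_rest = linear_propagate(x * (1 - mask), adj) # everything else
total = linear_propagate(x, adj)
assert torch.allclose(contrib_S + contrib_rest, total, atol=1e-4)
```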
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Stochastic Subgraph Neighborhood Pooling for Subgraph Classification [2.1270496914042996]
Subgraph Neighborhood Pooling (SSNP) jointly aggregates the subgraph and its neighborhood information without any computationally expensive operations such as labeling tricks.
Our experiments demonstrate that our models outperform current state-of-the-art methods (with a margin of up to 2%) while being up to 3X faster in training.
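A hedged sketch of the pooling idea (hypothetical names; the paper's exact aggregation may differ): the readout pools the subgraph's own nodes together with a random sample of their one-hop neighborhood, with no labeling trick anywhere.
```python
# Sketch of subgraph-plus-neighborhood pooling in the spirit of SSNP
# (hypothetical names). The subgraph representation jointly pools its own
# nodes and a sampled slice of their one-hop context.
import random
import networkx as nx
import torch

def ssnp_pool(g, node_emb, subgraph_nodes, sample_size=5):
    neighborhood = {nb for v in subgraph_nodes for nb in g.neighbors(v)}
    neighborhood -= set(subgraph_nodes)
    sampled = random.sample(sorted(neighborhood),
                            min(sample_size, len(neighborhood)))
    pooled_nodes = list(subgraph_nodes) + sampled
    return node_emb[pooled_nodes].mean(dim=0)  # joint subgraph+context readout

g = nx.karate_club_graph()
emb = torch.randn(g.number_of_nodes(), 16)
rep = ssnp_pool(g, emb, subgraph_nodes=[0, 1, 2])
```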
arXiv Detail & Related papers (2023-04-17T18:49:18Z)
- Position-Aware Subgraph Neural Networks with Data-Efficient Learning [15.58680146160525]
We propose a Position-Aware Data-Efficient Learning framework for subgraph neural networks called PADEL.
Specifically, we propose a novel node position encoding method that is anchor-free, and design a new generative subgraph augmentation method based on a diffused variational subgraph autoencoder.
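The abstract does not spell out the encoding, so the sketch below uses one standard anchor-free choice, k-step random-walk return probabilities, purely as an illustration (PADEL's actual encoding may differ):
```python
# Sketch of an anchor-free positional encoding (illustrative only): each
# node's return probabilities under a k-step random walk describe its position
# in the graph without reference to any anchor nodes.
import numpy as np
import networkx as nx

def random_walk_pe(g, k):
    a = nx.to_numpy_array(g)
    p = a / a.sum(axis=1, keepdims=True)      # row-stochastic transition matrix
    pe, walk = [], np.eye(len(a))
    for _ in range(k):
        walk = walk @ p
        pe.append(np.diag(walk))              # prob. of returning to the start
    return np.stack(pe, axis=1)               # (num_nodes, k) positional feats

g = nx.karate_club_graph()
print(random_walk_pe(g, k=4).shape)           # (34, 4)
```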
arXiv Detail & Related papers (2022-11-01T16:34:42Z)
- Ordered Subgraph Aggregation Networks [19.18478955240166]
Subgraph-enhanced graph neural networks (GNNs) have emerged, provably boosting the expressive power of standard (message-passing) GNNs.
Here, we introduce a theoretical framework and extend the known expressivity results of subgraph-enhanced GNNs.
We show that increasing subgraph size always increases the expressive power and develop a better understanding of their limitations.
arXiv Detail & Related papers (2022-06-22T15:19:34Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
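The two constructions are simple to state; a small sketch for point-cloud data such as atom coordinates: a KNN graph keeps only short-range edges, while an FC graph links every pair of points.
```python
# Sketch of the two graph-construction choices compared in the paper.
import numpy as np

def knn_graph(points, k):
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # exclude self-distances
    adj = np.zeros_like(d)
    nearest = np.argsort(d, axis=1)[:, :k]
    for i, row in enumerate(nearest):
        adj[i, row] = 1.0                     # directed KNN edges
    return np.maximum(adj, adj.T)             # symmetrize

def fc_graph(points):
    n = len(points)
    return np.ones((n, n)) - np.eye(n)        # every pair connected

pts = np.random.rand(10, 3)                   # e.g. atom coordinates
print(knn_graph(pts, k=3).sum(), fc_graph(pts).sum())
```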
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
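As a simplified illustration of the underlying scattering transform (LEGS learns the wavelet scales; here the scales are fixed dyadic): diffusion wavelets Psi_j = P^{2^{j-1}} - P^{2^j}, built from a lazy random-walk operator P, act as band-pass filters on node signals.
```python
# Sketch of fixed dyadic diffusion wavelets (the non-learnable core that LEGS
# relaxes): band-pass features |Psi_j x| from a lazy random-walk operator.
import numpy as np
import networkx as nx

def scattering_features(g, x, num_scales=3):
    a = nx.to_numpy_array(g)
    p = 0.5 * (np.eye(len(a)) + a / a.sum(axis=1, keepdims=True))  # lazy walk
    feats, prev = [], p @ x                    # P^{2^0} x
    for j in range(1, num_scales + 1):
        nxt = np.linalg.matrix_power(p, 2 ** j) @ x   # P^{2^j} x
        feats.append(np.abs(prev - nxt))       # |Psi_j x|, a band-pass response
        prev = nxt
    return np.stack(feats, axis=-1)            # (nodes, signal_dim, scales)

g = nx.karate_club_graph()
x = np.random.rand(g.number_of_nodes(), 2)
print(scattering_features(g, x).shape)         # (34, 2, 3)
```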
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
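The substructure-encoding idea can be illustrated by augmenting node features with subgraph counts before message passing; a minimal sketch using triangle counts (GSN supports more general substructures):
```python
# Sketch of substructure encoding in the spirit of GSN: per-node counts of a
# chosen substructure (triangles here) are appended to the input features,
# giving message passing topological information that plain WL-style
# aggregation cannot recover.
import numpy as np
import networkx as nx

def add_triangle_counts(g, x):
    counts = nx.triangles(g)                  # triangles through each node
    c = np.array([counts[v] for v in g.nodes], dtype=float)[:, None]
    return np.concatenate([x, c], axis=1)     # structure-aware features

g = nx.karate_club_graph()
x = np.random.rand(g.number_of_nodes(), 4)
print(add_triangle_counts(g, x).shape)        # (34, 5)
```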
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.