Graph Neural Networks with Parallel Neighborhood Aggregations for Graph
Classification
- URL: http://arxiv.org/abs/2111.11482v1
- Date: Mon, 22 Nov 2021 19:19:40 GMT
- Title: Graph Neural Networks with Parallel Neighborhood Aggregations for Graph
Classification
- Authors: Siddhant Doshi and Sundeep Prabhakar Chepuri
- Abstract summary: We focus on graph classification using a graph neural network (GNN) model that precomputes the node features using a bank of neighborhood aggregation graph operators arranged in parallel.
These GNN models have a natural advantage of reduced training and inference time due to the precomputations.
We demonstrate via numerical experiments that the developed model achieves state-of-the-art performance on many diverse real-world datasets.
- Score: 14.112444998191698
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We focus on graph classification using a graph neural network (GNN) model
that precomputes the node features using a bank of neighborhood aggregation
graph operators arranged in parallel. These GNN models have a natural advantage
of reduced training and inference time due to the precomputations but are also
fundamentally different from popular GNN variants that update node features
through a sequential neighborhood aggregation procedure during training. We
provide theoretical conditions under which generic GNN models with parallel
neighborhood aggregations (PA-GNNs, in short) are provably as powerful as the
well-known Weisfeiler-Lehman (WL) graph isomorphism test in discriminating
non-isomorphic graphs. Although PA-GNN models do not have an apparent
relationship with the WL test, we show that the graph embeddings obtained from
these two methods are injectively related. We then propose a specialized PA-GNN
model, called SPIN, which obeys the developed conditions. We demonstrate via
numerical experiments that the developed model achieves state-of-the-art
performance on many diverse real-world datasets while maintaining the
discriminative power of the WL test and the computational advantage of
preprocessing graphs before the training process.
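To make the precompute-then-train idea concrete, here is a minimal sketch in which the operator bank is the set of powers of a symmetrically normalized adjacency matrix. The bank choice, hop count, and sum-pooling readout are illustrative assumptions of this sketch, not the SPIN architecture from the paper.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def precompute_parallel_features(A, X, num_hops=3):
    """Apply a bank of aggregation operators {S, S^2, ..., S^K} in parallel.

    Each operator summarizes a different hop neighborhood; the outputs are
    concatenated rather than composed sequentially, so everything here can
    run once, before training starts.
    """
    S = normalized_adjacency(A)
    feats, H = [], X
    for _ in range(num_hops):
        H = S @ H          # one more hop of neighborhood aggregation
        feats.append(H)    # keep this hop as a parallel branch
    return np.concatenate(feats, axis=1)  # shape: n x (num_hops * d)

# Toy usage: a 4-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
Z = precompute_parallel_features(A, X)  # precomputed once, reused every epoch
graph_embedding = Z.sum(axis=0)         # simple sum-pooling readout
print(graph_embedding.shape)            # (6,) -> input to any downstream classifier
```

Because the expensive graph operations happen before training, each epoch only touches the precomputed features, which is the source of the training- and inference-time advantage claimed in the abstract.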
Related papers
- Higher-Order GNNs Meet Efficiency: Sparse Sobolev Graph Neural Networks [6.080095317098909]
Graph Neural Networks (GNNs) have shown great promise in modeling relationships between nodes in a graph.
Previous studies have primarily sought to exploit information from higher-order neighbors in the graph.
We make a fundamental observation: the regular power and the Hadamard (element-wise) power of the Laplacian matrix behave similarly in the spectrum (a toy check is sketched after this entry).
We propose a novel graph convolutional operator based on the sparse Sobolev norm of graph signals.
arXiv Detail & Related papers (2024-11-07T09:53:11Z)
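For the sparse Sobolev entry above, a toy numerical check of the spectral observation, assuming the combinatorial Laplacian; the graph and power are arbitrary, and the practical point is that the Hadamard power keeps the sparsity pattern of L while the regular matrix power fills in new entries.

```python
import numpy as np

# Toy check: compare the spectrum of the regular matrix power L^2 with the
# Hadamard (element-wise) power L*L of a graph Laplacian.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian

regular = np.linalg.matrix_power(L, 2)  # L @ L: generally denser
hadamard = L * L                        # element-wise square: as sparse as L

print(np.linalg.eigvalsh(regular))      # compare the two spectra side by side
print(np.linalg.eigvalsh(hadamard))     # to see the similar behavior
```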
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation: assessing how well a specific GNN model, trained on labeled and observed graphs, performs on unseen graphs without labels.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), which alleviates the learning bias via a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- Graph Neural Network-Inspired Kernels for Gaussian Processes in Semi-Supervised Learning [4.644263115284322]
Graph neural networks (GNNs) have recently emerged as a promising class of models for graph-structured data in semi-supervised learning.
We introduce the inductive bias of GNNs into Gaussian processes (GPs) to improve their predictive performance on graph-structured data.
We show that these graph-based kernels lead to competitive classification and regression performance, as well as advantages in computation time, compared with the respective GNNs (a generic graph-kernel GP is sketched after this entry).
arXiv Detail & Related papers (2023-02-12T01:07:56Z)
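For the GP entry above, a minimal sketch of giving a Gaussian process a graph-based prior: a generic diffusion kernel K = exp(-beta * L) over the nodes, used for semi-supervised regression. The kernel choice and hyperparameters are illustrative stand-ins, not the paper's GNN-inspired construction.

```python
import numpy as np
from scipy.linalg import expm

def diffusion_kernel(A, beta=0.5):
    """Generic graph diffusion kernel exp(-beta * L) used as a GP covariance."""
    L = np.diag(A.sum(axis=1)) - A
    return expm(-beta * L)

def gp_posterior_mean(K, train_idx, y_train, test_idx, noise=1e-2):
    """Standard GP regression posterior mean at the unlabeled nodes."""
    K_tt = K[np.ix_(train_idx, train_idx)] + noise * np.eye(len(train_idx))
    K_st = K[np.ix_(test_idx, train_idx)]
    return K_st @ np.linalg.solve(K_tt, y_train)

# Toy usage on a 4-node path graph: labels on the end nodes, predictions inside.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
K = diffusion_kernel(A)
print(gp_posterior_mean(K, train_idx=[0, 3], y_train=np.array([1.0, -1.0]),
                        test_idx=[1, 2]))
```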
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on domain-specific pair-wise similarity measurements (a minimal version is sketched after this entry).
In addition, we develop an adaptive node-level pre-training method that dynamically masks nodes so that they are distributed evenly across the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
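For the positive-sampling entry above, a minimal sketch of choosing contrastive positives directly from the training set by pairwise similarity. The degree-histogram signature used here is a cheap stand-in for whatever domain-specific similarity measurement the method actually employs.

```python
import numpy as np

def graph_signature(A, bins=5):
    """Cheap stand-in graph descriptor: a normalized degree histogram."""
    degrees = A.sum(axis=1)
    hist, _ = np.histogram(degrees, bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)

def top_k_positives(anchor_idx, signatures, k=2):
    """Pick the k most similar training graphs (excluding the anchor itself)."""
    sims = signatures @ signatures[anchor_idx]
    sims[anchor_idx] = -np.inf      # never pick the anchor as its own positive
    return np.argsort(-sims)[:k]

# Toy usage: signatures for a handful of random training graphs.
rng = np.random.default_rng(0)
graphs = []
for _ in range(5):
    M = rng.integers(0, 2, (6, 6))
    graphs.append(np.triu(M, 1) + np.triu(M, 1).T)   # random symmetric adjacency
signatures = np.stack([graph_signature(A) for A in graphs])
print(top_k_positives(0, signatures))  # indices of positives for graph 0
```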
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph-structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem (a one-step sketch follows this entry).
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
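For the denoising entry above, a one-step sketch of the connection: starting from F = X, a single gradient-descent step on the denoising objective min_F ||F - X||^2 + c * tr(F^T L F) yields a neighborhood-aggregation update of exactly the kind GNN layers perform. The step size and c are arbitrary illustrative values.

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A                    # graph Laplacian
X = np.random.default_rng(1).normal(size=(3, 2))  # noisy node signals

c, step = 1.0, 0.25
grad_at_X = 2 * c * (L @ X)   # gradient of the objective at F = X
                              # (the ||F - X||^2 term vanishes there)
F = X - step * grad_at_X      # = (I - 2*step*c*L) @ X: a smoothing aggregation
print(F)
```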
- The Surprising Power of Graph Neural Networks with Random Node Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman (WL) graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI), the simplest form of which is sketched after this entry.
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
arXiv Detail & Related papers (2020-10-02T19:53:05Z)
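For the RNI entry above, random node initialization in its simplest form: concatenate fresh random features to every node, so that structurally identical nodes become almost surely distinguishable. The extra dimension r is an arbitrary choice for this sketch.

```python
import numpy as np

def with_random_init(X, r=4, seed=None):
    """Append r random features to each node (redrawn for every forward pass)."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(X.shape[0], r))
    return np.concatenate([X, R], axis=1)

X = np.ones((5, 3))                 # five nodes a standard GNN cannot tell apart
print(with_random_init(X, seed=0))  # almost surely, no two rows coincide
```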
- Implicit Graph Neural Networks [46.0589136729616]
We propose a graph learning framework called Implicit Graph Neural Networks (IGNN).
IGNNs consistently capture long-range dependencies and outperform state-of-the-art GNN models (the fixed-point idea is sketched after this entry).
arXiv Detail & Related papers (2020-09-14T06:04:55Z)
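For the IGNN entry above, a minimal sketch of the implicit idea: node states are defined as the fixed point of an equilibrium equation rather than by stacking a fixed number of layers, and are computed by iterating until convergence. The shapes, the tanh nonlinearity, and the small-norm weights (which make the update a contraction) are assumptions of this sketch.

```python
import numpy as np

def ignn_fixed_point(W, A, B, X, tol=1e-6, max_iter=500):
    """Iterate Z <- tanh(W Z A + B X) to its fixed point (the node states)."""
    Z = np.zeros((W.shape[0], A.shape[0]))
    for _ in range(max_iter):
        Z_next = np.tanh(W @ Z @ A + B @ X)
        if np.max(np.abs(Z_next - Z)) < tol:
            break
        Z = Z_next
    return Z

rng = np.random.default_rng(2)
A = np.array([[0, 1], [1, 0]], dtype=float) / 2  # scaled 2-node adjacency
W = 0.1 * rng.normal(size=(3, 3))                # small norm -> contraction
B = rng.normal(size=(3, 2))                      # input projection
X = rng.normal(size=(2, 2))                      # raw node features (d x n)
print(ignn_fixed_point(W, A, B, X))              # equilibrium node states
```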
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network that learns binary node representations with binary network parameters (the quantization step is sketched after this entry).
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
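For the BGN entry above, a minimal sketch of the quantization at the heart of binarized embeddings: sign() maps real-valued node embeddings to {-1, +1} codes whose similarity reduces to cheap integer arithmetic. Training such a quantizer normally requires a straight-through-style gradient estimator, which is omitted here; only the forward computation is shown.

```python
import numpy as np

def binarize(H):
    """Quantize real-valued embeddings to {-1, +1} codes."""
    return np.where(H >= 0, 1, -1).astype(np.int8)

H = np.random.default_rng(3).normal(size=(4, 8))  # real-valued node embeddings
B = binarize(H)                                   # one byte per dimension here;
                                                  # one bit with packing
print(B @ B.T)  # dot products of binary codes stand in for similarity search
```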
This list was automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.