On Graph Neural Networks versus Graph-Augmented MLPs
- URL: http://arxiv.org/abs/2010.15116v2
- Date: Wed, 2 Dec 2020 16:46:23 GMT
- Title: On Graph Neural Networks versus Graph-Augmented MLPs
- Authors: Lei Chen, Zhengdao Chen, Joan Bruna
- Abstract summary: Graph-Augmented Multi-Layer Perceptrons (GA-MLPs) first augment node features with certain multi-hop operators on the graph and then apply an MLP in a node-wise fashion.
We prove a separation in expressive power between GA-MLPs and GNNs that grows exponentially in depth.
- Score: 51.23890789522705
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: From the perspective of expressive power, this work compares multi-layer
Graph Neural Networks (GNNs) with a simplified alternative that we call
Graph-Augmented Multi-Layer Perceptrons (GA-MLPs), which first augments node
features with certain multi-hop operators on the graph and then applies an MLP
in a node-wise fashion. From the perspective of graph isomorphism testing, we
show both theoretically and numerically that GA-MLPs with suitable operators
can distinguish almost all non-isomorphic graphs, just like the
Weisfeiler-Lehman (WL) test. However, by viewing them as node-level functions
and examining the equivalence classes they induce on rooted graphs, we prove a
separation in expressive power between GA-MLPs and GNNs that grows
exponentially in depth. In particular, unlike GNNs, GA-MLPs are unable to count
the number of attributed walks. We also demonstrate via community detection
experiments that GA-MLPs can be limited by their choice of operator family, as
compared to GNNs with higher flexibility in learning.
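The GA-MLP recipe described in the abstract is simple enough to sketch directly. Below is a minimal, illustrative NumPy version, assuming powers of the symmetrically normalized adjacency as the multi-hop operator family (one common choice; the paper studies general operator families) and a two-layer node-wise MLP; all names and shapes here are my own illustration, not the paper's code.

```python
import numpy as np

def gamlp_features(adj, x, num_hops=2):
    """Augment node features with multi-hop operators: [X, AX, A^2 X, ...]."""
    deg = adj.sum(axis=1)
    # Symmetrically normalized adjacency; zero-degree nodes get zero rows.
    d_inv_sqrt = np.divide(1.0, np.sqrt(deg), out=np.zeros_like(deg), where=deg > 0)
    a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    feats, h = [x], x
    for _ in range(num_hops):
        h = a_norm @ h                    # one more hop of fixed propagation
        feats.append(h)
    return np.concatenate(feats, axis=1)  # (n_nodes, (num_hops + 1) * d)

def node_wise_mlp(feats, w1, w2):
    """Apply the same two-layer ReLU MLP to every node independently."""
    return np.maximum(feats @ w1, 0.0) @ w2

# Toy usage: a 4-node path graph with 2-dim features.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = rng.normal(size=(4, 2))
feats = gamlp_features(adj, x, num_hops=2)                # (4, 6)
w1, w2 = rng.normal(size=(6, 8)), rng.normal(size=(8, 3))
out = node_wise_mlp(feats, w1, w2)                        # (4, 3) node outputs
```

The key contrast with a GNN: the graph is only touched in the fixed precomputation step, so the learnable part never revisits the graph. This is the source of both the efficiency advantages and the expressiveness gap (e.g., the inability to count attributed walks) that the paper analyzes.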
Related papers
- KAGNNs: Kolmogorov-Arnold Networks meet Graph Learning [27.638009679134523]
Graph Neural Networks (GNNs) have become the de facto tool for learning node and graph representations.
In this work, we compare the performance of Kolmogorov-Arnold Networks (KANs) against that of MLPs in graph learning tasks.
arXiv Detail & Related papers (2024-06-26T14:21:21Z)
- Uplifting the Expressive Power of Graph Neural Networks through Graph Partitioning [3.236774847052122]
We study the expressive power of graph neural networks through the lens of graph partitioning.
We introduce a novel GNN architecture, namely Graph Partitioning Neural Networks (GPNNs).
arXiv Detail & Related papers (2023-12-14T06:08:35Z)
- VQGraph: Rethinking Graph Representation Space for Bridging GNNs and MLPs [97.63412451659826]
VQGraph learns a structure-aware tokenizer on graph data that can encode each node's local substructure as a discrete code.
VQGraph achieves new state-of-the-art performance on GNN-to-MLP distillation in both transductive and inductive settings.
arXiv Detail & Related papers (2023-08-04T02:58:08Z)
- Graph Neural Networks are Inherently Good Generalizers: Insights by Bridging GNNs and MLPs [71.93227401463199]
This paper traces the major source of GNNs' performance gain to their intrinsic generalization capability by introducing an intermediate model class dubbed P(ropagational)MLP.
We observe that PMLPs consistently perform on par with (or even exceed) their GNN counterparts, while being much more efficient in training.
arXiv Detail & Related papers (2022-12-18T08:17:32Z)
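A minimal sketch of the PMLP idea summarized in the entry above, under my reading of it: the same weights act as a plain MLP during training and are wrapped with feature propagation at inference. The choice of propagation operator and depth below is an illustrative assumption, not the paper's exact architecture.

```python
import numpy as np

def mlp_forward(x, w1, w2):
    """Training-time forward pass: a plain node-wise MLP; the graph is unused."""
    return np.maximum(x @ w1, 0.0) @ w2

def pmlp_forward(a_norm, x, w1, w2):
    """Inference-time forward pass: identical weights, but with message
    passing (multiplication by a normalized adjacency) before each layer."""
    h = np.maximum((a_norm @ x) @ w1, 0.0)
    return (a_norm @ h) @ w2
```

Training only ever calls mlp_forward, which is why PMLPs are cheap to train; evaluation calls pmlp_forward on the same w1 and w2.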
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- Graph Neural Networks with Parallel Neighborhood Aggregations for Graph Classification [14.112444998191698]
We focus on graph classification using a graph neural network (GNN) model that precomputes the node features using a bank of neighborhood aggregation graph operators arranged in parallel.
These GNN models have a natural advantage of reduced training and inference time due to the precomputations.
We demonstrate via numerical experiments that the developed model achieves state-of-the-art performance on many diverse real-world datasets.
arXiv Detail & Related papers (2021-11-22T19:19:40Z)
- Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model capable of handling any graph without being restricted by its underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of techniques for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
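For intuition, here is a toy instance of the distance-encoding idea from the entry above: augment each node's features with its hop distance to a target node set, computed by multi-source BFS. Shortest-path distance is only one of the distance measures the paper considers, and the function name and interface here are my own.

```python
from collections import deque
import numpy as np

def distance_encoding(adj_list, targets):
    """Hop distance from every node to the target set, via multi-source BFS."""
    dist = np.full(len(adj_list), np.inf)
    queue = deque(targets)
    for t in targets:
        dist[t] = 0.0
    while queue:
        u = queue.popleft()
        for v in adj_list[u]:
            if dist[v] == np.inf:
                dist[v] = dist[u] + 1.0
                queue.append(v)
    return dist  # concatenate onto node features before running the GNN

# Toy usage: path graph 0-1-2-3, target set {0}.
print(distance_encoding([[1], [0, 2], [1, 3], [2]], {0}))  # [0. 1. 2. 3.]
```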
This list is automatically generated from the titles and abstracts of the papers on this site.