Breaking the Limits of Message Passing Graph Neural Networks
- URL: http://arxiv.org/abs/2106.04319v1
- Date: Tue, 8 Jun 2021 13:26:56 GMT
- Title: Breaking the Limits of Message Passing Graph Neural Networks
- Authors: Muhammet Balcilar, Pierre Héroux, Benoit Gaüzère, Pascal Vasseur, Sébastien Adam, Paul Honeine
- Abstract summary: Message Passing (Graph) Neural Networks (MPNNs) have a linear complexity with respect to the number of nodes when applied to sparse graphs.
In this paper, we show that if the graph convolution supports are designed in the spectral domain by a non-linear custom function of eigenvalues, the MPNN is theoretically more powerful than the 1-WL test.
- Score: 6.175401630947573
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Since Message Passing (Graph) Neural Networks (MPNNs) have a linear
complexity with respect to the number of nodes when applied to sparse graphs,
they have been widely implemented and still attract considerable interest, even
though their theoretical expressive power is limited to the first-order
Weisfeiler-Lehman test (1-WL). In this paper, we show that if the graph
convolution supports are designed in the spectral domain by a non-linear custom
function of eigenvalues and masked with an arbitrarily large receptive field,
the MPNN is theoretically more powerful than the 1-WL test and experimentally
as powerful as existing 3-WL models, while remaining spatially localized.
Moreover, by designing custom filter functions, outputs can have various
frequency components that allow the convolution process to learn different
relationships between a given input graph signal and its associated properties.
So far, the best 3-WL-equivalent graph neural networks have a computational
complexity in $\mathcal{O}(n^3)$ with memory usage in $\mathcal{O}(n^2)$, rely
on non-local update mechanisms, and do not provide spectral richness in their
output profiles. The proposed method overcomes all of these problems
and reaches state-of-the-art results in many downstream tasks.
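For intuition, the snippet below is a minimal NumPy sketch of this idea: a custom (possibly non-linear) function `phi` of the Laplacian eigenvalues defines a convolution support, and a k-hop mask keeps the support spatially localized. The specific filter function and masking scheme are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def spectral_support(adj, phi, k_hops=3):
    """Build one convolution support C = U diag(phi(lambda)) U^T from the
    normalized graph Laplacian, then mask it to a k-hop receptive field so
    the resulting operator stays spatially localized (illustrative sketch)."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros(n)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    lap = np.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    lam, U = np.linalg.eigh(lap)                  # eigen-decomposition
    support = U @ np.diag(phi(lam)) @ U.T         # spectral-designed support
    mask = np.linalg.matrix_power(adj + np.eye(n), k_hops) > 0  # k-hop mask
    return support * mask

# Toy usage: a band-pass-like filter on a 4-cycle (hypothetical choice of phi).
adj = np.array([[0., 1., 0., 1.],
                [1., 0., 1., 0.],
                [0., 1., 0., 1.],
                [1., 0., 1., 0.]])
C = spectral_support(adj, phi=lambda lam: np.exp(-(lam - 1.0) ** 2))
x = np.random.randn(4, 3)   # node feature matrix
h = C @ x                   # one propagation step with the designed support
```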
Related papers
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Non-convolutional Graph Neural Networks [46.79328529882998]
We design a simple graph learning module entirely free of convolution operators, coined random walk with unifying memory (RUM) neural network.
RUM achieves competitive performance, but is also robust, memory-efficient, scalable, and faster than the simplest convolutional GNNs.
arXiv Detail & Related papers (2024-07-31T21:29:26Z)
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs).
S$^2$GNNs combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
- Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
arXiv Detail & Related papers (2023-03-02T07:36:23Z)
- Higher-order Sparse Convolutions in Graph Neural Networks [17.647346486710514]
We introduce a new higher-order sparse convolution based on the Sobolev norm of graph signals.
S-SobGNN shows competitive performance in all applications as compared to several state-of-the-art methods.
arXiv Detail & Related papers (2023-02-21T08:08:18Z)
- Exponentially Improving the Complexity of Simulating the Weisfeiler-Lehman Test with Graph Neural Networks [22.36443060418036]
We show that the expressive power of Graph Neural Networks (GNNs) in distinguishing non-isomorphic graphs is exactly the same as that of the Weisfeiler-Lehman graph test.
We present an improved simulation of the WL test on GNNs with exponentially lower complexity.
arXiv Detail & Related papers (2022-11-06T22:38:49Z)
- From Local to Global: Spectral-Inspired Graph Neural Networks [28.858773653743075]
Graph Neural Networks (GNNs) are powerful deep learning methods for Non-Euclidean data.
MPNNs are message-passing algorithms that aggregate and combine signals in a local graph neighborhood.
MPNNs can suffer from issues like over-smoothing or over-squashing.
arXiv Detail & Related papers (2022-09-24T17:19:00Z)
- How Powerful are Spectral Graph Neural Networks [9.594432031144715]
Spectral Graph Neural Networks are a class of Graph Neural Networks based on graph signal filters.
We first prove that even spectral GNNs without nonlinearity can produce arbitrary graph signals; a minimal sketch of such a linear spectral filter is given after this list.
We also establish a connection between the expressive power of spectral GNNs and Graph Isomorphism (GI) testing.
arXiv Detail & Related papers (2022-05-23T10:22:12Z)
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structural features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
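As a companion to the "How Powerful are Spectral Graph Neural Networks" entry above, here is a minimal sketch of a linear spectral GNN layer: a polynomial of the normalized Laplacian applied to node features, with no nonlinearity. The polynomial order and coefficients are illustrative assumptions, not taken from that paper.

```python
import numpy as np

def poly_spectral_layer(adj, x, coeffs):
    """Linear spectral filter y = sum_k c_k L^k x: a polynomial of the
    normalized Laplacian with coefficients `coeffs` and no nonlinearity."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros(n)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    lap = np.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    y, lap_k = np.zeros_like(x), np.eye(n)   # accumulator and L^0
    for c in coeffs:
        y = y + c * (lap_k @ x)              # add the c_k * L^k x term
        lap_k = lap_k @ lap                  # next Laplacian power
    return y

# Example: a third-order filter on random features (hypothetical coefficients).
adj = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
y = poly_spectral_layer(adj, np.random.randn(3, 2), coeffs=[1.0, -0.5, 0.25])
```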