Specformer: Spectral Graph Neural Networks Meet Transformers
- URL: http://arxiv.org/abs/2303.01028v1
- Date: Thu, 2 Mar 2023 07:36:23 GMT
- Title: Specformer: Spectral Graph Neural Networks Meet Transformers
- Authors: Deyu Bo and Chuan Shi and Lele Wang and Renjie Liao
- Abstract summary: Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
- Score: 51.644312964537356
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spectral graph neural networks (GNNs) learn graph representations via
spectral-domain graph convolutions. However, most existing spectral graph
filters are scalar-to-scalar functions, i.e., mapping a single eigenvalue to a
single filtered value, thus ignoring the global pattern of the spectrum.
Furthermore, these filters are often constructed based on some fixed-order
polynomials, which have limited expressiveness and flexibility. To tackle these
issues, we introduce Specformer, which effectively encodes the set of all
eigenvalues and performs self-attention in the spectral domain, leading to a
learnable set-to-set spectral filter. We also design a decoder with learnable
bases to enable non-local graph convolution. Importantly, Specformer is
equivariant to permutation. By stacking multiple Specformer layers, one can
build a powerful spectral GNN. On synthetic datasets, we show that our
Specformer can better recover ground-truth spectral filters than other spectral
GNNs. Extensive experiments on both node-level and graph-level tasks across
real-world graph datasets show that our Specformer outperforms state-of-the-art
GNNs and learns meaningful spectrum patterns. Code and data are available at
https://github.com/bdy9527/Specformer.
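In code terms, the pipeline described above amounts to: eigendecompose the graph Laplacian, embed each eigenvalue, let the eigenvalues attend to one another, and decode a new spectrum that defines the learned filter. A minimal PyTorch sketch of this idea (module names, dimensions, and the plain linear eigenvalue encoding are illustrative assumptions; the paper's actual model uses a richer eigenvalue encoding and a decoder producing multiple learnable bases):
```python
import torch
import torch.nn as nn

class SpecformerLayerSketch(nn.Module):
    """Set-to-set spectral filter: self-attention over the Laplacian spectrum."""

    def __init__(self, d_model=32, n_heads=4):
        super().__init__()
        self.eig_encoder = nn.Linear(1, d_model)   # embed each scalar eigenvalue
        self.attn = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.eig_decoder = nn.Linear(d_model, 1)   # decode one filtered eigenvalue each

    def forward(self, x, L):
        # x: (n, f) node features; L: (n, n) symmetric normalized graph Laplacian
        lam, U = torch.linalg.eigh(L)                         # full spectrum of L
        z = self.eig_encoder(lam.unsqueeze(-1)).unsqueeze(0)  # (1, n, d_model)
        z = self.attn(z)                 # every eigenvalue attends to the whole spectrum
        new_lam = self.eig_decoder(z).reshape(-1)             # set-to-set filtering
        # Reassembling with U yields a (generally non-local) graph convolution
        return U @ torch.diag(new_lam) @ U.T @ x
```
Since self-attention treats its input as a set and the eigenbasis U is applied on both sides, relabeling the nodes permutes the output accordingly, which is the permutation equivariance the abstract refers to.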
Related papers
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, this work is the first to employ SSMs in the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Advancing Graph Convolutional Networks via General Spectral Wavelets [41.41593198072709]
We present a novel wavelet-based graph convolution network, namely WaveGC, which integrates multi-resolution spectral bases and a matrix-valued filter kernel.
Theoretically, we establish that WaveGC can effectively capture and decouple short-range and long-range information, providing superior filtering flexibility.
arXiv Detail & Related papers (2024-05-22T16:32:27Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- How Powerful are Spectral Graph Neural Networks [9.594432031144715]
Spectral graph neural networks are a class of GNNs built on graph signal filters.
We first prove that even spectral GNNs without nonlinearity can produce arbitrary graph signals.
We also establish a connection between the expressive power of spectral GNNs and Graph Isomorphism (GI) testing.
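For background, the "graph signal filter" shared by this line of work is x' = U g(Λ) U^T x: move a signal into the Laplacian eigenbasis, rescale each frequency by a scalar-to-scalar g(λ), and move back. A tiny self-contained NumPy illustration (the path graph and the exponential low-pass g are chosen purely for demonstration):
```python
import numpy as np

# Toy 4-node path graph: build the symmetric normalized Laplacian
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
L = np.eye(4) - A / np.sqrt(np.outer(d, d))   # L = I - D^{-1/2} A D^{-1/2}

lam, U = np.linalg.eigh(L)                    # eigenvalues lie in [0, 2]
g = np.exp(-2.0 * lam)                        # a scalar-to-scalar low-pass g(lambda)

x = np.random.randn(4)                        # a graph signal
x_filtered = U @ (g * (U.T @ x))              # x' = U g(Lambda) U^T x
print(x_filtered)
```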
arXiv Detail & Related papers (2022-05-23T10:22:12Z)
- Improving Spectral Graph Convolution for Learning Graph-level Representation [27.76697047602983]
We observe that topological distance characterizes the basic relations between nodes, yet for learning representations of entire graphs, removing it, together with the limitation of fixed graph filters, yields a new architecture that significantly boosts performance.
The architecture also serves as an interpretation that quantitatively measures the effect of the spectrum on input signals, in contrast to the well-known spectral/low-pass filters.
arXiv Detail & Related papers (2021-12-14T04:50:46Z)
- PointSpectrum: Equivariance Meets Laplacian Filtering for Graph Representation Learning [3.7875603451557063]
Graph Representation Learning (GRL) has become essential for modern graph data mining and learning tasks.
While Graph Neural Networks (GNNs) have been used in state-of-the-art GRL architectures, they have been shown to suffer from over-smoothing.
We propose PointSpectrum, a spectral method that incorporates a set-equivariant network to account for a graph's structure.
arXiv Detail & Related papers (2021-09-06T10:59:11Z)
- BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein Approximation [47.88982193039535]
We propose BernNet, a novel graph neural network with theoretical support that provides a simple but effective scheme for designing and learning arbitrary graph spectral filters.
In particular, for any filter over the normalized Laplacian spectrum of a graph, our BernNet estimates it by an order-K Bernstein approximation and designs its spectral property by setting the coefficients of the Bernstein basis.
Our experiments demonstrate that BernNet can learn arbitrary spectral filters, including complicated band-rejection and comb filters, and it achieves superior performance in real-world graph modeling tasks.
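Concretely, BernNet's order-K filter has the form h(λ) = Σ_{k=0}^{K} θ_k C(K, k) (λ/2)^k (1 − λ/2)^{K−k}, with λ ∈ [0, 2] for the normalized Laplacian and learnable coefficients θ_k. A quick Python sketch of evaluating such a filter (the θ values below are invented to suggest a band-rejection shape, not learned):
```python
import numpy as np
from math import comb

def bernstein_filter(lam, theta):
    """Evaluate h(lam) = sum_k theta_k * C(K,k) * (lam/2)^k * (1 - lam/2)^(K-k)."""
    K = len(theta) - 1
    t = lam / 2.0                                  # map [0, 2] onto [0, 1]
    return sum(theta[k] * comb(K, k) * t**k * (1 - t)**(K - k)
               for k in range(K + 1))

lam = np.linspace(0.0, 2.0, 5)                     # normalized-Laplacian eigenvalues
theta = np.array([1.0, 0.6, 0.1, 0.6, 1.0])        # illustrative band-rejection shape
print(bernstein_filter(lam, theta))
```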
arXiv Detail & Related papers (2021-06-21T11:26:06Z)
- Message Passing in Graph Convolution Networks via Adaptive Filter Banks [81.12823274576274]
We present a novel graph convolution operator, termed BankGCN.
It decomposes multi-channel signals on graphs into subspaces and handles the information in each subspace with a filter adapted to it.
It achieves excellent performance in graph classification on a collection of benchmark graph datasets.
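A plausible sketch of the filter-bank idea, under assumptions of our own (the subspace count, linear projections, and per-subspace polynomial filters are illustrative, not BankGCN's exact formulation): project the input channels into several subspaces and filter each subspace with its own adapted polynomial in the Laplacian.
```python
import torch
import torch.nn as nn

class FilterBankConvSketch(nn.Module):
    """Illustrative filter-bank convolution: one adapted filter per subspace."""

    def __init__(self, in_dim, n_subspaces=4, sub_dim=8, order=3):
        super().__init__()
        self.projections = nn.ModuleList(
            nn.Linear(in_dim, sub_dim, bias=False) for _ in range(n_subspaces))
        # One set of polynomial coefficients per subspace (the "adapted filter")
        self.coeffs = nn.Parameter(torch.randn(n_subspaces, order + 1) * 0.1)

    def forward(self, x, L):
        # x: (n, in_dim) node features; L: (n, n) graph Laplacian
        outs = []
        for s, proj in enumerate(self.projections):
            z = proj(x)                              # project channels into subspace s
            h, Lz = torch.zeros_like(z), z
            for k in range(self.coeffs.shape[1]):
                h = h + self.coeffs[s, k] * Lz       # h += c_k * L^k z
                Lz = L @ Lz
            outs.append(h)
        return torch.cat(outs, dim=-1)               # concatenate filtered subspaces
```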
arXiv Detail & Related papers (2021-06-18T04:23:34Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop AdaGNN, a graph neural network framework with a smooth, adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.