Improving Spectral Graph Convolution for Learning Graph-level
Representation
- URL: http://arxiv.org/abs/2112.07160v1
- Date: Tue, 14 Dec 2021 04:50:46 GMT
- Title: Improving Spectral Graph Convolution for Learning Graph-level
Representation
- Authors: Mingqi Yang, Rui Li, Yanming Shen, Heng Qi, Baocai Yin
- Abstract summary: For learning node representations, topological distance seems necessary since it characterizes the basic relations between nodes.
However, for learning representations of entire graphs, this principle is not needed; removing it, together with the limitation of polynomial filters, yields a new architecture that significantly boosts performance on learning graph representations.
The work also offers a spatial understanding that quantitatively measures the effects of the spectrum on input signals, complementing the well-known spectral understanding of filters as high/low-pass.
- Score: 27.76697047602983
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: From the original theoretically well-defined spectral graph convolution to
the subsequent spatially based message-passing models, spatial locality (in
vertex domain) acts as a fundamental principle of most graph neural networks
(GNNs). In the spectral graph convolution, the filter is approximated by
polynomials, where a $k$-order polynomial covers $k$-hop neighbors. In the
message-passing, various definitions of neighbors used in aggregations are
actually an extensive exploration of the spatial locality information. For
learning node representations, the topological distance seems necessary since
it characterizes the basic relations between nodes. However, for learning
representations of entire graphs, does this principle still need to hold? In this
work, we show that it does not; in fact, it hinders most existing
GNNs from efficiently encoding graph structures. By removing it, as well as the
limitation of polynomial filters, the resulting new architecture significantly
boosts performance on learning graph representations. We also study the effects
of graph spectrum on signals and interpret various existing improvements as
different spectrum smoothing techniques. This serves as a spatial understanding
that quantitatively measures the effect of the spectrum on input signals, in
contrast to the well-known spectral understanding of filters as high/low-pass.
More importantly, it sheds light on developing powerful graph
representation models.
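The k-hop locality of polynomial spectral filters described in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: the toy path graph, the coefficients, and the helper names (`normalized_laplacian`, `poly_filter`) are all assumptions chosen for clarity.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    D_inv_sqrt = np.diag(d_inv_sqrt)
    return np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt

def poly_filter(L, x, coeffs):
    """Apply g(L) x = sum_k theta_k L^k x; the k-order term reaches k-hop neighbors."""
    out = np.zeros_like(x)
    Lk_x = x.copy()            # L^0 x
    for theta in coeffs:
        out += theta * Lk_x
        Lk_x = L @ Lk_x        # next power: L^{k+1} x
    return out

# Toy path graph 0-1-2-3: with a 2-order filter (coeffs for orders 0..2),
# node 0's signal reaches nodes up to 2 hops away (1 and 2) but not node 3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = normalized_laplacian(A)
x = np.array([[1.0], [0.0], [0.0], [0.0]])   # spike at node 0
y = poly_filter(L, x, coeffs=[0.5, 0.3, 0.2])
```

Here `y[3]` stays exactly zero: this is the spatial-locality principle the paper argues is unnecessary (and even harmful) for graph-level representations.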
Related papers
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z) - Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
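The spectral-domain filtering that Specformer builds on can be sketched as below. This is a generic illustration of filtering via the eigendecomposition, g(L)x = U diag(g(λ)) Uᵀx; the low-pass choice g(λ) = exp(-λ) and the toy graph are assumptions for the sketch, not Specformer's learned self-attention filter.

```python
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)     # path graph 0-1-2
d = A.sum(axis=1)
L = np.diag(d) - A                         # combinatorial Laplacian
lam, U = np.linalg.eigh(L)                 # eigenvalues (frequencies) and eigenvectors

g = np.exp(-lam)                           # illustrative low-pass response g(lambda)
x = np.array([1.0, 0.0, 0.0])              # spike at node 0
y = U @ (g * (U.T @ x))                    # filter in the spectral domain

# g(0) = 1 preserves the constant (DC) component, so the total mass of the
# signal is unchanged while the spike is smoothed onto neighboring nodes.
```

Spectral methods like Specformer replace the fixed response `g` here with a learned function of the whole eigenvalue set.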
arXiv Detail & Related papers (2023-03-02T07:36:23Z) - How Powerful are Spectral Graph Neural Networks [9.594432031144715]
Spectral Graph Neural Network is a kind of Graph Neural Network based on graph signal filters.
We first prove that even spectral GNNs without nonlinearity can produce arbitrary graph signals.
We also establish a connection between the expressive power of spectral GNNs and Graph Isomorphism (GI) testing.
arXiv Detail & Related papers (2022-05-23T10:22:12Z) - Pointspectrum: Equivariance Meets Laplacian Filtering for Graph
Representation Learning [3.7875603451557063]
Graph Representation Learning (GRL) has become essential for modern graph data mining and learning tasks.
While Graph Neural Networks (GNNs) have been used in state-of-the-art GRL architectures, they have been shown to suffer from over-smoothing.
We propose PointSpectrum, a spectral method that incorporates a set equivariant network to account for a graph's structure.
arXiv Detail & Related papers (2021-09-06T10:59:11Z) - Spectral-Spatial Global Graph Reasoning for Hyperspectral Image
Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of convolutional models by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z) - Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z) - Graph Networks with Spectral Message Passing [1.0742675209112622]
We introduce the Spectral Graph Network, which applies message passing to both the spatial and spectral domains.
Our results show that the Spectral GN promotes efficient training, reaching high performance with fewer training iterations despite having more parameters.
arXiv Detail & Related papers (2020-12-31T21:33:17Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.