A Survey on Spectral Graph Neural Networks
- URL: http://arxiv.org/abs/2302.05631v1
- Date: Sat, 11 Feb 2023 09:16:46 GMT
- Title: A Survey on Spectral Graph Neural Networks
- Authors: Deyu Bo, Xiao Wang, Yang Liu, Yuan Fang, Yawen Li, Chuan Shi
- Abstract summary: We summarize the recent development of spectral GNNs, including model, theory, and application.
We first discuss the connection between spatial GNNs and spectral GNNs, showing that spectral GNNs can capture global information and offer better expressiveness and interpretability.
In addition, we review major theoretical results and applications of spectral GNNs, followed by a quantitative experiment to benchmark some popular spectral GNNs.
- Score: 42.469584005389414
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) have attracted considerable attention from the
research community. GNNs are commonly divided into spatial and spectral
methods. Although spectral GNNs play an important role in both graph signal
processing and graph representation learning, existing studies are biased
toward spatial approaches, and there is still no comprehensive review of
spectral GNNs. In this paper, we summarize
the recent development of spectral GNNs, including model, theory, and
application. Specifically, we first discuss the connection between spatial GNNs
and spectral GNNs, which shows that spectral GNNs can capture global
information and have better expressiveness and interpretability. Next, we
categorize existing spectral GNNs according to the spectrum information they
use, i.e., eigenvalues or eigenvectors. In addition, we review major theoretical
results and applications of spectral GNNs, followed by a quantitative
experiment to benchmark some popular spectral GNNs. Finally, we conclude the
paper with some future directions.
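As a minimal illustration of the eigenvalue-based filtering that the survey categorizes, the sketch below applies a hypothetical fixed low-pass filter to node features on a toy graph. The graph, the filter g, and the weight matrix W are assumptions for exposition only, not taken from the paper.

```python
import numpy as np

# Toy undirected graph (an illustrative assumption): adjacency A, node features X.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))  # 4 nodes, 3 input features

# Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt

# Eigendecomposition L = U diag(lam) U^T; eigenvalues lam lie in [0, 2].
lam, U = np.linalg.eigh(L)

# A spectral GNN layer filters the signal in the eigenvalue (frequency) domain:
#   Z = U g(diag(lam)) U^T X W
# g is a hypothetical fixed low-pass filter here; learnable spectral GNNs
# instead parameterize g (e.g., as a polynomial of the eigenvalues).
g = 1.0 - lam / 2.0
W = rng.standard_normal((3, 2))  # hypothetical weight matrix
Z = U @ np.diag(g) @ U.T @ X @ W
print(Z.shape)  # (4, 2) filtered node representations
```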
Related papers
- Benchmarking Spectral Graph Neural Networks: A Comprehensive Study on Effectiveness and Efficiency [20.518170371888075]
We extensively benchmark spectral GNNs with a focus on the frequency perspective.
We implement these spectral models under a unified framework with dedicated graph computations and efficient training schemes.
Our implementation enables application on larger graphs with comparable performance and less overhead.
arXiv Detail & Related papers (2024-06-14T02:56:57Z) - A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with the logarithm of the graph size and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - Rethinking Spectral Graph Neural Networks with Spatially Adaptive Filtering [31.595664867365322]
Spectral Graph Neural Networks (GNNs) are well-founded in the spectral domain, but their practical reliance on approximation implies a profound linkage to the spatial domain.
We establish a theoretical connection between spectral filtering and spatial aggregation, revealing that spectral filtering implicitly transforms the original graph into an adapted new graph (a toy sketch of this spectral-to-spatial correspondence follows the related-papers list below).
We propose a novel Spatially Adaptive Filtering (SAF) framework, which leverages the adapted graph produced by spectral filtering for auxiliary non-local aggregation.
arXiv Detail & Related papers (2024-01-17T09:12:31Z) - Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that categorizes existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z) - Bridging the Gap between Spatial and Spectral Domains: A Unified
Framework for Graph Neural Networks [61.17075071853949]
Graph neural networks (GNNs) are designed to deal with graph-structured data that classical deep learning does not easily manage.
The purpose of this study is to establish a unified framework that integrates GNNs based on spectral graph theory and approximation theory.
arXiv Detail & Related papers (2021-07-21T17:34:33Z) - Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs' ability to preserve graph structures.
arXiv Detail & Related papers (2020-06-08T02:47:38Z) - Bridging the Gap between Spatial and Spectral Domains: A Survey on Graph
Neural Networks [52.76042362922247]
Graph neural networks (GNNs) are designed to handle non-Euclidean graph-structured data.
Existing GNNs are presented using a variety of techniques, making direct comparison and cross-referencing difficult.
We organize existing GNNs into spatial and spectral domains and expose the connections within each domain.
arXiv Detail & Related papers (2020-02-27T01:15:10Z)
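To make the spectral-spatial correspondence referenced in the Spatially Adaptive Filtering entry above concrete, here is a hedged toy sketch (not the SAF implementation): a polynomial filter evaluated on the Laplacian eigenvalues coincides with a polynomial of the Laplacian in the spatial domain, i.e., an adapted propagation matrix that links nodes beyond one hop. The path graph and the coefficients theta are illustrative assumptions.

```python
import numpy as np

# A 4-node path graph 0-1-2-3 (illustrative assumption).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt   # symmetric normalized Laplacian

theta = [0.5, -0.3, 0.1]                      # hypothetical filter coefficients
lam, U = np.linalg.eigh(L)

# Spectral view: apply g(lam) = sum_k theta_k * lam^k to the eigenvalues.
g_lam = sum(t * lam**k for k, t in enumerate(theta))
G_spectral = U @ np.diag(g_lam) @ U.T

# Spatial view: the same filter as a polynomial of L, i.e., an adapted
# propagation matrix over the original graph.
G_spatial = sum(t * np.linalg.matrix_power(L, k) for k, t in enumerate(theta))

assert np.allclose(G_spectral, G_spatial)
# Nodes 0 and 2 are not adjacent in A, yet the adapted matrix links them,
# which is the non-local aggregation the entry above refers to.
print(A[0, 2], G_spatial[0, 2])
```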