Graph Neural Networks Are More Than Filters: Revisiting and Benchmarking from A Spectral Perspective
- URL: http://arxiv.org/abs/2412.07188v1
- Date: Tue, 10 Dec 2024 04:53:53 GMT
- Title: Graph Neural Networks Are More Than Filters: Revisiting and Benchmarking from A Spectral Perspective
- Authors: Yushun Dong, Patrick Soga, Yinhan He, Song Wang, Jundong Li
- Abstract summary: Graph Neural Networks (GNNs) have achieved remarkable success in various graph-based learning tasks.
Recent studies suggest that other components such as non-linear layers may also significantly affect how GNNs process the input graph data in the spectral domain.
This paper introduces a comprehensive benchmark to measure and evaluate GNNs' capability in capturing and leveraging the information encoded in different frequency components of the input graph data.
- Score: 49.613774305350084
- Abstract: Graph Neural Networks (GNNs) have achieved remarkable success in various graph-based learning tasks. While their performance is often attributed to the powerful neighborhood aggregation mechanism, recent studies suggest that other components such as non-linear layers may also significantly affect how GNNs process the input graph data in the spectral domain. Such evidence challenges the prevalent opinion that neighborhood aggregation mechanisms dominate the behavioral characteristics of GNNs in the spectral domain. To demystify this conflict, this paper introduces a comprehensive benchmark to measure and evaluate GNNs' capability in capturing and leveraging the information encoded in different frequency components of the input graph data. Specifically, we first conduct an exploratory study demonstrating that GNNs can flexibly yield outputs with diverse frequency components even when certain frequencies are absent or filtered out from the input graph data. We then formulate a novel research problem of measuring and benchmarking the performance of GNNs from a spectral perspective. To take an initial step towards a comprehensive benchmark, we design an evaluation protocol supported by comprehensive theoretical analysis. Finally, we introduce a comprehensive benchmark on real-world datasets, revealing insights that challenge prevalent opinions from a spectral perspective. We believe that our findings will open new avenues for future advancements in this area. Our implementations can be found at: https://github.com/yushundong/Spectral-benchmark.
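The abstract's central observation, that a GNN can produce output frequencies absent from its input, can be illustrated with a small sketch. This is a toy demonstration, not the paper's evaluation protocol; the graph, signal, and frequency cutoff below are arbitrary assumptions:

```python
import numpy as np

# Toy illustration: even if a graph signal contains only low-frequency
# components, a pointwise ReLU (as used between GNN layers) can produce
# an output with energy in the frequencies that were filtered out.

n = 8
# Path graph on 8 nodes.
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

# Symmetric normalized Laplacian; its eigenvectors form the graph Fourier basis.
d = A.sum(axis=1)
L = np.eye(n) - A / np.sqrt(np.outer(d, d))
eigvals, U = np.linalg.eigh(L)  # eigenvalues ascending: low to high frequency

# Build a strictly low-pass signal from the two lowest non-trivial modes.
x_low = U[:, 1] + U[:, 2]
assert np.allclose(U[:, 4:].T @ x_low, 0.0)  # no high-frequency content

# Apply the pointwise non-linearity.
y = np.maximum(x_low, 0.0)

# The output now carries non-zero energy in the high-frequency modes.
high_energy = np.linalg.norm(U[:, 4:].T @ y)
print(f"high-frequency energy after ReLU: {high_energy:.4f}")
```

Because `x_low` is orthogonal to the all-positive lowest eigenvector, it necessarily has negative entries, so the ReLU genuinely clips it and redistributes energy across the spectrum.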
Related papers
- Benchmarking Spectral Graph Neural Networks: A Comprehensive Study on Effectiveness and Efficiency [20.518170371888075]
We extensively benchmark spectral GNNs with a focus on the frequency perspective.
We implement these spectral models under a unified framework with dedicated graph computations and efficient training schemes.
Our implementation enables application on larger graphs with comparable performance and less overhead.
arXiv Detail & Related papers (2024-06-14T02:56:57Z) - A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with graph size on a logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - NoisyGL: A Comprehensive Benchmark for Graph Neural Networks under Label Noise [21.65452861777135]
Graph Neural Networks (GNNs) exhibit strong potential in node classification tasks through a message-passing mechanism.
Label noise is common in real-world graph data, negatively impacting GNNs by propagating incorrect information during training.
We introduce NoisyGL, the first comprehensive benchmark for graph neural networks under label noise.
arXiv Detail & Related papers (2024-06-06T17:45:00Z) - DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z) - A Survey on Spectral Graph Neural Networks [42.469584005389414]
We summarize the recent development of spectral GNNs, including model, theory, and application.
We first discuss the connection between spatial GNNs and spectral GNNs, which shows that spectral GNNs can capture global information and have better interpretability.
In addition, we review major theoretical results and applications of spectral GNNs, followed by a quantitative experiment to benchmark some popular spectral GNNs.
arXiv Detail & Related papers (2023-02-11T09:16:46Z) - MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z) - EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust to changes in homophily between training and test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
arXiv Detail & Related papers (2022-05-27T10:48:14Z) - When Does A Spectral Graph Neural Network Fail in Node Classification? [10.177243505636692]
We conduct a theoretical analysis of spectral GNNs' performance by investigating their prediction error.
We show that graph filters with low response efficiency on the label difference are prone to failure.
arXiv Detail & Related papers (2022-02-16T07:18:17Z) - Graph Feature Gating Networks [31.20878472589719]
We propose a general graph feature gating network (GFGN) based on the graph signal denoising problem.
We also introduce three graph filters under GFGN to allow different levels of contributions from feature dimensions.
arXiv Detail & Related papers (2021-05-10T16:33:58Z) - Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework, AdaGNN, with a smooth, adaptive frequency-response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
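The adaptive frequency-response idea above can be sketched in a few lines. This is a simplified illustration in the spirit of AdaGNN, where each feature dimension gets its own coefficient controlling how much high-frequency (Laplacian) content is suppressed; the exact AdaGNN formulation may differ, and the graph, sizes, and coefficient values here are illustrative assumptions:

```python
import numpy as np

def adaptive_filter_layer(X, L, phi):
    """One adaptive frequency-response layer: X - L @ X @ diag(phi).

    A phi entry near 1 strongly suppresses high frequencies in that feature
    dimension; an entry near 0 leaves the dimension almost unfiltered.
    In a trained model, phi would be a learnable parameter.
    """
    return X - L @ X @ np.diag(phi)

# Toy graph: 4-node path.
n = 4
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
d = A.sum(axis=1)
L = np.eye(n) - A / np.sqrt(np.outer(d, d))  # symmetric normalized Laplacian

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 2))           # 2 feature dimensions
phi = np.array([0.9, 0.1])                # per-dimension response (assumed values)
out = adaptive_filter_layer(X, L, phi)
print(out.shape)  # (4, 2)
```

The per-dimension `phi` is what makes the frequency response "adaptive": a fixed low-pass GNN layer would apply the same amount of smoothing to every feature channel.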
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.