Benchmarking Spectral Graph Neural Networks: A Comprehensive Study on Effectiveness and Efficiency
- URL: http://arxiv.org/abs/2406.09675v1
- Date: Fri, 14 Jun 2024 02:56:57 GMT
- Title: Benchmarking Spectral Graph Neural Networks: A Comprehensive Study on Effectiveness and Efficiency
- Authors: Ningyi Liao, Haoyu Liu, Zulun Zhu, Siqiang Luo, Laks V. S. Lakshmanan
- Abstract summary: We extensively benchmark spectral GNNs with a focus on the frequency perspective.
We implement these spectral models under a unified framework with dedicated graph computations and efficient training schemes.
Our implementation enables application on larger graphs with comparable performance and less overhead.
- Score: 20.518170371888075
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: With the recent advancements in graph neural networks (GNNs), spectral GNNs have received increasing popularity by virtue of their specialty in capturing graph signals in the frequency domain, demonstrating promising capability in specific tasks. However, few systematic studies have been conducted on assessing their spectral characteristics. This emerging family of models also varies in terms of designs and settings, leading to difficulties in comparing their performance and deciding on the suitable model for specific scenarios, especially for large-scale tasks. In this work, we extensively benchmark spectral GNNs with a focus on the frequency perspective. We analyze and categorize over 30 GNNs with 27 corresponding filters. Then, we implement these spectral models under a unified framework with dedicated graph computations and efficient training schemes. Thorough experiments are conducted on the spectral models with inclusive metrics on effectiveness and efficiency, offering practical guidelines on evaluating and selecting spectral GNNs with desirable performance. Our implementation enables application on larger graphs with comparable performance and less overhead, which is available at: https://github.com/gdmnl/Spectral-GNN-Benchmark.
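As a rough, self-contained illustration of the "frequency perspective" (a sketch with hypothetical helper names and a toy graph, not code from the benchmark repository), most of the surveyed spectral GNNs can be viewed as applying a fixed or learnable polynomial of a normalized graph matrix to the node features, differing mainly in how the coefficients are parameterized:

```python
# Minimal sketch: y = sum_k theta[k] * A_hat^k @ X, where A_hat is the
# symmetrically normalized adjacency with self-loops. The coefficients theta
# determine which frequency components of the node features are kept.
import numpy as np

def normalized_adjacency(adj: np.ndarray) -> np.ndarray:
    """D^{-1/2} (A + I) D^{-1/2}, the GCN-style propagation matrix."""
    adj = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
    return adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def polynomial_filter(adj: np.ndarray, x: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """Apply y = sum_k theta[k] * A_hat^k x by repeated propagation."""
    a_hat = normalized_adjacency(adj)
    h, out = x.copy(), theta[0] * x
    for k in range(1, len(theta)):
        h = a_hat @ h          # one more hop of propagation
        out = out + theta[k] * h
    return out

# toy 4-node path graph with random 2-dim features and decaying coefficients
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
print(polynomial_filter(A, X, theta=0.5 ** np.arange(4)))
```

Fixed decaying coefficients correspond to PPR-style filters, while models such as GPR-GNN learn theta directly; the benchmark's unified framework then adds dedicated graph computations and efficient training schemes on top of such filters.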
Related papers
- Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report [21.288230563135055]
We propose a novel graph spectral sparsification method to approximate the propagation patterns of spectral Graph Neural Networks (GNNs).
Our method allows the application of linear layers on the input node features, enabling end-to-end training as well as the handling of raw features.
arXiv Detail & Related papers (2025-01-08T15:36:19Z) - Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs).
This framework provides a standardized setting to evaluate GNNs across diverse datasets.
We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z) - AutoSGNN: Automatic Propagation Mechanism Discovery for Spectral Graph Neural Networks [6.755403881158429]
We propose AutoSGNN, an automated framework for discovering propagation mechanisms in spectral GNNs.
We show that AutoSGNN outperforms state-of-the-art spectral GNNs and graph neural architecture search methods in both performance and efficiency.
arXiv Detail & Related papers (2024-12-17T02:37:48Z) - Graph Neural Networks Are More Than Filters: Revisiting and Benchmarking from A Spectral Perspective [49.613774305350084]
Graph Neural Networks (GNNs) have achieved remarkable success in various graph-based learning tasks.
Recent studies suggest that other components such as non-linear layers may also significantly affect how GNNs process the input graph data in the spectral domain.
This paper introduces a comprehensive benchmark to measure and evaluate GNNs' capability in capturing and leveraging the information encoded in different frequency components of the input graph data.
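As a hedged sketch of the underlying measurement idea (our own toy code, not the paper's benchmark), a graph signal can be decomposed on the Laplacian eigenbasis and its energy tallied per frequency band:

```python
# Graph Fourier view: eigenvectors of the Laplacian act as "frequencies";
# the share of a signal's energy per eigenvalue band shows which frequency
# components it carries (and which a GNN preserves or suppresses).
import numpy as np

def spectral_energy(adj: np.ndarray, signal: np.ndarray, n_bands: int = 4) -> np.ndarray:
    """Fraction of a 1-D node signal's energy in equal-width eigenvalue bands."""
    laplacian = np.diag(adj.sum(axis=1)) - adj
    eigvals, eigvecs = np.linalg.eigh(laplacian)   # frequencies and Fourier basis
    coeffs = eigvecs.T @ signal                    # graph Fourier transform
    energy = coeffs ** 2
    bands = np.linspace(eigvals.min(), eigvals.max() + 1e-9, n_bands + 1)
    idx = np.digitize(eigvals, bands) - 1
    return np.array([energy[idx == b].sum() for b in range(n_bands)]) / energy.sum()

# a constant signal on a path graph is purely low-frequency
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
print(spectral_energy(A, np.ones(4)))   # approximately [1, 0, 0, 0]
```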
arXiv Detail & Related papers (2024-12-10T04:53:53Z) - GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z) - GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, which aims to assess the performance of a specific GNN model trained on labeled and observed graphs when it is deployed on unseen graphs without labels.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z) - Characterizing the Efficiency of Graph Neural Network Frameworks with a Magnifying Glass [10.839902229218577]
Graph neural networks (GNNs) have received great attention due to their success in various graph-related learning tasks.
Recent GNNs have been developed with different graph sampling techniques for mini-batch training on large graphs.
It remains unclear how 'eco-friendly' these frameworks are from a green computing perspective.
arXiv Detail & Related papers (2022-11-06T04:22:19Z) - EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
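A minimal sketch of the even-polynomial idea follows (our own toy code, not the authors' implementation; it assumes a precomputed symmetrically normalized adjacency a_hat). Restricting the filter to even powers aggregates information only from even-hop neighbors, which the title relates to improved robustness under homophily changes.

```python
# Even-polynomial filter: y = sum_k theta[k] * a_hat^(2k) @ x.
# Squaring the propagation matrix makes each step cover two hops, so odd-hop
# neighbors never contribute directly.
import numpy as np

def even_polynomial_filter(a_hat: np.ndarray, x: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """Apply y = sum_k theta[k] * a_hat^(2k) x (even hops only)."""
    a_sq = a_hat @ a_hat        # one step now corresponds to two hops
    h, out = x.copy(), theta[0] * x
    for k in range(1, len(theta)):
        h = a_sq @ h
        out = out + theta[k] * h
    return out
```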
arXiv Detail & Related papers (2022-05-27T10:48:14Z) - Attention-Based Recommendation On Graphs [9.558392439655012]
Graph Neural Networks (GNNs) have shown remarkable performance in different tasks.
In this study, we propose GARec as a model-based recommender system.
The presented method outperforms existing model-based approaches, non-graph neural networks, and graph neural networks on different MovieLens datasets.
arXiv Detail & Related papers (2022-01-04T21:02:02Z) - Spectral Graph Attention Network with Fast Eigen-approximation [103.93113062682633]
Spectral Graph Attention Network (SpGAT) learns representations for different frequency components using weighted filters and graph wavelet bases.
A fast approximation variant, SpGAT-Cheby, is proposed to reduce the computational cost incurred by the eigen-decomposition.
We thoroughly evaluate the performance of SpGAT and SpGAT-Cheby in semi-supervised node classification tasks.
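For intuition on why a Chebyshev variant avoids an explicit eigen-decomposition, here is a generic sketch of Chebyshev filtering (not SpGAT-Cheby's actual code; l_tilde is assumed to be the Laplacian rescaled so its eigenvalues lie in [-1, 1]):

```python
# Chebyshev recurrence: T_0 x = x, T_1 x = L~ x, T_k x = 2 L~ (T_{k-1} x) - T_{k-2} x.
# The filter sum_k theta[k] * T_k(L~) x therefore needs only K matrix products
# instead of a full eigen-decomposition.
import numpy as np

def chebyshev_filter(l_tilde: np.ndarray, x: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """theta[k] weights the k-th Chebyshev polynomial of the rescaled Laplacian."""
    t_prev, t_curr = x.copy(), l_tilde @ x
    out = theta[0] * t_prev + (theta[1] * t_curr if len(theta) > 1 else 0.0)
    for k in range(2, len(theta)):
        t_prev, t_curr = t_curr, 2.0 * (l_tilde @ t_curr) - t_prev
        out = out + theta[k] * t_curr
    return out
```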
arXiv Detail & Related papers (2020-03-16T21:49:34Z)