Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report
- URL: http://arxiv.org/abs/2501.04570v1
- Date: Wed, 08 Jan 2025 15:36:19 GMT
- Title: Large-Scale Spectral Graph Neural Networks via Laplacian Sparsification: Technical Report
- Authors: Haipeng Ding, Zhewei Wei, Yuhang Ye
- Abstract summary: We propose a novel graph spectral sparsification method to approximate the propagation patterns of spectral Graph Neural Networks (GNNs).
Our method allows the application of linear layers on the input node features, enabling end-to-end training as well as the handling of raw features.
- Abstract: Graph Neural Networks (GNNs) play a pivotal role in graph-based tasks for their proficiency in representation learning. Among the various GNN methods, spectral GNNs employing polynomial filters have shown promising performance on tasks involving both homophilous and heterophilous graph structures. However, the scalability of spectral GNNs on large graphs is limited because they learn the polynomial coefficients through multiple propagation executions during forward propagation. Existing works have attempted to scale up spectral GNNs by eliminating the linear layers on the input node features, a change that can disrupt end-to-end training, potentially impact performance, and become impractical with high-dimensional input features. To address the above challenges, we propose "Spectral Graph Neural Networks with Laplacian Sparsification (SGNN-LS)", a novel graph spectral sparsification method to approximate the propagation patterns of spectral GNNs. We prove that our proposed method generates Laplacian sparsifiers that can approximate both fixed and learnable polynomial filters with theoretical guarantees. Our method allows the application of linear layers on the input node features, enabling end-to-end training as well as the handling of raw text features. We conduct an extensive experimental analysis on datasets spanning various graph scales and properties to demonstrate the superior efficiency and effectiveness of our method. The results show that our method yields superior results in comparison with the corresponding approximated base models, especially on the datasets Ogbn-papers100M (111M nodes, 1.6B edges) and MAG-scholar-C (2.8M features).
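The core idea in the abstract can be illustrated with a minimal sketch (this is not the authors' SGNN-LS code): evaluate a polynomial spectral filter (sum of c_k * L^k applied to node features) on a dense graph, then on a randomly edge-sampled sparsifier, and compare the outputs. Uniform edge sampling with unbiased reweighting is used here as a crude stand-in for the effective-resistance-based sampling that true spectral sparsifiers rely on; all function names and parameters below are illustrative.

```python
# Hypothetical sketch of polynomial spectral filtering on a Laplacian
# sparsifier. NOT the paper's implementation; uniform edge sampling
# stands in for effective-resistance sampling.
import numpy as np

rng = np.random.default_rng(0)

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2} for a symmetric adjacency matrix A."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def poly_filter(L, x, coeffs):
    """Evaluate (sum_k coeffs[k] * L^k) x via repeated propagation."""
    out = np.zeros_like(x)
    p = x.copy()
    for c in coeffs:
        out += c * p
        p = L @ p          # one propagation step per polynomial degree
    return out

def sample_sparsifier(A, keep_frac=0.5):
    """Keep each edge with probability keep_frac and reweight kept
    edges by 1/keep_frac so edge weights stay unbiased in expectation."""
    iu, ju = np.triu_indices_from(A, k=1)
    mask = A[iu, ju] > 0
    iu, ju = iu[mask], ju[mask]
    keep = rng.random(len(iu)) < keep_frac
    H = np.zeros_like(A)
    H[iu[keep], ju[keep]] = A[iu[keep], ju[keep]] / keep_frac
    return H + H.T

# Dense random graph and a node feature vector.
n = 60
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T
x = rng.standard_normal((n, 1))
coeffs = [1.0, -0.5, 0.25]   # learnable coefficients in a real spectral GNN

y_exact = poly_filter(normalized_laplacian(A), x, coeffs)
y_approx = poly_filter(normalized_laplacian(sample_sparsifier(A)), x, coeffs)
rel_err = np.linalg.norm(y_exact - y_approx) / np.linalg.norm(y_exact)
print(f"relative error of sparsified filter: {rel_err:.3f}")
```

Note that this toy version recomputes the normalized Laplacian from the sampled graph's own degrees; a guarantee-preserving sparsifier (as in the paper) would control the spectral approximation of L directly.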
Related papers
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with the size of the graphs in the logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - Polynomial Selection in Spectral Graph Neural Networks: An Error-Sum of Function Slices Approach [26.79625547648669]
Spectral graph networks are proposed to harness spectral information inherent in graph neural data through the application of graph filters.
We show that various choices greatly impact spectral GNN performance, underscoring the importance of parameter selection.
We develop an advanced filter based on trigonometric functions, a widely adopted option for approximating narrow signal slices.
arXiv Detail & Related papers (2024-04-15T11:35:32Z) - Spectral Heterogeneous Graph Convolutions via Positive Noncommutative Polynomials [34.74726720818622]
We present Positive Spectral Heterogeneous Graph Convolutional Network (PSHGCN).
PSHGCN offers a simple yet effective method for learning valid heterogeneous graph filters.
PSHGCN exhibits remarkable scalability, efficiently handling large real-world graphs comprising millions of nodes and edges.
arXiv Detail & Related papers (2023-05-31T14:09:42Z) - Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z) - EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
arXiv Detail & Related papers (2022-05-27T10:48:14Z) - How Powerful are Spectral Graph Neural Networks [9.594432031144715]
A Spectral Graph Neural Network is a kind of Graph Neural Network based on graph signal filters.
We first prove that even spectral GNNs without nonlinearity can produce arbitrary graph signals.
We also establish a connection between the expressive power of spectral GNNs and Graph Isomorphism (GI) testing.
arXiv Detail & Related papers (2022-05-23T10:22:12Z) - Pointspectrum: Equivariance Meets Laplacian Filtering for Graph Representation Learning [3.7875603451557063]
Graph Representation Learning (GRL) has become essential for modern graph data mining and learning tasks.
While Graph Neural Networks (GNNs) have been used in state-of-the-art GRL architectures, they have been shown to suffer from oversmoothing.
We propose PointSpectrum, a spectral method that incorporates a set equivariant network to account for a graph's structure.
arXiv Detail & Related papers (2021-09-06T10:59:11Z) - Graph Feature Gating Networks [31.20878472589719]
We propose a general graph feature gating network (GFGN) based on the graph signal denoising problem.
We also introduce three graph filters under GFGN to allow different levels of contributions from feature dimensions.
arXiv Detail & Related papers (2021-05-10T16:33:58Z) - Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z) - Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real world graph datasets for node classification tasks.
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.