LON-GNN: Spectral GNNs with Learnable Orthonormal Basis
- URL: http://arxiv.org/abs/2303.13750v2
- Date: Thu, 30 Mar 2023 02:25:54 GMT
- Title: LON-GNN: Spectral GNNs with Learnable Orthonormal Basis
- Authors: Qian Tao, Zhen Wang, Wenyuan Yu, Yaliang Li, Zhewei Wei
- Abstract summary: In recent years, a plethora of spectral graph neural networks (GNN) methods have utilized basis with learnable coefficients to achieve top-tier performances on many node-level tasks.
We identify the so-called over-passing issue of these methods and show that it is somewhat rooted in their less-principled regularization strategy and unnormalized basis.
We design a novel spectral GNN, LON-GNN, with Learnable OrthoNormal bases, and prove that regularizing coefficients becomes equivalent to regularizing the norm of the learned filter function.
- Score: 40.78119135344598
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, a plethora of spectral graph neural networks (GNN) methods
have utilized polynomial basis with learnable coefficients to achieve top-tier
performances on many node-level tasks. Although various kinds of polynomial
bases have been explored, each such method adopts a fixed polynomial basis
which might not be the optimal choice for the given graph. Besides, we identify
the so-called over-passing issue of these methods and show that it is somewhat
rooted in their less-principled regularization strategy and unnormalized basis.
In this paper, we make the first attempts to address these two issues.
Leveraging Jacobi polynomials, we design a novel spectral GNN, LON-GNN, with
Learnable OrthoNormal bases and prove that regularizing coefficients becomes
equivalent to regularizing the norm of learned filter function now. We conduct
extensive experiments on diverse graph datasets to evaluate the fitting and
generalization capability of LON-GNN, where the results imply its superiority.
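The core mechanism described above, a filter that is a polynomial of the normalized graph Laplacian with learnable coefficients, can be sketched minimally as follows. This sketch uses a Chebyshev basis for concreteness (LON-GNN itself learns orthonormal Jacobi bases); the function names are illustrative, not taken from the paper's code.

```python
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def chebyshev_filter(A, x, coeffs):
    """Apply the spectral filter sum_k c_k T_k(L - I) to signal x via the
    three-term recurrence T_k = 2 z T_{k-1} - T_{k-2}. Shifting L by I maps
    its spectrum from [0, 2] into the Chebyshev domain [-1, 1].
    Expects at least two coefficients."""
    L_hat = normalized_laplacian(A) - np.eye(len(A))
    t_prev, t_curr = x, L_hat @ x              # T_0(L_hat) x and T_1(L_hat) x
    out = coeffs[0] * t_prev + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2 * (L_hat @ t_curr) - t_prev
        out = out + c * t_curr
    return out
```

In a trained spectral GNN, `coeffs` would be the learnable parameters; the paper's point is that penalizing these coefficients directly is only meaningful once the basis is orthonormal, since then the coefficient norm equals the filter-function norm.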
Related papers
- Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs)
We prove a generalization gap between the optimal empirical risk and the optimal statistical risk of this GNN.
The most important observation is that the generalization capability can be realized with one large graph instead of being limited to the size of the graph as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- Generalized Learning of Coefficients in Spectral Graph Convolutional Networks [5.5711773076846365]
Spectral Graph Convolutional Networks (GCNs) have gained popularity in graph machine learning applications.
G-Arnoldi-GCN consistently outperforms state-of-the-art methods when suitable functions are employed.
arXiv Detail & Related papers (2024-09-07T12:53:44Z)
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of graph spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Simplifying GNN Performance with Low Rank Kernel Models [14.304623719903972]
We revisit recent spectral GNN approaches to semi-supervised node classification (SSNC)
We show that recent performance improvements in GNN approaches may be partially attributed to shifts in evaluation conventions.
arXiv Detail & Related papers (2023-10-08T17:56:30Z)
- Graph Neural Networks with Learnable and Optimal Polynomial Bases [14.8308791628821]
Polynomial filters, a kind of Graph Neural Network, typically use a predetermined basis and learn the coefficients from the training data.
Can we learn a suitable basis from the training data?
Can we determine the optimal basis for a given graph and a node?
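One way such a learnable basis could be parameterized is to expose the coefficients of a three-term recurrence as trainable parameters; by Favard's theorem, any such recurrence with positive leading coefficients generates a polynomial family orthonormal under some weight function. The sketch below is a hypothetical illustration of that idea, not this paper's actual implementation:

```python
import numpy as np

def learnable_basis_filter(L, x, gammas, sqrt_betas, coeffs):
    """Filter signal x with a polynomial basis defined by the recurrence
        sqrt_betas[k] * p_k = (L - gammas[k-1]) p_{k-1} - sqrt_betas[k-1] p_{k-2},
    starting from p_{-1} = 0 and p_0 = x / sqrt_betas[0].
    Treating gammas and sqrt_betas as trainable parameters amounts to
    learning the basis itself (illustrative sketch, with assumed names)."""
    p_prev = np.zeros_like(x)
    p_curr = x / sqrt_betas[0]
    out = coeffs[0] * p_curr
    for k in range(1, len(coeffs)):
        p_next = (L @ p_curr - gammas[k - 1] * p_curr
                  - sqrt_betas[k - 1] * p_prev) / sqrt_betas[k]
        p_prev, p_curr = p_curr, p_next
        out = out + coeffs[k] * p_curr
    return out
```

With all `gammas` set to 0 and all `sqrt_betas` set to 1 this degenerates to a simple monic recurrence; in a GNN, a gradient step on these parameters moves the basis toward one suited to the graph at hand.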
arXiv Detail & Related papers (2023-02-24T03:24:04Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- How Powerful are Spectral Graph Neural Networks [9.594432031144715]
Spectral Graph Neural Network is a kind of Graph Neural Network based on graph signal filters.
We first prove that even spectral GNNs without nonlinearity can produce arbitrary graph signals.
We also establish a connection between the expressive power of spectral GNNs and Graph Isomorphism (GI) testing.
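This universality claim reduces to polynomial interpolation on the eigenvalues: when the Laplacian has distinct eigenvalues and the input signal has no zero spectral component, a polynomial filter can be solved for exactly. A minimal numpy sketch, with illustrative function names:

```python
import numpy as np

def fit_polynomial_filter(L, x, target):
    """Solve for coefficients c such that sum_k c_k L^k x == target.
    Assumes L is symmetric with distinct eigenvalues and x has no zero
    spectral component -- the conditions under which linear spectral
    GNNs can produce arbitrary graph signals."""
    w, U = np.linalg.eigh(L)
    x_hat = U.T @ x                       # spectral coefficients of input
    t_hat = U.T @ target                  # spectral coefficients of target
    g = t_hat / x_hat                     # required filter response per eigenvalue
    V = np.vander(w, increasing=True)     # V[i, k] = w_i ** k
    return np.linalg.solve(V, g)          # interpolate g at the eigenvalues
```

The Vandermonde system is exactly solvable because distinct eigenvalues make `V` nonsingular; this is the sense in which nonlinearity is not needed for expressiveness.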
arXiv Detail & Related papers (2022-05-23T10:22:12Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.