Graph Neural Networks with Learnable and Optimal Polynomial Bases
- URL: http://arxiv.org/abs/2302.12432v2
- Date: Sat, 1 Jul 2023 03:09:13 GMT
- Title: Graph Neural Networks with Learnable and Optimal Polynomial Bases
- Authors: Yuhe Guo and Zhewei Wei
- Abstract summary: Polynomial filters, a kind of Graph Neural Network, typically use a predetermined basis and learn the coefficients from the training data.
Can we learn a suitable basis from the training data?
Can we determine the optimal basis for a given graph and a node?
- Score: 14.8308791628821
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Polynomial filters, a kind of Graph Neural Networks, typically use a
predetermined polynomial basis and learn the coefficients from the training
data. It has been observed that the effectiveness of the model is highly
dependent on the property of the polynomial basis. Consequently, two natural
and fundamental questions arise: Can we learn a suitable polynomial basis from
the training data? Can we determine the optimal polynomial basis for a given
graph and node features?
In this paper, we propose two spectral GNN models that provide positive
answers to the questions posed above. First, inspired by Favard's Theorem, we
propose the FavardGNN model, which learns a polynomial basis from the space of
all possible orthonormal bases. Second, we examine the supposedly unsolvable
definition of optimal polynomial basis from Wang & Zhang (2022) and propose a
simple model, OptBasisGNN, which computes the optimal basis for a given graph
structure and graph signal. Extensive experiments are conducted to demonstrate
the effectiveness of our proposed models. Our code is available at
https://github.com/yuziGuo/FarOptBasis.
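To make the first idea concrete: Favard's Theorem says that any three-term recurrence with real shifts and strictly positive scales generates a family of polynomials orthonormal with respect to some positive measure, so learning the recurrence parameters amounts to searching the space of all orthonormal bases. Below is a minimal numpy sketch of such a filter; the function name, toy graph, and randomly initialized parameters are illustrative stand-ins, not the authors' implementation (see the repository above for that).
```python
import numpy as np

def favard_filter(L_hat, x, gamma, sqrt_beta, alpha):
    """Polynomial filter whose basis comes from a learnable three-term
    recurrence; by Favard's Theorem, any real gamma and positive sqrt_beta
    yield a basis orthonormal w.r.t. some positive measure (sketch only)."""
    p_prev = np.zeros_like(x)          # p_{-1} = 0
    p_curr = x / sqrt_beta[0]          # p_0
    out = alpha[0] * p_curr
    for k in range(len(gamma)):
        # sqrt_beta[k+1] * p_{k+1} = (L_hat - gamma[k] I) p_k - sqrt_beta[k] p_{k-1}
        p_next = (L_hat @ p_curr - gamma[k] * p_curr
                  - sqrt_beta[k] * p_prev) / sqrt_beta[k + 1]
        out = out + alpha[k + 1] * p_next
        p_prev, p_curr = p_curr, p_next
    return out

# Toy 4-node path graph and a delta signal
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
deg = A.sum(axis=1)
L_hat = np.eye(4) - A / np.sqrt(np.outer(deg, deg))  # normalized Laplacian
x = np.array([1., 0., 0., 0.])

rng = np.random.default_rng(0)
K = 3
gamma = rng.normal(size=K)                        # learned in practice
sqrt_beta = np.abs(rng.normal(size=K + 1)) + 0.1  # learned, kept positive
alpha = rng.normal(size=K + 1)                    # learned filter coefficients
print(favard_filter(L_hat, x, gamma, sqrt_beta, alpha))
```
OptBasisGNN replaces the learned recurrence with a basis computed directly from the given graph structure and graph signal; see the paper for that construction.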
Related papers
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophily Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in-depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z)
- GINN-LP: A Growing Interpretable Neural Network for Discovering Multivariate Laurent Polynomial Equations [1.1142444517901018]
We propose GINN-LP, an interpretable neural network, to discover the form of a Laurent Polynomial equation.
To the best of our knowledge, this is the first neural network that can discover arbitrary terms without any prior information on the order.
We show that GINN-LP outperforms the state-of-the-art symbolic regression methods on benchmark datasets.
arXiv Detail & Related papers (2023-12-18T03:44:29Z)
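For orientation, a Laurent polynomial allows negative exponents, e.g. f(x, y) = 2x^2 y^-1 + 3x^-1. The sketch below evaluates such terms in the log domain, exp(sum_j a_j ln x_j), a common way to represent learnable real-valued exponents over positive inputs; it illustrates the target function class, not GINN-LP's actual architecture.
```python
import numpy as np

def power_term(X, exponents):
    # One Laurent term prod_j x_j ** a_j, evaluated in the log domain as
    # exp(sum_j a_j * ln x_j); negative exponents come for free, but the
    # inputs must be positive.
    return np.exp(np.log(X) @ exponents)

def laurent_poly(X, exponents, coeffs):
    # f(x) = sum_t c_t * prod_j x_j ** a_{t,j}
    terms = np.stack([power_term(X, a) for a in exponents], axis=1)
    return terms @ coeffs

# Example: f(x, y) = 2 * x**2 / y + 3 / x, evaluated at two points
X = np.array([[1.0, 2.0],
              [2.0, 4.0]])
exponents = [np.array([2.0, -1.0]), np.array([-1.0, 0.0])]
coeffs = np.array([2.0, 3.0])
print(laurent_poly(X, exponents, coeffs))  # [4.0, 3.5]
```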
- An Effective Universal Polynomial Basis for Spectral Graph Neural Networks [12.725906836609811]
Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophily graphs.
We develop an adaptive heterophily basis by incorporating graph heterophily degrees.
We then integrate this heterophily basis with the homophily basis, creating a universal basis UniBasis.
arXiv Detail & Related papers (2023-11-30T01:48:42Z)
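The summary above describes blending a homophily-oriented basis with a heterophily-oriented one. A loose numpy sketch of that blending with a hypothetical mixing weight tau; the paper's actual adaptive construction, driven by estimated graph heterophily degrees, is more involved.
```python
import numpy as np

def blended_basis(P, x, h_basis, tau):
    """Blend homophilous directions (repeated propagation P @ v) with a given
    heterophily basis h_basis, using a mixing weight tau in [0, 1]; tau is a
    hypothetical stand-in for the paper's heterophily-aware weighting."""
    basis, v = [], x / np.linalg.norm(x)
    for h_k in h_basis:
        v = P @ v                         # homophilous (smoothing) direction
        u = tau * v + (1.0 - tau) * h_k   # convex combination of the two bases
        u = u / (np.linalg.norm(u) + 1e-12)
        basis.append(u)
    return np.stack(basis, axis=1)        # columns are the blended basis

P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])                  # toy propagation matrix
x = np.array([1.0, -1.0, 0.0])
rng = np.random.default_rng(1)
h_basis = [rng.normal(size=3) for _ in range(2)] # stand-in heterophily basis
print(blended_basis(P, x, h_basis, tau=0.7))
```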
- LON-GNN: Spectral GNNs with Learnable Orthonormal Basis [40.78119135344598]
In recent years, a plethora of spectral graph neural network (GNN) methods have utilized polynomial bases with learnable coefficients to achieve top-tier performance on many node-level tasks.
We identify the so-called over-passing issue of these methods and show that it is somewhat rooted in their less-principled regularization strategy and unnormalized basis.
We design a novel spectral GNN, LON-GNN, with Learnable OrthoNormal bases, and prove that regularizing the coefficients becomes equivalent to regularizing the norm of the learned filter function.
arXiv Detail & Related papers (2023-03-24T02:07:46Z)
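The coefficient-to-filter-norm equivalence is a Parseval-type identity: once the basis is orthonormal under the graph's spectral measure, the L2 norm of the coefficient vector equals the norm of the filter function itself. A small numpy check, with mock eigenvalues and weights standing in for a real spectrum:
```python
import numpy as np

rng = np.random.default_rng(0)
lam = rng.uniform(0.0, 2.0, size=20)   # mock Laplacian eigenvalues
w = rng.dirichlet(np.ones(20))         # mock spectral weights (discrete measure)

# Gram-Schmidt on monomials 1, lam, ..., lam^4 under <f, g> = sum_i w_i f_i g_i
V = np.stack([lam**k for k in range(5)], axis=1)
Q = np.zeros_like(V)
for k in range(V.shape[1]):
    q = V[:, k].copy()
    for j in range(k):
        q -= np.sum(w * q * Q[:, j]) * Q[:, j]
    Q[:, k] = q / np.sqrt(np.sum(w * q * q))

coef = rng.normal(size=5)              # filter coefficients in the orthonormal basis
h = Q @ coef                           # filter values h(lam_i)

# Orthonormality makes the two norms coincide, so penalizing the coefficients
# is the same as penalizing the learned filter function's norm.
print(np.sum(w * h * h), np.sum(coef**2))
```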
- Graph Polynomial Convolution Models for Node Classification of Non-Homophilous Graphs [52.52570805621925]
We investigate efficient learning from higher-order graph convolution and learning directly from the adjacency matrix for node classification.
We show that the resulting model leads to new graphs and a residual scaling parameter.
We demonstrate that the proposed methods obtain improved accuracy for node classification of non-homophilous graphs.
arXiv Detail & Related papers (2022-09-12T04:46:55Z)
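A generic sketch of the two ingredients the summary names, higher-order propagation through powers of the adjacency matrix plus a residual scaling parameter; the paper's exact parameterization may differ.
```python
import numpy as np

def higher_order_conv(A_hat, X, weights, residual_scale):
    """Combine A_hat^k @ X for k = 1..K with learned weights, plus a scaled
    residual of the raw features (illustrative, not the paper's exact model)."""
    H, out = X, residual_scale * X
    for w_k in weights:          # one weight per additional hop of propagation
        H = A_hat @ H
        out = out + w_k * H
    return out

A_hat = np.array([[0.5, 0.5],
                  [0.5, 0.5]])               # toy normalized adjacency
X = np.array([[1.0], [0.0]])
print(higher_order_conv(A_hat, X, weights=[0.6, 0.3], residual_scale=0.5))
```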
- Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited [80.29747781984135]
ChebNet is one of the early attempts to approximate spectral graph convolutions using Chebyshev polynomials.
GCN simplifies ChebNet by utilizing only the first two Chebyshev polynomials while still outperforming it on real-world datasets.
We propose ChebNetII, a new GNN model based on Chebyshev interpolation, which enhances the original Chebyshev polynomial approximation while reducing the Runge phenomenon.
arXiv Detail & Related papers (2022-02-04T09:11:13Z)
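For reference, the classic ChebNet-style truncated expansion that this line of work builds on, using the three-term Chebyshev recurrence; ChebNetII's refinement, reparameterizing the coefficients via Chebyshev interpolation, is not shown here.
```python
import numpy as np

def cheb_filter(L, x, w, lam_max=2.0):
    """ChebNet-style filter: sum_k w_k T_k(L_tilde) x, with T_0(t) = 1,
    T_1(t) = t and T_k(t) = 2 t T_{k-1}(t) - T_{k-2}(t); the Laplacian is
    rescaled so its spectrum lies in [-1, 1]."""
    L_tilde = (2.0 / lam_max) * L - np.eye(L.shape[0])
    t_prev, t_curr = x, L_tilde @ x
    out = w[0] * t_prev + w[1] * t_curr
    for w_k in w[2:]:
        t_prev, t_curr = t_curr, 2.0 * (L_tilde @ t_curr) - t_prev
        out = out + w_k * t_curr
    return out

# Toy 3-node path graph: normalized Laplacian has spectrum in [0, 2]
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
deg = A.sum(axis=1)
L = np.eye(3) - A / np.sqrt(np.outer(deg, deg))
x = np.array([1., 0., 0.])
print(cheb_filter(L, x, w=[0.5, 0.3, 0.2]))
```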
- A Piece-wise Polynomial Filtering Approach for Graph Neural Networks [0.45298395481707365]
Graph Neural Networks (GNNs) exploit signals from node features and the input graph topology to improve node classification task performance.
These models tend to perform poorly on heterophilic graphs, where connected nodes have different labels.
We show that our model achieves performance gains of up to 5% over state-of-the-art models and outperforms existing filter-based approaches in general.
arXiv Detail & Related papers (2021-12-07T05:16:53Z)
- Graph Entropy Guided Node Embedding Dimension Selection for Graph Neural Networks [74.26734952400925]
We propose a novel Minimum Graph Entropy (MinGE) algorithm for Node Embedding Dimension Selection (NEDS).
MinGE considers both feature entropy and structure entropy on graphs, which are carefully designed according to the characteristics of the rich information in them.
Experiments with popular Graph Neural Networks (GNNs) on benchmark datasets demonstrate the effectiveness and generalizability of our proposed MinGE.
arXiv Detail & Related papers (2021-05-07T11:40:29Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework, AdaGNN, with a well-designed adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve significant performance for various learning tasks on geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z)
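GraphSVX is grounded in Shapley values from cooperative game theory. As a reference point, here is an exact brute-force Shapley computation on a toy value function; its exponential cost in the number of players is what a practical explainer must approximate on graphs. The toy game below is illustrative, not GraphSVX's actual decomposition.
```python
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    # Exact Shapley value: phi_i = sum over coalitions S without i of
    # |S|! * (n - |S| - 1)! / n! * (v(S + {i}) - v(S)).
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for r in range(n):
            for S in combinations(others, r):
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (v(frozenset(S) | {i}) - v(frozenset(S)))
        phi[i] = total
    return phi

# Toy "model output" for a coalition of features/neighbors
v = lambda S: 3.0 * ('a' in S) + 1.0 * ('b' in S) + 2.0 * ('a' in S and 'c' in S)
print(shapley_values(['a', 'b', 'c'], v))  # {'a': 4.0, 'b': 1.0, 'c': 1.0}
```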
- Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable parameters, we can build a highly adaptive and robust graph classification model.
arXiv Detail & Related papers (2020-11-22T11:20:14Z)
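A compact sketch of the stacking idea: each layer applies a simple learnable first-order filter, and composing K layers yields a degree-K polynomial in the propagation matrix (up to the nonlinearities). Parameters and the toy graph are illustrative, not the paper's model.
```python
import numpy as np

def stacked_graph_filter(A_hat, X, layers):
    """Compose simple learnable filters a*I + b*A_hat with a ReLU between
    layers; stacking K layers applies a degree-K polynomial in A_hat,
    up to the nonlinearities (illustrative sketch)."""
    H = X
    for a, b in layers:
        H = np.maximum(0.0, a * H + b * (A_hat @ H))
    return H

A_hat = np.array([[0.5, 0.5],
                  [0.5, 0.5]])                 # toy normalized adjacency
X = np.array([[1.0], [0.0]])
print(stacked_graph_filter(A_hat, X, layers=[(1.0, 0.5), (0.8, -0.3)]))
```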
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.