From ChebNet to ChebGibbsNet
- URL: http://arxiv.org/abs/2412.01789v1
- Date: Mon, 02 Dec 2024 18:37:45 GMT
- Title: From ChebNet to ChebGibbsNet
- Authors: Jie Zhang, Min-Te Sun
- Abstract summary: We analyze the potential of Spectral Graph Convolutional Networks (SpecGCNs) via polynomial interpolation.
The weakness of ChebNet is traced to the Gibbs phenomenon, which we mitigate by adding a Gibbs damping factor to each term of the Chebyshev polynomials.
Our experiments indicate that the resulting ChebGibbsNet is superior to other advanced SpecGCNs, such as GPR-GNN and BernNet, on both homogeneous and heterogeneous graphs.
- Score: 4.600280055873605
- Abstract: Recent advancements in Spectral Graph Convolutional Networks (SpecGCNs) have led to state-of-the-art performance in various graph representation learning tasks. To exploit the potential of SpecGCNs, we analyze the corresponding graph filters via polynomial interpolation, the cornerstone of graph signal processing. Different polynomial bases, such as the Bernstein, Chebyshev, and monomial bases, have different convergence rates that affect the error of polynomial interpolation. Although adopting the Chebyshev basis for interpolation minimizes the maximum error, the performance of ChebNet is still weaker than that of GPR-GNN and BernNet. **We point out that this is caused by the Gibbs phenomenon, which occurs when the graph frequency response function approximates the target function.** It reduces the approximation ability of a truncated polynomial interpolation. To mitigate the Gibbs phenomenon, we propose adding a Gibbs damping factor to each term of the Chebyshev polynomials in ChebNet. As a result, our lightweight approach leads to a significant performance boost. Afterwards, we reorganize ChebNet by decoupling feature propagation and transformation. We name this variant **ChebGibbsNet**. Our experiments indicate that ChebGibbsNet is superior to other advanced SpecGCNs, such as GPR-GNN and BernNet, on both homogeneous and heterogeneous graphs.
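To make the damping idea concrete, here is a minimal NumPy sketch of a damped, truncated Chebyshev graph filter. The abstract does not specify which damping factors the paper uses, so the Jackson factors below (a standard choice for suppressing Gibbs oscillations, known from the kernel polynomial method) and all function and variable names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def jackson_damping(K):
    """Jackson damping factors g_0..g_K, one standard Gibbs damping
    choice for truncated Chebyshev expansions (assumed here)."""
    k = np.arange(K + 1)
    c = np.pi / (K + 2)
    return ((K + 2 - k) * np.cos(k * c) + np.sin(k * c) / np.tan(c)) / (K + 2)

def cheb_gibbs_propagate(L_tilde, x, w, damping=jackson_damping):
    """Damped Chebyshev propagation: y = sum_k g_k * w_k * T_k(L_tilde) @ x.

    L_tilde : (n, n) rescaled graph Laplacian with spectrum in [-1, 1]
    x       : (n, d) node features
    w       : coefficients w_0..w_K (learnable in a real model)
    """
    K = len(w) - 1
    g = damping(K)
    T_prev = x                        # T_0(L~) x = x
    y = g[0] * w[0] * T_prev
    if K == 0:
        return y
    T_curr = L_tilde @ x              # T_1(L~) x = L~ x
    y = y + g[1] * w[1] * T_curr
    for k in range(2, K + 1):         # recurrence: T_k = 2 L~ T_{k-1} - T_{k-2}
        T_prev, T_curr = T_curr, 2 * (L_tilde @ T_curr) - T_prev
        y = y + g[k] * w[k] * T_curr
    return y

# Toy usage: path graph on 4 nodes, K = 3 hops.
A = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], float)
d_inv_sqrt = np.diag(A.sum(1) ** -0.5)
L = np.eye(4) - d_inv_sqrt @ A @ d_inv_sqrt   # normalized Laplacian, spectrum in [0, 2]
L_tilde = L - np.eye(4)                       # shift spectrum to [-1, 1] (lambda_max ~ 2)
y = cheb_gibbs_propagate(L_tilde, np.random.randn(4, 2), w=[1.0, 0.5, 0.25, 0.125])
```

In the decoupled variant the abstract describes, `x` would be the output of a separate feature-transformation network, with this propagation applied afterwards.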
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs).
S$^2$GNNs combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
- Polynomial Selection in Spectral Graph Neural Networks: An Error-Sum of Function Slices Approach [26.79625547648669]
Spectral graph neural networks are proposed to harness the spectral information inherent in graph data through the application of graph filters.
We show that different polynomial choices greatly impact spectral GNN performance, underscoring the importance of polynomial selection.
We develop an advanced filter based on trigonometric functions, a widely adopted option for approximating narrow signal slices.
arXiv Detail & Related papers (2024-04-15T11:35:32Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited [80.29747781984135]
ChebNet is one of the early attempts to approximate spectral graph convolutions using Chebyshev polynomials.
GCN simplifies ChebNet by utilizing only the first two Chebyshev polynomials while still outperforming it on real-world datasets (a minimal sketch of this simplification appears after this list).
We propose ChebNetII, a new GNN model based on Chebyshev interpolation, which enhances the original Chebyshev polynomial approximation while reducing the Runge phenomenon.
arXiv Detail & Related papers (2022-02-04T09:11:13Z)
- Node Feature Kernels Increase Graph Convolutional Network Robustness [19.076912727990326]
The robustness of Graph Convolutional Networks (GCNs) to perturbations of their input is becoming a topic of increasing importance.
In this paper, a random matrix theory analysis of this robustness is conducted.
It is observed that enhancing the message passing step in GCNs by adding the node feature kernel to the adjacency matrix of the graph structure solves this problem.
arXiv Detail & Related papers (2021-09-04T04:20:45Z)
- Multi-hop Graph Convolutional Network with High-order Chebyshev Approximation for Text Reasoning [15.65069702939315]
We define a spectral graph convolutional network with high-order dynamic Chebyshev approximation (HDGCN).
To alleviate over-smoothing in the high-order Chebyshev approximation, a multi-vote-based cross-attention mechanism (MVCAttn) with linear computational complexity is also proposed.
arXiv Detail & Related papers (2021-06-08T07:49:43Z)
- Factorizable Graph Convolutional Networks [90.59836684458905]
We introduce FactorGCN, a novel graph convolutional network (GCN) that explicitly disentangles intertwined relations encoded in a graph.
FactorGCN takes a simple graph as input and disentangles it into several factorized graphs.
We evaluate the proposed FactorGCN both qualitatively and quantitatively on the synthetic and real-world datasets.
arXiv Detail & Related papers (2020-10-12T03:01:40Z)
- Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
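As referenced in the ChebNetII entry above, here is a minimal NumPy sketch of how GCN collapses the first two Chebyshev terms into a single normalized-adjacency multiplication. This follows the standard Kipf & Welling derivation; the names and the toy setup are illustrative.

```python
import numpy as np

def gcn_propagate(A, x):
    """GCN's first-order simplification of the Chebyshev filter.

    Starting from w_0 * T_0(L~)x + w_1 * T_1(L~)x with lambda_max ~ 2
    and tying w = w_0 = -w_1, the filter reduces to
    w * (I + D^{-1/2} A D^{-1/2}) x, which GCN renormalizes by adding
    self-loops before symmetric normalization.
    """
    A_hat = A + np.eye(A.shape[0])                 # renormalization trick: add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # entries of D_hat^{-1/2}
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return A_norm @ x                              # one propagation step (weights omitted)
```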
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.