An Effective Universal Polynomial Basis for Spectral Graph Neural
Networks
- URL: http://arxiv.org/abs/2311.18177v2
- Date: Tue, 5 Mar 2024 12:07:34 GMT
- Title: An Effective Universal Polynomial Basis for Spectral Graph Neural
Networks
- Authors: Keke Huang, Pietro Liò
- Abstract summary: Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophily graphs.
We develop an adaptive heterophily basis by incorporating graph heterophily degrees.
We then integrate this heterophily basis with the homophily basis, creating a universal basis UniBasis.
- Score: 12.725906836609811
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Spectral Graph Neural Networks (GNNs), also referred to as graph
filters, have gained increasing prevalence for heterophily graphs. Optimal
graph filters rely on Laplacian eigendecomposition for the Fourier transform.
To avert these prohibitive computations, numerous polynomial filters, each
leveraging a distinct polynomial basis, have been proposed to approximate the
desired graph filters. However, the polynomials in most polynomial filters are
predefined and remain fixed across all graphs, failing to accommodate the
diverse heterophily degrees of different graphs. To tackle this issue, we
first investigate, via a thorough theoretical analysis, the correlation
between the polynomial bases of desired graph filters and the degrees of
graph heterophily. We then develop an adaptive heterophily basis that
incorporates graph heterophily degrees. Subsequently, we integrate this
heterophily basis with the homophily basis, creating a universal polynomial
basis, UniBasis, from which we devise a general polynomial filter, UniFilter.
Comprehensive experiments on both real-world and synthetic datasets with
varying heterophily degrees strongly support the superiority of UniFilter,
demonstrating the effectiveness and generality of UniBasis as well as its
promising capability as a new method for graph analysis.
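To make the eigendecomposition-free filtering described in the abstract concrete, the sketch below applies a polynomial filter with a fixed monomial basis, y = Σ_k w_k P^k x, using only repeated matrix-vector products with the normalized adjacency P. This is a minimal illustration of the general polynomial-filter idea, not the paper's adaptive UniBasis/UniFilter; the toy graph, signal, and weights are invented for demonstration.

```python
import numpy as np

def sym_norm_adj(A):
    """Symmetrically normalized adjacency with self-loops:
    P = D^{-1/2} (A + I) D^{-1/2}."""
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def poly_filter(A, x, weights):
    """Compute y = sum_k weights[k] * P^k x without any eigendecomposition,
    using only K matrix-vector products (sparse-friendly in practice)."""
    P = sym_norm_adj(A)
    z = x.copy()                # P^0 x
    y = weights[0] * z
    for w in weights[1:]:
        z = P @ z               # advance to the next basis vector P^k x
        y = y + w * z
    return y

# Toy 4-node path graph and an alternating node signal.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, -1.0, 1.0, -1.0])
y = poly_filter(A, x, weights=[0.5, 0.3, 0.2])
```

Because the filter is linear in x and each step is a single matvec, the cost is O(K·|E|) for K terms, which is the efficiency argument that motivates polynomial filters over exact spectral ones.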
Related papers
- Addressing Heterogeneity and Heterophily in Graphs: A Heterogeneous Heterophilic Spectral Graph Neural Network [48.05273145974434]
We propose a Heterogeneous Heterophilic Spectral Graph Neural Network (H2SGNN).
H2SGNN employs a dual-module approach: local independent filtering and global hybrid filtering.
Extensive empirical evaluations on four real-world datasets demonstrate the superiority of H2SGNN compared to state-of-the-art methods.
arXiv Detail & Related papers (2024-10-17T09:23:53Z)
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach [58.8524608686851]
Graph Neural Networks (GNNs) have proven to be highly effective for node classification tasks across diverse graph structural patterns.
Traditionally, GNNs employ a uniform global filter, typically a low-pass filter for homophilic graphs and a high-pass filter for heterophilic graphs.
We introduce a novel GNN framework Node-MoE that utilizes a mixture of experts to adaptively select the appropriate filters for different nodes.
arXiv Detail & Related papers (2024-06-05T17:12:38Z)
- How Universal Polynomial Bases Enhance Spectral Graph Neural Networks: Heterophily, Over-smoothing, and Over-squashing [24.857304431611464]
Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophily graphs.
In an attempt to avert prohibitive computations, numerous filters have been proposed.
We demystify the correlation between the spectral property of desired vectors and the heterophily degrees.
We develop a novel adaptive heterophily basis whose basis vectors mutually form angles reflecting the heterophily degree of the graph.
arXiv Detail & Related papers (2024-05-21T03:28:45Z)
- Optimizing Polynomial Graph Filters: A Novel Adaptive Krylov Subspace Approach [36.06398179717066]
We develop an adaptive graph filter based on Krylov subspaces to filter complex graphs.
We conduct extensive experiments across a series of real-world datasets.
arXiv Detail & Related papers (2024-03-12T06:26:17Z)
- Improving Expressive Power of Spectral Graph Neural Networks with Eigenvalue Correction [55.57072563835959]
Spectral graph neural networks are characterized by their filters.
We propose an eigenvalue correction strategy that can free filters from the constraints of repeated eigenvalue inputs.
arXiv Detail & Related papers (2024-01-28T08:12:00Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Automated Polynomial Filter Learning for Graph Neural Networks [9.120531252536617]
Polynomial graph filters have been widely used as guiding principles in the design of Graph Neural Networks (GNNs).
Recently, the adaptive learning of the graph filters has demonstrated promising performance for modeling graph signals on both homophilic and heterophilic graphs.
We propose Auto-Polynomial, a novel and general automated graph filter learning framework that efficiently learns better filters capable of adapting to various complex graph signals.
arXiv Detail & Related papers (2023-07-16T06:14:12Z)
- Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
arXiv Detail & Related papers (2023-03-02T07:36:23Z)
- Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable solution parameters, we can build a highly adaptive and robust graph classification model.
arXiv Detail & Related papers (2020-11-22T11:20:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.