Optimizing Polynomial Graph Filters: A Novel Adaptive Krylov Subspace Approach
- URL: http://arxiv.org/abs/2403.07954v2
- Date: Tue, 21 May 2024 02:47:03 GMT
- Title: Optimizing Polynomial Graph Filters: A Novel Adaptive Krylov Subspace Approach
- Authors: Keke Huang, Wencai Cao, Hoang Ta, Xiaokui Xiao, Pietro Liò
- Abstract summary: We develop an adaptive graph filter based on Krylov subspaces to filter complex graphs.
We conduct extensive experiments across a series of real-world datasets.
- Score: 36.06398179717066
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Graph Neural Networks (GNNs), known as spectral graph filters, find a wide range of applications in web networks. To bypass eigendecomposition, polynomial graph filters are proposed to approximate graph filters by leveraging various polynomial bases for filter training. However, no existing studies have explored the diverse polynomial graph filters from a unified perspective for optimization. In this paper, we first unify polynomial graph filters, as well as the optimal filters of identical degrees into the Krylov subspace of the same order, thus providing equivalent expressive power theoretically. Next, we investigate the asymptotic convergence property of polynomials from the unified Krylov subspace perspective, revealing their limited adaptability in graphs with varying heterophily degrees. Inspired by those facts, we design a novel adaptive Krylov subspace approach to optimize polynomial bases with provable controllability over the graph spectrum so as to adapt various heterophily graphs. Subsequently, we propose AdaptKry, an optimized polynomial graph filter utilizing bases from the adaptive Krylov subspaces. Meanwhile, in light of the diverse spectral properties of complex graphs, we extend AdaptKry by leveraging multiple adaptive Krylov bases without incurring extra training costs. As a consequence, extended AdaptKry is able to capture the intricate characteristics of graphs and provide insights into their inherent complexity. We conduct extensive experiments across a series of real-world datasets. The experimental results demonstrate the superior filtering capability of AdaptKry, as well as the optimized efficacy of the adaptive Krylov basis.
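The unification above rests on the fact that any degree-(K-1) polynomial filter p(P)x is a linear combination of the Krylov basis vectors {x, Px, ..., P^{K-1}x}. A minimal NumPy sketch of that construction follows; the toy graph, signal, and weight values are illustrative inventions, not taken from the paper, and the weights w would in practice be learned during filter training.

```python
import numpy as np

# Illustrative toy graph given by its adjacency matrix (values made up).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
deg = A.sum(axis=1)
# Symmetrically normalized propagation matrix P = D^{-1/2} A D^{-1/2}.
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
P = D_inv_sqrt @ A @ D_inv_sqrt

def krylov_basis(P, x, order):
    """Return the order-K Krylov basis [x, Px, P^2 x, ..., P^{K-1} x]."""
    basis = [x]
    for _ in range(order - 1):
        basis.append(P @ basis[-1])
    return np.stack(basis, axis=0)  # shape (K, n)

x = np.array([1.0, -1.0, 0.5, 2.0])  # a toy graph signal
K = 4
B = krylov_basis(P, x, K)

# Any degree-(K-1) polynomial filter p(P) x = sum_k w_k P^k x is a linear
# combination of these basis vectors; w stands in for trained coefficients.
w = np.array([0.5, 0.3, 0.15, 0.05])  # illustrative, not learned values
filtered = w @ B
```

Each new basis vector costs one sparse matrix-vector product, which is why polynomial filters avoid the eigendecomposition the abstract mentions.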
Related papers
- Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach [58.8524608686851]
Graph Neural Networks (GNNs) have proven to be highly effective for node classification tasks across diverse graph structural patterns.
Traditionally, GNNs employ a uniform global filter, typically a low-pass filter for homophilic graphs and a high-pass filter for heterophilic graphs.
We introduce a novel GNN framework Node-MoE that utilizes a mixture of experts to adaptively select the appropriate filters for different nodes.
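The low-pass/high-pass distinction above can be sketched in a few lines of NumPy. This is a generic illustration of the two filter types on a made-up path graph, not the Node-MoE method itself: a low-pass filter preserves the lowest graph frequency (the smooth component favored on homophilic graphs) while a high-pass filter suppresses it.

```python
import numpy as np

# Illustrative 4-node path graph (adjacency values made up).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
P = np.diag(1 / np.sqrt(d)) @ A @ np.diag(1 / np.sqrt(d))
I = np.eye(4)

low_pass = 0.5 * (I + P)   # attenuates high graph frequencies (smooths)
high_pass = 0.5 * (I - P)  # attenuates low graph frequencies (sharpens)

# v = D^{1/2} 1 is the eigenvector of P with eigenvalue 1, i.e. the
# lowest-frequency component: low_pass keeps it, high_pass kills it.
v = np.sqrt(d)
v = v / np.linalg.norm(v)
```

A mixture-of-experts model, as summarized above, would choose between such responses per node rather than applying one filter globally.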
arXiv Detail & Related papers (2024-06-05T17:12:38Z) - How Universal Polynomial Bases Enhance Spectral Graph Neural Networks: Heterophily, Over-smoothing, and Over-squashing [24.857304431611464]
Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophily graphs.
In an attempt to avert prohibitive computations, numerous filters have been proposed.
We demystify the correlation between the spectral property of desired vectors and the heterophily degrees.
We develop a novel adaptive heterophily basis wherein the basis vectors mutually form angles reflecting the heterophily degree of the graph.
arXiv Detail & Related papers (2024-05-21T03:28:45Z) - An Effective Universal Polynomial Basis for Spectral Graph Neural Networks [12.725906836609811]
Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophily graphs.
We develop an adaptive heterophily basis by incorporating graph heterophily degrees.
We then integrate this heterophily basis with the homophily basis, creating a universal basis UniBasis.
arXiv Detail & Related papers (2023-11-30T01:48:42Z) - HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z) - Automated Polynomial Filter Learning for Graph Neural Networks [9.120531252536617]
Polynomial graph filters have been widely used as guiding principles in the design of Graph Neural Networks (GNNs).
Recently, the adaptive learning of the graph filters has demonstrated promising performance for modeling graph signals on both homophilic and heterophilic graphs.
We propose Auto-Polynomial, a novel and general automated graph filter learning framework that efficiently learns better filters capable of adapting to various complex graph signals.
arXiv Detail & Related papers (2023-07-16T06:14:12Z) - Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
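The spectral-domain convolution that Specformer builds on can be sketched directly: filter a signal as y = U g(Λ) Uᵀ x using the eigendecomposition of the normalized Laplacian. The toy graph and filter response below are illustrative assumptions, not the Specformer architecture, which replaces the fixed g with a learned, attention-based encoding of the eigenvalues.

```python
import numpy as np

# Illustrative triangle graph (values made up).
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
d = A.sum(axis=1)
# Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
L = np.eye(3) - np.diag(1 / np.sqrt(d)) @ A @ np.diag(1 / np.sqrt(d))
lam, U = np.linalg.eigh(L)  # graph Fourier basis U, frequencies lam

def spectral_filter(x, g):
    """Apply y = U g(Lambda) U^T x: transform, reweight frequencies, invert."""
    return U @ (g(lam) * (U.T @ x))

x = np.array([1.0, 2.0, 3.0])
y = spectral_filter(x, lambda l: np.exp(-l))  # a smooth low-pass response
```

Computing the full eigendecomposition is exactly the cost that polynomial filters avoid; methods operating in the spectral domain pay it to gain direct control over every frequency.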
arXiv Detail & Related papers (2023-03-02T07:36:23Z) - A Piece-wise Polynomial Filtering Approach for Graph Neural Networks [0.45298395481707365]
Graph Neural Networks (GNNs) exploit signals from node features and the input graph topology to improve node classification task performance.
These models tend to perform poorly on heterophilic graphs, where connected nodes have different labels.
We show that our model achieves performance gains of up to 5% over the state-of-the-art models and outperforms existing filter-based approaches in general.
arXiv Detail & Related papers (2021-12-07T05:16:53Z) - Message Passing in Graph Convolution Networks via Adaptive Filter Banks [81.12823274576274]
We present a novel graph convolution operator, termed BankGCN.
It decomposes multi-channel signals on graphs into subspaces and handles particular information in each subspace with an adapted filter.
It achieves excellent performance in graph classification on a collection of benchmark graph datasets.
arXiv Detail & Related papers (2021-06-18T04:23:34Z) - Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z) - Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable solution parameters, we can build a highly adaptive and robust graph classification model.
arXiv Detail & Related papers (2020-11-22T11:20:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.