Graph Spectral Filtering with Chebyshev Interpolation for Recommendation
- URL: http://arxiv.org/abs/2505.00552v1
- Date: Thu, 01 May 2025 14:28:44 GMT
- Title: Graph Spectral Filtering with Chebyshev Interpolation for Recommendation
- Authors: Chanwoo Kim, Jinkyu Sung, Yebonn Han, Joonseok Lee
- Abstract summary: We introduce ChebyCF, a graph convolutional filtering framework based on graph spectral filtering. ChebyCF achieves state-of-the-art performance across multiple benchmarks and reasonably fast inference.
- Score: 13.336257299673578
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional networks have recently gained prominence in collaborative filtering (CF) for recommendations. However, we identify potential bottlenecks in two foundational components. First, the embedding layer leads to a latent space with limited capacity, overlooking locally observed but potentially valuable preference patterns. Also, the widely-used neighborhood aggregation is limited in its ability to leverage diverse preference patterns in a fine-grained manner. Building on spectral graph theory, we reveal that these limitations stem from graph filtering with a cut-off in the frequency spectrum and a restricted linear form. To address these issues, we introduce ChebyCF, a CF framework based on graph spectral filtering. Instead of a learned embedding, it takes a user's raw interaction history to utilize the full spectrum of signals contained in it. Also, it adopts Chebyshev interpolation to effectively approximate a flexible non-linear graph filter, and further enhances it by using an additional ideal pass filter and degree-based normalization. Through extensive experiments, we verify that ChebyCF overcomes the aforementioned bottlenecks and achieves state-of-the-art performance across multiple benchmarks and reasonably fast inference. Our code is available at https://github.com/chanwoo0806/ChebyCF.
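The core idea in the abstract, filtering a user's raw interaction vector with a Chebyshev-parameterized spectral filter on the item-item graph rather than passing it through a learned embedding, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the normalization (item-degree only), the toy interaction matrix `R`, and the fixed coefficients `coeffs` are all illustrative stand-ins; ChebyCF learns its coefficients and adds an ideal pass filter and degree-based normalization on top.

```python
import numpy as np

# Toy user-item interaction matrix (3 users x 4 items); rows are raw histories.
R = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)

# Item-item similarity graph from degree-normalized interactions (one common
# choice in spectral CF; the paper's exact normalization may differ).
d_item = R.sum(axis=0)
R_norm = R / np.sqrt(d_item)
P = R_norm.T @ R_norm                       # symmetric PSD item-item matrix

# Rescale the spectrum into [-1, 1], the domain of the Chebyshev basis.
lam_max = np.linalg.eigvalsh(P).max()
P_hat = 2.0 * P / lam_max - np.eye(P.shape[0])

def cheby_filter(x, A, coeffs):
    """Apply sum_k c_k T_k(A) to x via the three-term recurrence
    T_k = 2 A T_{k-1} - T_{k-2}, without forming any matrix power."""
    t_prev, t_curr = x, A @ x               # T_0 x and T_1 x
    out = coeffs[0] * t_prev + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2.0 * (A @ t_curr) - t_prev
        out += c * t_curr
    return out

# Score items for user 0 directly from the raw history (no embedding layer).
scores = cheby_filter(R[0], P_hat, coeffs=[0.5, 0.3, 0.2])
```

The recurrence keeps the cost at one sparse matrix-vector product per polynomial order, which is what makes inference with such filters fast.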
Related papers
- How Powerful is Graph Filtering for Recommendation [7.523823738965443]
We show two limitations suppressing the power of graph filtering.
Due to varied noise distributions, graph filters fail to denoise sparse data, where noise is scattered across all frequencies.
Supervised training yields worse performance on dense data, where noise is concentrated in middle frequencies and can be removed by graph filters without training.
arXiv Detail & Related papers (2024-06-13T05:37:54Z)
- Advancing Graph Convolutional Networks via General Spectral Wavelets [41.41593198072709]
We present a novel wavelet-based graph convolution network, namely WaveGC, which integrates multi-resolution spectral bases and a matrix-valued filter kernel.
Theoretically, we establish that WaveGC can effectively capture and decouple short-range and long-range information, providing superior filtering flexibility.
arXiv Detail & Related papers (2024-05-22T16:32:27Z)
- Shape-aware Graph Spectral Learning [36.63516222161871]
Spectral Graph Neural Networks (GNNs) are gaining attention for their ability to surpass the limitations of message-passing GNNs.
Some works empirically show that the preferred graph frequency is related to the graph homophily level.
This relationship between graph frequency and homophily/heterophily has not been systematically analyzed or exploited in existing spectral GNNs.
We propose shape-aware regularization on a Newton Interpolation-based spectral filter that can learn an arbitrary spectral filter and incorporate prior knowledge about the desired shape of the corresponding homophily level.
arXiv Detail & Related papers (2023-10-16T04:57:30Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
arXiv Detail & Related papers (2023-03-02T07:36:23Z)
- Convolutional Neural Networks on Graphs with Chebyshev Approximation, Revisited [80.29747781984135]
ChebNet is one of the early attempts to approximate spectral graph convolutions using Chebyshev polynomials.
GCN simplifies ChebNet by using only the first two Chebyshev polynomials, while still outperforming it on real-world datasets.
We propose ChebNetII, a new GNN model based on Chebyshev interpolation, which enhances the original Chebyshev polynomial approximation while reducing the Runge phenomenon.
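The Chebyshev-interpolation idea behind ChebNetII (and adopted by ChebyCF above) can be illustrated in a few lines: instead of fitting free polynomial coefficients, sample the target spectral response at Chebyshev nodes and recover the coefficients from those samples, which is what suppresses the Runge phenomenon. A minimal sketch, with the low-pass target `exp(-2(x+1))` chosen purely for illustration:

```python
import numpy as np

def cheby_coeffs(h, deg):
    """Chebyshev interpolation of a response h on [-1, 1]: sample h at the
    deg+1 Chebyshev nodes and recover coefficients c_k of sum_k c_k T_k."""
    n = deg + 1
    nodes = np.cos(np.pi * (np.arange(n) + 0.5) / n)   # Chebyshev nodes
    vals = h(nodes)
    k = np.arange(n)[:, None]
    T = np.cos(k * np.arccos(nodes)[None, :])          # T_k evaluated at nodes
    c = (2.0 / n) * (T @ vals)
    c[0] /= 2.0                                        # halve the k=0 term
    return c

# Interpolate a smooth low-pass response and check the fit on a dense grid.
target = lambda x: np.exp(-2.0 * (x + 1.0))
c = cheby_coeffs(target, deg=8)
xs = np.linspace(-1.0, 1.0, 101)
approx = np.polynomial.chebyshev.chebval(xs, c)
err = np.max(np.abs(approx - target(xs)))
```

Because the nodes cluster near the interval endpoints, the interpolant stays uniformly close to the target, unlike equispaced interpolation, which oscillates at the edges for higher degrees.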
arXiv Detail & Related papers (2022-02-04T09:11:13Z)
- Beyond Low-pass Filtering: Graph Convolutional Networks with Automatic Filtering [61.315598419655224]
We propose Automatic Graph Convolutional Networks (AutoGCN) to capture the full spectrum of graph signals.
While it is based on graph spectral theory, our AutoGCN is also localized in space and has a spatial form.
arXiv Detail & Related papers (2021-07-10T04:11:25Z)
- Message Passing in Graph Convolution Networks via Adaptive Filter Banks [81.12823274576274]
We present a novel graph convolution operator, termed BankGCN.
It decomposes multi-channel signals on graphs into subspaces and handles particular information in each subspace with an adapted filter.
It achieves excellent performance in graph classification on a collection of benchmark graph datasets.
arXiv Detail & Related papers (2021-06-18T04:23:34Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable solution parameters, we can build a highly adaptive and robust graph classification model.
arXiv Detail & Related papers (2020-11-22T11:20:14Z)
- Framework for Designing Filters of Spectral Graph Convolutional Neural Networks in the Context of Regularization Theory [1.0152838128195467]
Graph convolutional neural networks (GCNNs) have been widely used in graph learning.
It has been observed that the smoothness functional on graphs can be defined in terms of the graph Laplacian.
In this work, we explore the regularization properties of the graph Laplacian and propose a generalized framework for regularized filter design in spectral GCNNs.
arXiv Detail & Related papers (2020-09-29T06:19:08Z)
- Dependency Aware Filter Pruning [74.69495455411987]
Pruning a proportion of unimportant filters is an efficient way to mitigate the inference cost.
Previous work prunes filters according to their weight norms or the corresponding batch-norm scaling factors.
We propose a novel mechanism to dynamically control the sparsity-inducing regularization so as to achieve the desired sparsity.
arXiv Detail & Related papers (2020-05-06T07:41:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.