BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein
Approximation
- URL: http://arxiv.org/abs/2106.10994v1
- Date: Mon, 21 Jun 2021 11:26:06 GMT
- Title: BernNet: Learning Arbitrary Graph Spectral Filters via Bernstein
Approximation
- Authors: Mingguo He, Zhewei Wei, Zengfeng Huang, Hongteng Xu
- Abstract summary: We propose $\textit{BernNet}$, a novel graph neural network with theoretical support that provides a simple but effective scheme for designing and learning arbitrary graph spectral filters.
In particular, for any filter over the normalized Laplacian spectrum of a graph, our BernNet estimates it by an order-$K$ Bernstein approximation and designs its spectral property by setting the coefficients of the Bernstein basis.
Our experiments demonstrate that BernNet can learn arbitrary spectral filters, including complicated band-rejection and comb filters, and it achieves superior performance in real-world graph modeling tasks.
- Score: 47.88982193039535
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many representative graph neural networks, e.g., GPR-GNN and ChebyNet,
approximate graph convolutions with graph spectral filters. However, existing
work either applies predefined filter weights or learns them without necessary
constraints, which may lead to oversimplified or ill-posed filters. To overcome
these issues, we propose $\textit{BernNet}$, a novel graph neural network with
theoretical support that provides a simple but effective scheme for designing
and learning arbitrary graph spectral filters. In particular, for any filter
over the normalized Laplacian spectrum of a graph, our BernNet estimates it by
an order-$K$ Bernstein polynomial approximation and designs its spectral
property by setting the coefficients of the Bernstein basis. Moreover, we can
learn the coefficients (and the corresponding filter weights) based on observed
graphs and their associated signals and thus achieve the BernNet specialized
for the data. Our experiments demonstrate that BernNet can learn arbitrary
spectral filters, including complicated band-rejection and comb filters, and it
achieves superior performance in real-world graph modeling tasks.
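The abstract describes approximating a filter $h(\lambda)$ over the normalized Laplacian spectrum (which lies in $[0, 2]$) by an order-$K$ Bernstein polynomial whose nonnegative coefficients set the filter's spectral shape. A minimal sketch of that idea, assuming the symmetric normalized Laplacian and illustrating the band-rejection filter $h(\lambda) = |\lambda - 1|$ mentioned in the experiments (the function and coefficient choice here are illustrative, not the paper's exact implementation):

```python
import numpy as np
from math import comb

def bernstein_filter(eigvals, theta):
    """Evaluate an order-K Bernstein-basis spectral filter h(lambda)
    on normalized-Laplacian eigenvalues in [0, 2].

    theta: (K+1,) nonnegative Bernstein coefficients; roughly,
    theta[k] controls the filter response near lambda = 2k/K.
    """
    K = len(theta) - 1
    x = eigvals / 2.0  # map the spectrum [0, 2] onto [0, 1]
    h = np.zeros_like(x, dtype=float)
    for k in range(K + 1):
        # k-th Bernstein basis polynomial of degree K, scaled by theta[k]
        h += theta[k] * comb(K, k) * (1 - x) ** (K - k) * x ** k
    return h

# Example: approximate a band-rejection filter h(lam) = |lam - 1| by
# sampling it at lam = 2k/K. In BernNet these coefficients would
# instead be learned from observed graphs and signals.
K = 10
theta = np.abs(2 * np.arange(K + 1) / K - 1)
lam = np.linspace(0.0, 2.0, 5)
print(np.round(bernstein_filter(lam, theta), 3))
```

Because each basis polynomial is nonnegative on $[0, 1]$, constraining `theta` to be nonnegative keeps the learned filter well-posed, which is the constraint the abstract says prior work lacks.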
Related papers
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
arXiv Detail & Related papers (2023-03-02T07:36:23Z)
- Graph Filters for Signal Processing and Machine Learning on Graphs [83.29608206147515]
We provide a comprehensive overview of graph filters, including the different filtering categories, design strategies for each type, and trade-offs between different types of graph filters.
We discuss how to extend graph filters into filter banks and graph neural networks to enhance the representational power.
Our aim is that this article provides a unifying framework for both beginner and experienced researchers, as well as a common understanding.
arXiv Detail & Related papers (2022-11-16T11:56:45Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
- Framework for Designing Filters of Spectral Graph Convolutional Neural Networks in the Context of Regularization Theory [1.0152838128195467]
Graph convolutional neural networks (GCNNs) have been widely used in graph learning.
It has been observed that the smoothness functional on graphs can be defined in terms of the graph Laplacian.
In this work, we explore the regularization properties of the graph Laplacian and propose a generalized framework for regularized filter designs in spectral GCNNs.
arXiv Detail & Related papers (2020-09-29T06:19:08Z)
- Bridging the Gap Between Spectral and Spatial Domains in Graph Neural Networks [8.563354084119062]
We show the equivalence of the graph convolution process regardless of whether it is designed in the spatial or the spectral domain.
The proposed framework is used to design new convolutions in spectral domain with a custom frequency profile while applying them in the spatial domain.
arXiv Detail & Related papers (2020-03-26T01:49:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.