A Piece-wise Polynomial Filtering Approach for Graph Neural Networks
- URL: http://arxiv.org/abs/2112.03499v1
- Date: Tue, 7 Dec 2021 05:16:53 GMT
- Title: A Piece-wise Polynomial Filtering Approach for Graph Neural Networks
- Authors: Vijay Lingam, Chanakya Ekbote, Manan Sharma, Rahul Ragesh, Arun Iyer,
Sundararajan Sellamanickam
- Abstract summary: Graph Neural Networks (GNNs) exploit signals from node features and the input graph topology to improve node classification task performance.
These models tend to perform poorly on heterophilic graphs, where connected nodes have different labels.
We show that our model achieves performance gains of up to 5% over the state-of-the-art models and outperforms existing polynomial filter-based approaches in general.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) exploit signals from node features and the input
graph topology to improve node classification task performance. However, these
models tend to perform poorly on heterophilic graphs, where connected nodes
have different labels. Recently proposed GNNs work across graphs having varying
levels of homophily. Among these, models relying on polynomial graph filters
have shown promise. We observe that solutions to these polynomial graph filter
models are also solutions to an overdetermined system of equations. This suggests
that in some instances, the model needs to learn a reasonably high order
polynomial. On investigation, we find the proposed models ineffective at
learning such polynomials due to their designs. To mitigate this issue, we
perform an eigendecomposition of the graph and propose to learn multiple
adaptive polynomial filters acting on different subsets of the spectrum. We
theoretically and empirically show that our proposed model learns a better
filter, thereby improving classification accuracy. We study various aspects of
our proposed model, including dependency on the number of eigencomponents
utilized, latent polynomial filters learned, and performance of the individual
polynomials on the node classification task. We further show that our model is
scalable by evaluating over large graphs. Our model achieves performance gains
of up to 5% over the state-of-the-art models and outperforms existing
polynomial filter-based approaches in general.
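The core idea of the abstract, eigendecomposing the graph and learning separate polynomial filters on different subsets of the spectrum, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`piecewise_filter`, `poly_filter`), the two-way low/high split at `split=1.0`, and the coefficient values are all assumptions made for the example.

```python
import numpy as np

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2}; assumes an undirected graph with no isolated nodes."""
    d_inv_sqrt = A.sum(axis=1) ** -0.5
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def poly_filter(eigvals, coeffs):
    """Evaluate a polynomial filter response g(lambda) = sum_k c_k lambda^k."""
    return sum(c * eigvals ** k for k, c in enumerate(coeffs))

def piecewise_filter(A, X, coeffs_low, coeffs_high, split=1.0):
    """Apply separate polynomial filters to the low- and high-frequency
    eigencomponents, then sum the two filtered signals."""
    L = normalized_laplacian(A)
    eigvals, U = np.linalg.eigh(L)   # spectrum of L lies in [0, 2]
    low = eigvals < split            # partition the eigencomponents
    out = np.zeros_like(X, dtype=float)
    for mask, coeffs in ((low, coeffs_low), (~low, coeffs_high)):
        g = poly_filter(eigvals[mask], coeffs)               # filter response on this subset
        out += U[:, mask] @ (g[:, None] * (U[:, mask].T @ X))
    return out

# Toy 4-node path graph with 2-d node features (illustrative data)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.arange(8, dtype=float).reshape(4, 2)
H = piecewise_filter(A, X, coeffs_low=[1.0, -0.5], coeffs_high=[0.0, 0.5])
print(H.shape)  # (4, 2)
```

In the paper each piece's coefficients are learned jointly with the GNN; here they are fixed constants just to show the mechanics of filtering different spectral subsets independently.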
Related papers
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- PolyFormer: Scalable Node-wise Filters via Polynomial Graph Transformer [22.287960384629585]
In this work, we propose a scalable node-wise filter, PolyAttn. Leveraging the attention mechanism, PolyAttn can directly learn node-wise filters in an efficient manner.
Our proposed methods excel at learning arbitrary node-wise filters, showing superior performance on both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2024-07-19T17:01:41Z)
- Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach [58.8524608686851]
Graph Neural Networks (GNNs) have proven to be highly effective for node classification tasks across diverse graph structural patterns.
Traditionally, GNNs employ a uniform global filter, typically a low-pass filter for homophilic graphs and a high-pass filter for heterophilic graphs.
We introduce a novel GNN framework Node-MoE that utilizes a mixture of experts to adaptively select the appropriate filters for different nodes.
arXiv Detail & Related papers (2024-06-05T17:12:38Z)
- An Effective Universal Polynomial Basis for Spectral Graph Neural Networks [12.725906836609811]
Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophily graphs.
We develop an adaptive heterophily basis by incorporating graph heterophily degrees.
We then integrate this heterophily basis with the homophily basis, creating a universal basis UniBasis.
arXiv Detail & Related papers (2023-11-30T01:48:42Z)
- Automated Polynomial Filter Learning for Graph Neural Networks [9.120531252536617]
Polynomial graph filters have been widely used as guiding principles in the design of Graph Neural Networks (GNNs).
Recently, the adaptive learning of the graph filters has demonstrated promising performance for modeling graph signals on both homophilic and heterophilic graphs.
We propose Auto-Polynomial, a novel and general automated graph filter learning framework that efficiently learns better filters capable of adapting to various complex graph signals.
arXiv Detail & Related papers (2023-07-16T06:14:12Z)
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
- Towards Better Graph Representation Learning with Parameterized Decomposition & Filtering [27.374515964364814]
We develop a novel and general framework which unifies many existing GNN models.
We show how it helps to enhance the flexibility of GNNs while alleviating the smoothness and amplification issues of existing models.
arXiv Detail & Related papers (2023-05-10T12:42:31Z)
- Graph Polynomial Convolution Models for Node Classification of Non-Homophilous Graphs [52.52570805621925]
We investigate efficient learning from higher-order graph convolution and learning directly from adjacency matrix for node classification.
We show that the resulting model leads to new graphs and a residual scaling parameter.
We demonstrate that the proposed methods obtain improved accuracy for node classification of non-homophilous graphs.
arXiv Detail & Related papers (2022-09-12T04:46:55Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable solution parameters, we can build a highly adaptive and robust graph classification model.
arXiv Detail & Related papers (2020-11-22T11:20:14Z)
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
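A recurring distinction in the papers above is between low-pass filters (suited to homophilic graphs) and high-pass filters (suited to heterophilic graphs). A minimal sketch of the two response shapes on the normalized-Laplacian spectrum [0, 2]; the monomial forms and the order `K` are illustrative choices, not taken from any of the listed papers:

```python
import numpy as np

def low_pass(lams, K=2):
    """Attenuates high frequencies: g(lam) = (1 - lam/2)^K."""
    return (1.0 - lams / 2.0) ** K

def high_pass(lams, K=2):
    """Attenuates low frequencies: g(lam) = (lam/2)^K."""
    return (lams / 2.0) ** K

lams = np.linspace(0.0, 2.0, 5)  # sample the spectrum of the normalized Laplacian
print(low_pass(lams))   # 1.0 at lam=0, decaying to 0.0 at lam=2
print(high_pass(lams))  # 0.0 at lam=0, rising to 1.0 at lam=2
```

Node-wise and piece-wise approaches generalize this binary choice by learning a different response per node or per spectral subset.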
This list is automatically generated from the titles and abstracts of the papers in this site.