Automated Polynomial Filter Learning for Graph Neural Networks
- URL: http://arxiv.org/abs/2307.07956v1
- Date: Sun, 16 Jul 2023 06:14:12 GMT
- Title: Automated Polynomial Filter Learning for Graph Neural Networks
- Authors: Wendi Yu, Zhichao Hou, Xiaorui Liu
- Abstract summary: Polynomial graph filters have been widely used as guiding principles in the design of Graph Neural Networks (GNNs).
Recently, the adaptive learning of the graph filters has demonstrated promising performance for modeling graph signals on both homophilic and heterophilic graphs.
We propose Auto-Polynomial, a novel and general automated graph filter learning framework that efficiently learns better filters capable of adapting to various complex graph signals.
- Score: 9.120531252536617
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Polynomial graph filters have been widely used as guiding principles in the
design of Graph Neural Networks (GNNs). Recently, the adaptive learning of the
polynomial graph filters has demonstrated promising performance for modeling
graph signals on both homophilic and heterophilic graphs, owing to their
flexibility and expressiveness. In this work, we conduct a novel preliminary
study to explore the potential and limitations of polynomial graph filter
learning approaches, revealing a severe overfitting issue. To improve the
effectiveness of polynomial graph filters, we propose Auto-Polynomial, a novel
and general automated polynomial graph filter learning framework that
efficiently learns better filters capable of adapting to various complex graph
signals. Comprehensive experiments and ablation studies demonstrate significant
and consistent performance improvements on both homophilic and heterophilic
graphs across multiple learning settings considering various labeling ratios,
which unleashes the potential of polynomial filter learning.
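The abstract describes polynomial graph filters with adaptively learned coefficients. As a minimal illustrative sketch (not the paper's Auto-Polynomial method), the following applies a GPR-style filter y = Σ_k θ_k Â^k X, where Â is the symmetrically normalized adjacency and the coefficients θ would be trained in an adaptive-learning setting; here they are fixed inputs:

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def polynomial_filter(A, X, theta):
    """Apply y = sum_k theta[k] * A_norm^k X (a generic polynomial graph filter).

    theta holds the K+1 polynomial coefficients; in adaptive filter
    learning these would be optimized, here they are given constants.
    """
    A_norm = normalized_adjacency(A)
    Z = X.copy()
    out = theta[0] * Z
    for k in range(1, len(theta)):
        Z = A_norm @ Z          # propagate features one more hop
        out = out + theta[k] * Z
    return out

# Toy graph: a triangle with 2-d node features.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(3, 2))
Y = polynomial_filter(A, X, theta=[0.5, 0.3, 0.2])
print(Y.shape)  # (3, 2)
```

With theta = [1.0] the filter reduces to the identity on the features, which makes the role of the higher-order coefficients easy to see.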
Related papers
- Node-wise Filtering in Graph Neural Networks: A Mixture of Experts Approach [58.8524608686851]
Graph Neural Networks (GNNs) have proven to be highly effective for node classification tasks across diverse graph structural patterns.
Traditionally, GNNs employ a uniform global filter, typically a low-pass filter for homophilic graphs and a high-pass filter for heterophilic graphs.
We introduce a novel GNN framework Node-MoE that utilizes a mixture of experts to adaptively select the appropriate filters for different nodes.
arXiv Detail & Related papers (2024-06-05T17:12:38Z) - Optimizing Polynomial Graph Filters: A Novel Adaptive Krylov Subspace Approach [36.06398179717066]
We develop an adaptive graph filter based on Krylov subspaces to filter complex graphs.
We conduct extensive experiments across a series of real-world datasets.
arXiv Detail & Related papers (2024-03-12T06:26:17Z) - An Effective Universal Polynomial Basis for Spectral Graph Neural Networks [12.725906836609811]
Spectral Graph Neural Networks (GNNs) have gained increasing prevalence for heterophily graphs.
We develop an adaptive heterophily basis by incorporating graph heterophily degrees.
We then integrate this heterophily basis with the homophily basis, creating a universal basis UniBasis.
arXiv Detail & Related papers (2023-11-30T01:48:42Z) - Towards Better Graph Representation Learning with Parameterized Decomposition & Filtering [27.374515964364814]
We develop a novel and general framework which unifies many existing GNN models.
We show how it helps to enhance the flexibility of GNNs while alleviating the smoothness and amplification issues of existing models.
arXiv Detail & Related papers (2023-05-10T12:42:31Z) - Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z) - Graph Filters for Signal Processing and Machine Learning on Graphs [83.29608206147515]
We provide a comprehensive overview of graph filters, including the different filtering categories, design strategies for each type, and trade-offs between different types of graph filters.
We discuss how to extend graph filters into filter banks and graph neural networks to enhance the representational power.
Our aim is for this article to provide a unifying framework and a common understanding for both beginner and experienced researchers.
arXiv Detail & Related papers (2022-11-16T11:56:45Z) - A Piece-wise Polynomial Filtering Approach for Graph Neural Networks [0.45298395481707365]
Graph Neural Networks (GNNs) exploit signals from node features and the input graph topology to improve node classification task performance.
These models tend to perform poorly on heterophilic graphs, where connected nodes have different labels.
We show that our model achieves performance gains of up to 5% over the state-of-the-art models and outperforms existing filter-based approaches in general.
arXiv Detail & Related papers (2021-12-07T05:16:53Z) - Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z) - Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable solution parameters, we can build a highly adaptive and robust graph classification model.
arXiv Detail & Related papers (2020-11-22T11:20:14Z) - FiGLearn: Filter and Graph Learning using Optimal Transport [49.428169585114496]
We introduce a novel graph signal processing framework for learning the graph and its generating filter from signal observations.
We show how this framework can be used to infer missing values if only very little information is available.
arXiv Detail & Related papers (2020-10-29T10:00:42Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use contrastive loss as a data fidelity term, in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to meaningful clusters with respect to the observed layers, and that the learned representative graph is effective for solving clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z)
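Several of the entries above contrast low-pass filters (suited to homophilic graphs) with high-pass filters (suited to heterophilic graphs). A minimal sketch of that distinction, using the combinatorial Laplacian L = D - A as a high-pass filter and one step of (I - εL) as a low-pass smoother; the path graph and signals are illustrative choices, not taken from any of the papers:

```python
import numpy as np

def laplacian(A):
    """Combinatorial graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def dirichlet_energy(L, x):
    """x^T L x: total variation of the signal over the graph edges."""
    return float(x @ L @ x)

# Path graph on 4 nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = laplacian(A)

x_smooth = np.ones(4)                    # constant (homophilic-like) signal
x_rough = np.array([1., -1., 1., -1.])   # alternating (heterophilic-like) signal

high_pass = L @ x_smooth   # high-pass filtering annihilates the smooth signal
# high_pass is all zeros, since L has the constant vector in its null space

eps = 0.1
x_lp = x_rough - eps * (L @ x_rough)     # one low-pass step: (I - eps*L) x
print(dirichlet_energy(L, x_lp) < dirichlet_energy(L, x_rough))  # True
```

The Dirichlet energy drop shows the low-pass step smoothing the rough signal, while the high-pass filter passes only the non-smooth components; adaptive filters such as those surveyed above interpolate between these extremes.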
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.