Beyond Low-pass Filtering: Graph Convolutional Networks with Automatic
Filtering
- URL: http://arxiv.org/abs/2107.04755v1
- Date: Sat, 10 Jul 2021 04:11:25 GMT
- Title: Beyond Low-pass Filtering: Graph Convolutional Networks with Automatic
Filtering
- Authors: Zonghan Wu, Shirui Pan, Guodong Long, Jing Jiang, Chengqi Zhang
- Abstract summary: We propose Automatic Graph Convolutional Networks (AutoGCN) to capture the full spectrum of graph signals.
While it is based on graph spectral theory, our AutoGCN is also localized in space and has a spatial form.
- Score: 61.315598419655224
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Graph convolutional networks are becoming indispensable for deep learning
from graph-structured data. Most of the existing graph convolutional networks
share two major shortcomings. First, they are essentially low-pass filters, so the
potentially useful middle- and high-frequency bands of graph signals are ignored.
Second, the bandwidth of existing graph convolutional filters is fixed: the
parameters of a graph convolutional filter only transform the graph inputs
without changing the curvature of the filter function itself. In reality, we are
uncertain whether we should retain or cut off the frequency at a certain point
unless we have expert domain knowledge. In this
paper, we propose Automatic Graph Convolutional Networks (AutoGCN) to capture
the full spectrum of graph signals and automatically update the bandwidth of
graph convolutional filters. While it is based on graph spectral theory, our
AutoGCN is also localized in space and has a spatial form. Experimental results
show that AutoGCN achieves significant improvement over baseline methods which
only work as low-pass filters.
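The abstract describes the key idea (a graph convolutional filter whose bandwidth, i.e. the curvature of its frequency response, is learned rather than fixed) but not the exact parameterization. As a minimal, assumption-laden sketch, the NumPy snippet below applies a spectral filter with a hand-set Gaussian response over the normalized-Laplacian spectrum; the Gaussian form and the names band_filter, mu (band center), and sigma (bandwidth) are illustrative choices, not AutoGCN's actual filter.

```python
# Minimal sketch of a spectral graph filter with an adjustable band
# (center mu and bandwidth sigma). The Gaussian response is an illustrative
# assumption, not AutoGCN's exact parameterization.
import numpy as np

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2}; its eigenvalues lie in [0, 2]."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    return np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def band_filter(A, X, mu=0.5, sigma=0.3):
    """Filter node features X with response g(lam) = exp(-(lam - mu)^2 / (2 sigma^2)).

    mu near 0 behaves like a low-pass filter, mu near 2 like a high-pass one,
    and sigma controls how wide the retained band is.
    """
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)                 # graph Fourier basis
    g = np.exp(-((lam - mu) ** 2) / (2 * sigma ** 2))
    return U @ (g[:, None] * (U.T @ X))        # U diag(g(lam)) U^T X

if __name__ == "__main__":
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)  # a 4-node path graph
    X = np.random.default_rng(0).normal(size=(4, 3))   # toy node features
    print(band_filter(A, X, mu=1.0, sigma=0.5).shape)  # -> (4, 3)
```

In a trainable model, mu and sigma (or a richer response) would be learned end to end together with the feature transform, which is what allows the filter to move beyond a fixed low-pass shape.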
Related papers
- Online Graph Filtering Over Expanding Graphs [14.594691605523005]
We propose an online graph filtering framework by relying on online learning principles.
We design filters for scenarios where the topology is both known and unknown, including a learner adaptive to such evolution.
We conduct a regret analysis to highlight the role played by the different components such as the online algorithm, the filter order, and the growing graph model.
arXiv Detail & Related papers (2024-09-11T11:50:16Z)
- Online Filtering over Expanding Graphs [14.84852576248587]
We propose an online update of the filter, based on the principles of online machine learning.
We show the performance of our method for signals at the incoming nodes.
These findings lay the foundation for efficient filtering over expanding graphs.
arXiv Detail & Related papers (2023-01-17T14:07:52Z)
- Graph Filters for Signal Processing and Machine Learning on Graphs [83.29608206147515]
We provide a comprehensive overview of graph filters, including the different filtering categories, design strategies for each type, and trade-offs between different types of graph filters.
We discuss how to extend graph filters into filter banks and graph neural networks to enhance the representational power.
Our aim is for this article to provide a unifying framework, as well as a common understanding, for both beginner and experienced researchers.
arXiv Detail & Related papers (2022-11-16T11:56:45Z)
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) graph filters have a transferability-discriminability tradeoff that, in GNNs, is alleviated by the scattering behavior of the nonlinearity.
arXiv Detail & Related papers (2021-12-09T00:08:09Z)
- Message Passing in Graph Convolution Networks via Adaptive Filter Banks [81.12823274576274]
We present a novel graph convolution operator, termed BankGCN.
It decomposes multi-channel signals on graphs into subspaces and handles particular information in each subspace with an adapted filter.
It achieves excellent performance in graph classification on a collection of benchmark graph datasets.
arXiv Detail & Related papers (2021-06-18T04:23:34Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- BiGCN: A Bi-directional Low-Pass Filtering Graph Neural Network [35.97496022085212]
Many graph convolutional networks can be regarded as low-pass filters for graph signals.
We propose a new model, BiGCN, which represents a graph neural network as a bi-directional low-pass filter.
Our model outperforms previous graph neural networks in the tasks of node classification and link prediction on most of the benchmark datasets.
arXiv Detail & Related papers (2021-01-14T09:41:00Z)
- Graph Autoencoders with Deconvolutional Networks [32.78113728062279]
Graph Deconvolutional Networks (GDNs) reconstruct graph signals from smoothed node representations.
We motivate the design of Graph Deconvolutional Networks via a combination of inverse filters in spectral domain and de-noising layers in wavelet domain.
Based on the proposed GDN, we propose a graph autoencoder framework that first encodes smoothed graph representations with GCN and then decodes accurate graph signals with GDN.
arXiv Detail & Related papers (2020-12-22T09:49:39Z)
- Filter Grafting for Deep Neural Networks: Reason, Method, and Cultivation [86.91324735966766]
Filters are the key components of modern convolutional neural networks (CNNs).
In this paper, we introduce filter grafting (Method) to achieve this goal.
We develop a novel criterion to measure the information of filters and an adaptive weighting strategy to balance the grafted information among networks.
arXiv Detail & Related papers (2020-04-26T08:36:26Z)
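The Filter Grafting summary above mentions only an information criterion for filters and an adaptive weighting strategy. The sketch below is a loose illustration under stated assumptions (entropy of a filter's weight histogram as the criterion, a softmax over the two scores as the weighting; the names filter_information and graft are hypothetical), not the paper's actual definitions.

```python
# Loose illustration of grafting filters from two networks. The entropy
# criterion and softmax weighting are assumptions for this sketch, not the
# paper's definitions.
import numpy as np

def filter_information(w, bins=10):
    """Proxy for how much information a filter carries: entropy of its
    weight histogram (a constant, e.g. all-zero, filter scores zero)."""
    hist, _ = np.histogram(w, bins=bins)
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def graft(w_a, w_b, temperature=1.0):
    """Convex combination of two filters, favoring the more informative one."""
    scores = np.array([filter_information(w_a), filter_information(w_b)])
    alpha = np.exp(scores / temperature)
    alpha /= alpha.sum()
    return alpha[0] * w_a + alpha[1] * w_b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_live = rng.normal(size=(3, 3))   # an active filter
    w_dead = np.zeros((3, 3))          # a constant, uninformative filter
    print(graft(w_live, w_dead))       # weighted toward w_live
```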
This list is automatically generated from the titles and abstracts of the papers on this site.