Cross-Space Adaptive Filter: Integrating Graph Topology and Node
Attributes for Alleviating the Over-smoothing Problem
- URL: http://arxiv.org/abs/2401.14876v2
- Date: Sat, 10 Feb 2024 08:58:14 GMT
- Title: Cross-Space Adaptive Filter: Integrating Graph Topology and Node
Attributes for Alleviating the Over-smoothing Problem
- Authors: Chen Huang, Haoyang Li, Yifan Zhang, Wenqiang Lei, Jiancheng Lv
- Abstract summary: A Graph Convolutional Network (GCN) uses a low-pass filter to extract low-frequency signals from graph topology.
Various methods have been proposed to create an adaptive filter by incorporating an extra filter extracted from the graph topology.
We propose a cross-space adaptive filter, called CSF, to produce the adaptive-frequency information extracted from both the topology and attribute spaces.
- Score: 39.347616859256256
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The vanilla Graph Convolutional Network (GCN) uses a low-pass filter to
extract low-frequency signals from graph topology, which may lead to the
over-smoothing problem when GCN goes deep. To this end, various methods have
been proposed to create an adaptive filter by incorporating an extra filter
(e.g., a high-pass filter) extracted from the graph topology. However, these
methods heavily rely on topological information and ignore the node attribute
space, which severely sacrifices the expressive power of the deep GCNs,
especially when dealing with disassortative graphs. In this paper, we propose a
cross-space adaptive filter, called CSF, to produce the adaptive-frequency
information extracted from both the topology and attribute spaces.
Specifically, we first derive a tailored attribute-based high-pass filter that
can be interpreted theoretically as a minimizer for semi-supervised kernel
ridge regression. Then, we cast the topology-based low-pass filter as a
Mercer's kernel within the context of GCNs. This serves as a foundation for
combining it with the attribute-based filter to capture the adaptive-frequency
information. Finally, we derive the cross-space filter via an effective
multiple-kernel learning strategy, which unifies the attribute-based high-pass
filter and the topology-based low-pass filter. This helps to address the
over-smoothing problem while maintaining effectiveness. Extensive experiments
demonstrate that CSF not only successfully alleviates the over-smoothing
problem but also promotes the effectiveness of the node classification task.
Related papers
- Generalized Learning of Coefficients in Spectral Graph Convolutional Networks [5.5711773076846365]
Spectral Graph Convolutional Networks (GCNs) have gained popularity in graph machine learning applications.
G-Arnoldi-GCN consistently outperforms state-of-the-art methods when suitable functions are employed.
arXiv Detail & Related papers (2024-09-07T12:53:44Z)
- Message Passing in Graph Convolution Networks via Adaptive Filter Banks [81.12823274576274]
We present a novel graph convolution operator, termed BankGCN.
It decomposes multi-channel signals on graphs into subspaces and handles particular information in each subspace with an adapted filter.
It achieves excellent performance in graph classification on a collection of benchmark graph datasets.
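A toy sketch of the subspace filter-bank idea described above, under assumptions: channels are split into equal groups and each group gets its own first-order polynomial graph filter; this is not BankGCN's actual parametrization.

```python
import numpy as np

def filter_bank_conv(L, X, thetas):
    """Apply a different first-order spectral filter to each channel subspace.

    L      : (n, n) graph Laplacian.
    X      : (n, d) multi-channel node signals.
    thetas : list of (t0, t1) pairs, one per subspace; the filter for a
             subspace is t0 * I + t1 * L (an illustrative choice).
    """
    n, d = X.shape
    groups = np.array_split(np.arange(d), len(thetas))  # split channels into subspaces
    out = np.zeros_like(X)
    for (t0, t1), idx in zip(thetas, groups):
        Xg = X[:, idx]
        out[:, idx] = t0 * Xg + t1 * (L @ Xg)  # adapted filter for this subspace
    return out

# Usage with a toy Laplacian and three subspaces.
rng = np.random.default_rng(1)
A = (rng.random((6, 6)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T
L = np.diag(A.sum(1)) - A
X = rng.normal(size=(6, 6))
print(filter_bank_conv(L, X, [(1.0, -0.5), (0.5, 0.5), (1.0, 0.0)]).shape)  # (6, 6)
```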
arXiv Detail & Related papers (2021-06-18T04:23:34Z)
- DNN-Based Topology Optimisation: Spatial Invariance and Neural Tangent Kernel [7.106986689736828]
We study the SIMP method with a density field generated by a fully-connected neural network, taking the coordinates as inputs.
We show that the use of DNNs leads to a filtering effect similar to traditional filtering techniques for SIMP, with a filter described by the Neural Tangent Kernel (NTK)
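A toy sketch of the coordinate-based density parametrization described above: a small fully-connected network maps grid coordinates to densities in [0, 1]. The layer sizes, activations, and random (untrained) weights are illustrative assumptions, and the SIMP optimization loop itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def mlp_density(coords, weights):
    """Density field rho(x, y) in [0, 1] from a tiny fully-connected network.

    coords  : (m, 2) grid coordinates.
    weights : list of (W, b) pairs defining the network (illustrative sizes).
    """
    h = coords
    for W, b in weights[:-1]:
        h = np.tanh(h @ W + b)            # hidden layers
    W, b = weights[-1]
    logits = h @ W + b                    # one output per grid point
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid keeps densities in [0, 1]

# Random (untrained) weights for a 2-16-16-1 network, just to show the shapes;
# in a SIMP-style loop these would be optimized against a compliance objective.
sizes = [2, 16, 16, 1]
weights = [(rng.normal(scale=0.5, size=(a, b)), np.zeros(b))
           for a, b in zip(sizes[:-1], sizes[1:])]

xs, ys = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 10))
coords = np.stack([xs.ravel(), ys.ravel()], axis=1)
rho = mlp_density(coords, weights).reshape(10, 20)
print(rho.shape, float(rho.min()), float(rho.max()))
```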
arXiv Detail & Related papers (2021-06-10T12:49:55Z)
- Resolution learning in deep convolutional networks using scale-space theory [31.275270391367425]
Resolution in deep convolutional neural networks (CNNs) is typically bounded by the receptive field size through filter sizes, and subsampling layers or strided convolutions on feature maps.
We propose to do away with hard-coded resolution hyperparameters and aim to learn the appropriate resolution from data.
We use scale-space theory to obtain a self-similar parametrization of filters and make use of the N-Jet: a truncated Taylor series to approximate a filter by a learned combination of Gaussian derivative filters.
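A rough sketch of the N-Jet idea in 1-D, under assumptions: the filter is a combination of Gaussian derivative filters up to second order, where the weights `alphas` and the scale `sigma` would be the learned quantities; the basis order and support size are illustrative.

```python
import numpy as np

def gaussian_derivative_basis(sigma, radius=None):
    """Return 1-D Gaussian derivative filters of order 0, 1 and 2 at scale sigma."""
    if radius is None:
        radius = int(np.ceil(3 * sigma))      # filter support grows with the scale
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x**2 / (2 * sigma**2))
    g /= g.sum()
    g1 = -x / sigma**2 * g                    # first derivative of the Gaussian
    g2 = (x**2 - sigma**2) / sigma**4 * g     # second derivative of the Gaussian
    return np.stack([g, g1, g2])

def njet_filter(alphas, sigma):
    """Filter = learned combination of Gaussian derivative filters (N-Jet idea)."""
    return np.asarray(alphas) @ gaussian_derivative_basis(sigma)

# A filter at scale sigma=1.5 combining the three basis functions; in a network
# both `alphas` and `sigma` would be learned from data.
kernel = njet_filter([0.6, -0.3, 0.1], sigma=1.5)
signal = np.sin(np.linspace(0, 6, 50))
print(np.convolve(signal, kernel, mode="same").shape)  # (50,)
```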
arXiv Detail & Related papers (2021-06-07T08:23:02Z)
- Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework, AdaGNN, with a smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
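A minimal sketch of one plausible form of a per-channel adaptive frequency response, not necessarily AdaGNN's exact update: each feature channel has its own coefficient controlling how strongly Laplacian (high-frequency) content is suppressed.

```python
import numpy as np

def adaptive_response_layer(L_sym, X, phi):
    """One layer with a per-channel adaptive frequency response.

    L_sym : (n, n) symmetrically normalized Laplacian.
    X     : (n, d) node features.
    phi   : (d,) per-channel coefficients (would be learned); larger values
            suppress more high-frequency (Laplacian) content in that channel.
    """
    # Each channel j gets frequency response 1 - phi_j * lambda over the
    # Laplacian eigenvalues lambda, i.e. its own degree of smoothing.
    return X - (L_sym @ X) * phi[None, :]

# Toy usage: two channels, one kept rough (small phi), one smoothed strongly.
rng = np.random.default_rng(3)
A = (rng.random((7, 7)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T
deg = A.sum(1) + 1e-9
L_sym = np.eye(7) - A / np.sqrt(deg)[:, None] / np.sqrt(deg)[None, :]
X = rng.normal(size=(7, 2))
print(adaptive_response_layer(L_sym, X, phi=np.array([0.1, 0.9])).shape)  # (7, 2)
```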
arXiv Detail & Related papers (2021-04-26T19:31:21Z)
- SCOP: Scientific Control for Reliable Neural Network Pruning [127.20073865874636]
This paper proposes a reliable neural network pruning algorithm by setting up a scientific control.
Redundant filters can be discovered in the adversarial process of different features.
Our method reduces the parameters of ResNet-101 by 57.8% and its FLOPs by 60.2%, with only a 0.01% top-1 accuracy loss on ImageNet.
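A very rough sketch of the control idea suggested by the summary, under assumptions: filters whose responses to real inputs barely exceed their responses to control (knockoff-style) inputs are scored as redundant. The score and the stand-in control data here are illustrative, not SCOP's actual procedure.

```python
import numpy as np

def redundancy_scores(conv_out_real, conv_out_control):
    """Score each filter by how much more it responds to real data than to
    control data; low scores suggest redundancy.

    Both inputs have shape (batch, num_filters, h, w).
    """
    real = np.abs(conv_out_real).mean(axis=(0, 2, 3))
    ctrl = np.abs(conv_out_control).mean(axis=(0, 2, 3))
    return real - ctrl

def filters_to_prune(scores, ratio=0.3):
    """Indices of the lowest-scoring fraction of filters."""
    k = int(len(scores) * ratio)
    return np.argsort(scores)[:k]

# Toy usage with fake feature maps for 16 filters.
rng = np.random.default_rng(4)
real_maps = rng.normal(size=(8, 16, 5, 5))
control_maps = rng.normal(scale=0.9, size=(8, 16, 5, 5))  # stand-in "control" inputs
scores = redundancy_scores(real_maps, control_maps)
print(filters_to_prune(scores, ratio=0.25))
```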
arXiv Detail & Related papers (2020-10-21T03:02:01Z)
- Filter Grafting for Deep Neural Networks: Reason, Method, and Cultivation [86.91324735966766]
Filters are the key components of modern convolutional neural networks (CNNs).
In this paper, we introduce filter grafting to improve their representation capability.
We develop a novel criterion to measure the information of filters and an adaptive weighting strategy to balance the grafted information among networks.
arXiv Detail & Related papers (2020-04-26T08:36:26Z)
- Filter Grafting for Deep Neural Networks [71.39169475500324]
Filter grafting aims to improve the representation capability of Deep Neural Networks (DNNs).
We develop an entropy-based criterion to measure the information of filters and an adaptive weighting strategy for balancing the grafted information among networks.
For example, the grafted MobileNetV2 outperforms the non-grafted MobileNetV2 by about 7 percent on the CIFAR-100 dataset.
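A minimal sketch of an entropy-style grafting criterion, under assumptions: filter weights are binned into a fixed-range histogram, and low-entropy ("invalid") filters in one network are blended with the corresponding filters of a peer network. The bin range, threshold, and blending weight are illustrative, not the paper's exact settings.

```python
import numpy as np

def filter_entropy(w, bins=10):
    """Histogram entropy of a filter's weights, a proxy for its information."""
    hist, _ = np.histogram(w.ravel(), bins=bins, range=(-1.0, 1.0))
    total = hist.sum()
    if total == 0:
        return 0.0
    p = hist / total
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def graft(filters_a, filters_b, threshold=1.5, weight=0.5):
    """Graft filters from network B into the low-entropy filters of network A.

    filters_a, filters_b : arrays of shape (num_filters, k, k).
    """
    out = filters_a.copy()
    for i, f in enumerate(filters_a):
        if filter_entropy(f) < threshold:   # low-information filter
            out[i] = (1 - weight) * f + weight * filters_b[i]
    return out

# Toy usage: 4 filters of size 3x3 in each network.
rng = np.random.default_rng(5)
fa = rng.normal(size=(4, 3, 3))
fa[0] *= 1e-3                  # a nearly dead filter: weights collapse into one bin
fb = rng.normal(size=(4, 3, 3))
print(graft(fa, fb).shape)     # (4, 3, 3)
```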
arXiv Detail & Related papers (2020-01-15T03:18:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.