Quasi-Framelets: Another Improvement to Graph Neural Networks
- URL: http://arxiv.org/abs/2201.04728v1
- Date: Tue, 11 Jan 2022 00:10:28 GMT
- Title: Quasi-Framelets: Another Improvement to Graph Neural Networks
- Authors: Mengxi Yang, Xuebin Zheng, Jie Yin and Junbin Gao
- Abstract summary: Spectral graph neural networks (GNNs) improve graph learning performance by proposing various spectral filters in the spectral domain to capture both global and local graph structure information.
Our new framelet convolution incorporates filtering functions designed directly in the spectral domain to overcome the limitations of existing spectral approaches.
The proposed convolution offers great flexibility in cutting off spectral information and effectively mitigates the negative effect of noisy graph signals.
- Score: 29.689868950666117
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper aims to provide a novel design of a multiscale framelet
convolution for spectral graph neural networks. In the spectral paradigm,
spectral GNNs improve graph learning performance by proposing various
spectral filters in the spectral domain to capture both global and local graph
structure information. Although the existing spectral approaches show superior
performance on some graphs, they suffer from a lack of flexibility and are
fragile when graph information is incomplete or perturbed. Our new framelet
convolution incorporates filtering functions designed directly in the
spectral domain to overcome these limitations. The proposed convolution offers
great flexibility in cutting off spectral information and effectively mitigates
the negative effect of noisy graph signals. Moreover, to exploit the
heterogeneity in real-world graph data, the heterogeneous graph neural network
with our new framelet convolution provides a solution for embedding the
intrinsic topological information of meta-paths with multi-level graph
analysis. Extensive experiments have been conducted on real-world heterogeneous
graphs and homogeneous graphs under settings with noisy node features,
achieving superior performance.
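To make the construction concrete, below is a minimal sketch (not the authors' implementation) of a quasi-framelet-style spectral convolution: filtering functions are defined directly on the rescaled spectrum of the normalized graph Laplacian, the signal is decomposed into frequency bands, each band is re-weighted, and the signal is reconstructed. The cosine/sine filter pair, the two-band setting, and the `band_weights` parameter are illustrative assumptions; the paper designs richer multiscale filter families.

```python
import numpy as np

def normalized_laplacian(adj):
    """L = I - D^{-1/2} A D^{-1/2} for a symmetric adjacency matrix."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg, dtype=float)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    return np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def quasi_framelet_filters(xi):
    """Illustrative low-/high-pass pair g0, g1 on [0, pi] with g0^2 + g1^2 = 1."""
    return np.cos(xi / 2.0), np.sin(xi / 2.0)

def quasi_framelet_conv(adj, x, band_weights=(1.0, 0.2)):
    """Decompose x into two spectral bands, re-weight each band, reconstruct.
    Down-weighting the high band cuts off noisy spectral content."""
    L = normalized_laplacian(adj)
    lam, U = np.linalg.eigh(L)               # eigenvalues lie in [0, 2]
    xi = np.pi * lam / max(lam.max(), 1e-9)  # rescale spectrum to [0, pi]
    out = np.zeros_like(x, dtype=float)
    for g, w in zip(quasi_framelet_filters(xi), band_weights):
        W_k = U @ np.diag(g) @ U.T           # band-pass operator U g(Lambda) U^T
        out += w * (W_k @ (W_k @ x))         # decompose, re-weight, reconstruct
    return out

# With band_weights=(1.0, 1.0) the tight-frame condition g0^2 + g1^2 = 1
# yields perfect reconstruction (output equals input).
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
x = np.random.randn(3, 4)
print(np.allclose(quasi_framelet_conv(adj, x, (1.0, 1.0)), x))  # True
```

Lowering the high-band weight is one simple way to realize the "cutting off" of spectral information described in the abstract; in a trainable layer the fixed per-band weights would be replaced by learnable coefficients.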
Related papers
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a novel graph neural network with theoretical support that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs for the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z) - Spectral Graph Reasoning Network for Hyperspectral Image Classification [0.43512163406551996]
Convolutional neural networks (CNNs) have achieved remarkable performance in hyperspectral image (HSI) classification.
We propose a spectral graph reasoning network (SGR) learning framework comprising two crucial modules.
Experiments on two HSI datasets demonstrate that the proposed architecture can significantly improve the classification accuracy.
arXiv Detail & Related papers (2024-07-02T20:29:23Z) - HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z) - Gradient Gating for Deep Multi-Rate Learning on Graphs [62.25886489571097]
We present Gradient Gating (G$^2$), a novel framework for improving the performance of Graph Neural Networks (GNNs).
Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph.
arXiv Detail & Related papers (2022-10-02T13:19:48Z) - Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z) - PointSpectrum: Equivariance Meets Laplacian Filtering for Graph
Representation Learning [3.7875603451557063]
Graph Representation Learning (GRL) has become essential for modern graph data mining and learning tasks.
While Graph Neural Networks (GNNs) have been used in state-of-the-art GRL architectures, they have been shown to suffer from oversmoothing.
We propose PointSpectrum, a spectral method that incorporates a set equivariant network to account for a graph's structure.
arXiv Detail & Related papers (2021-09-06T10:59:11Z) - Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph
Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z) - Graph Neural Networks with Adaptive Frequency Response Filter [55.626174910206046]
We develop a graph neural network framework AdaGNN with a well-smooth adaptive frequency response filter.
We empirically validate the effectiveness of the proposed framework on various benchmark datasets.
arXiv Detail & Related papers (2021-04-26T19:31:21Z) - How Framelets Enhance Graph Neural Networks [27.540282741523253]
This paper presents a new approach for assembling graph neural networks based on framelet transforms.
We propose shrinkage as a new activation for the framelet convolution, which thresholds the high-frequency information at different scales (a minimal sketch of such a shrinkage step appears after this list).
arXiv Detail & Related papers (2021-02-13T19:19:19Z) - Bridging the Gap Between Spectral and Spatial Domains in Graph Neural
Networks [8.563354084119062]
We show an equivalence of the graph convolution process regardless of whether it is designed in the spatial or the spectral domain.
The proposed framework is used to design new convolutions in the spectral domain with a custom frequency profile while applying them in the spatial domain.
arXiv Detail & Related papers (2020-03-26T01:49:24Z)
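The shrinkage activation mentioned in the "How Framelets Enhance Graph Neural Networks" entry above corresponds, in standard signal-processing terms, to soft-thresholding of high-frequency framelet coefficients. The sketch below illustrates that reading; the threshold value and the example coefficient array are illustrative assumptions, not the cited paper's code.

```python
import numpy as np

def shrinkage(coeffs: np.ndarray, threshold: float) -> np.ndarray:
    """Soft-thresholding: shrink coefficient magnitudes toward zero by `threshold`,
    zeroing out small (typically noisy) high-frequency framelet coefficients."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - threshold, 0.0)

# Example: suppress small high-frequency coefficients from a framelet decomposition.
high_freq_band = np.array([0.05, -0.8, 0.3, -0.02])
print(shrinkage(high_freq_band, threshold=0.1))  # approximately [ 0.  -0.7  0.2 -0. ]
```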
This list is automatically generated from the titles and abstracts of the papers on this site.