Spectral Graph Attention Network with Fast Eigen-approximation
- URL: http://arxiv.org/abs/2003.07450v2
- Date: Tue, 27 Jul 2021 11:58:57 GMT
- Title: Spectral Graph Attention Network with Fast Eigen-approximation
- Authors: Heng Chang, Yu Rong, Tingyang Xu, Wenbing Huang, Somayeh Sojoudi,
Junzhou Huang, Wenwu Zhu
- Abstract summary: Spectral Graph Attention Network (SpGAT) learns representations for different frequency components using weighted filters and graph wavelet bases.
A fast approximation variant, SpGAT-Cheby, is proposed to reduce the computational cost brought by the eigen-decomposition.
We thoroughly evaluate the performance of SpGAT and SpGAT-Cheby on semi-supervised node classification tasks.
- Score: 103.93113062682633
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Variants of Graph Neural Networks (GNNs) for representation learning have
been proposed recently and have achieved fruitful results in various fields. Among
them, Graph Attention Network (GAT) was the first to employ a self-attention
strategy to learn attention weights for each edge in the spatial domain. However,
learning attention over edges captures only local graph information and greatly
increases the computational cost. In this paper, we first introduce the attention
mechanism in the spectral domain of graphs and present the Spectral Graph
Attention Network (SpGAT), which learns representations for different frequency
components using weighted filters and graph wavelet bases. In this way, SpGAT can
better capture global patterns of graphs efficiently, with far fewer learned
parameters than GAT. Further, to reduce the computational cost that the
eigen-decomposition brings to SpGAT, we propose a fast approximation variant,
SpGAT-Cheby. We thoroughly evaluate the performance of SpGAT and SpGAT-Cheby on
semi-supervised node classification tasks and verify the effectiveness of the
attention learned in the spectral domain.
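For intuition, here is a minimal PyTorch sketch of a spectral attention layer in the spirit of SpGAT, together with the Chebyshev recursion that an SpGAT-Cheby-style approximation uses to avoid the explicit eigen-decomposition. This is not the authors' code: it substitutes the Fourier basis (Laplacian eigenvectors) for the paper's graph wavelet bases, and the split point n_low, layer sizes, and toy graph are illustrative assumptions.

import torch
import torch.nn as nn

class SpectralAttentionLayer(nn.Module):
    """Re-weight low- and high-frequency graph components with learned attention."""
    def __init__(self, in_dim, out_dim, n_low):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.att = nn.Parameter(torch.zeros(2))  # one logit per frequency band
        self.n_low = n_low  # how many eigenvectors count as "low frequency" (assumption)

    def forward(self, x, U):
        # U: eigenvectors of the normalized Laplacian, columns sorted by
        # ascending eigenvalue, so the first n_low columns are low-frequency.
        h = self.lin(x)
        U_low, U_high = U[:, :self.n_low], U[:, self.n_low:]
        h_low = U_low @ (U_low.t() @ h)     # projection onto the low-frequency subspace
        h_high = U_high @ (U_high.t() @ h)  # projection onto the high-frequency subspace
        a = torch.softmax(self.att, dim=0)  # attention over the two frequency bands
        return torch.relu(a[0] * h_low + a[1] * h_high)

def cheby_filter(L, x, coeffs):
    """Apply a spectral filter via the Chebyshev recursion, skipping the
    eigen-decomposition (the idea behind SpGAT-Cheby). The spectrum is
    rescaled with lambda_max ~ 2, the usual bound for the normalized Laplacian."""
    L_hat = L - torch.eye(L.shape[0])   # 2L/lambda_max - I with lambda_max = 2
    t_prev, t_cur = x, L_hat @ x        # T_0(L_hat) x and T_1(L_hat) x
    out = coeffs[0] * t_prev + coeffs[1] * t_cur
    for c in coeffs[2:]:                # T_k = 2 L_hat T_{k-1} - T_{k-2}
        t_prev, t_cur = t_cur, 2 * (L_hat @ t_cur) - t_prev
        out = out + c * t_cur
    return out

# Toy usage on a random 5-node graph.
n, d = 5, 8
A = (torch.rand(n, n) > 0.5).float()
A = ((A + A.t()) > 0).float().fill_diagonal_(0)
deg = A.sum(1).clamp(min=1)
L = torch.eye(n) - A / torch.sqrt(deg[:, None] * deg[None, :])
_, U = torch.linalg.eigh(L)             # eigenvalues in ascending order
out = SpectralAttentionLayer(d, 16, n_low=2)(torch.randn(n, d), U)
filtered = cheby_filter(L, torch.randn(n, d), coeffs=[0.5, -0.3, 0.1])

In the paper the bases are graph wavelets rather than raw Fourier modes; the sketch only conveys the frequency re-weighting idea and why the Chebyshev variant removes the cubic-time eigen-decomposition.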
Related papers
- GrassNet: State Space Model Meets Graph Neural Network [57.62885438406724]
Graph State Space Network (GrassNet) is a theoretically grounded graph neural network that provides a simple yet effective scheme for designing arbitrary graph spectral filters.
To the best of our knowledge, our work is the first to employ SSMs in the design of GNN spectral filters.
Extensive experiments on nine public benchmarks reveal that GrassNet achieves superior performance in real-world graph modeling tasks.
arXiv Detail & Related papers (2024-08-16T07:33:58Z)
- Representation Learning on Heterophilic Graph with Directional Neighborhood Attention [8.493802098034255]
Graph Attention Network (GAT) is one of the most popular Graph Neural Network (GNN) architectures.
GAT lacks the ability to capture long-range and global graph information, leading to unsatisfactory performance on some datasets.
We propose the Directional Graph Attention Network (DGAT) to combine feature-based attention with global directional information extracted from the graph topology.
arXiv Detail & Related papers (2024-03-03T10:59:16Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
- PointSpectrum: Equivariance Meets Laplacian Filtering for Graph Representation Learning [3.7875603451557063]
Graph Representation Learning (GRL) has become essential for modern graph data mining and learning tasks.
While Graph Neural Networks (GNNs) have been used in state-of-the-art GRL architectures, they have been shown to suffer from oversmoothing.
We propose PointSpectrum, a spectral method that incorporates a set equivariant network to account for a graph's structure.
arXiv Detail & Related papers (2021-09-06T10:59:11Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification, but convolutions on regular grids struggle to capture irregular spatial structure.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Graph Networks with Spectral Message Passing [1.0742675209112622]
We introduce the Spectral Graph Network, which applies message passing to both the spatial and spectral domains.
Our results show that the Spectral GN promotes efficient training, reaching high performance with fewer training iterations despite having more parameters.
arXiv Detail & Related papers (2020-12-31T21:33:17Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real-world graph datasets for node classification tasks; a sketch of resistance-based edge sampling follows this list.
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
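As a concrete illustration of the mechanism FastGAT builds on, the sketch below samples edges with probability proportional to weight times effective resistance (Spielman-Srivastava-style spectral sparsification). This is a simplified sketch, not FastGAT itself: it computes exact resistances via the Laplacian pseudo-inverse, which is only feasible for small graphs, and it omits the 1/p re-weighting of sampled edges.

import numpy as np

def sparsify_by_effective_resistance(A, keep):
    """Keep `keep` edges of a weighted adjacency matrix A, sampled with
    probability proportional to w_e * R_e (effective resistance)."""
    n = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A
    L_pinv = np.linalg.pinv(L)          # pseudo-inverse of the graph Laplacian
    iu, ju = np.triu_indices(n, k=1)    # upper-triangular candidate pairs
    mask = A[iu, ju] > 0
    i, j, w = iu[mask], ju[mask], A[iu, ju][mask]
    # Effective resistance of edge (i, j): (e_i - e_j)^T L^+ (e_i - e_j)
    R = L_pinv[i, i] + L_pinv[j, j] - 2 * L_pinv[i, j]
    p = (w * R) / (w * R).sum()
    rng = np.random.default_rng()
    idx = rng.choice(len(w), size=min(keep, len(w)), replace=False, p=p)
    B = np.zeros_like(A)
    B[i[idx], j[idx]] = w[idx]
    return B + B.T                      # sparsified symmetric adjacency

The pruned graph can then be fed to any attention-based GNN; the point of the resistance scores is that edges critical for connectivity are unlikely to be dropped.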