Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks
- URL: http://arxiv.org/abs/2003.08414v4
- Date: Tue, 18 Jan 2022 21:07:01 GMT
- Title: Scattering GCN: Overcoming Oversmoothness in Graph Convolutional Networks
- Authors: Yimeng Min (1), Frederik Wenkel (2 and 1), Guy Wolf (2 and 1) ((1) Mila - Quebec AI Institute, Montréal, QC, Canada, (2) Department of Mathematics & Statistics, Université de Montréal, Montréal, QC, Canada)
- Abstract summary: Graph convolutional networks (GCNs) have shown promising results in processing graph data by extracting structure-aware features.
Here, we propose to augment conventional GCNs with geometric scattering transforms and residual convolutions.
The former enables band-pass filtering of graph signals, thus alleviating the so-called oversmoothing often encountered in GCNs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolutional networks (GCNs) have shown promising results in
processing graph data by extracting structure-aware features. This gave rise to
extensive work in geometric deep learning, focusing on designing network
architectures that ensure neuron activations conform to regularity patterns
within the input graph. However, in most cases the graph structure is only
accounted for by considering the similarity of activations between adjacent
nodes, which limits the capabilities of such methods to discriminate between
nodes in a graph. Here, we propose to augment conventional GCNs with geometric
scattering transforms and residual convolutions. The former enables band-pass
filtering of graph signals, thus alleviating the so-called oversmoothing often
encountered in GCNs, while the latter is introduced to clear the resulting
features of high-frequency noise. We establish the advantages of the presented
Scattering GCN with theoretical results that demonstrate the complementary
benefits of scattering and GCN features, and with experimental results that show
the benefits of our method compared to leading graph neural networks for
semi-supervised node classification, including the recently proposed GAT network,
which typically alleviates oversmoothing using graph attention mechanisms.
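To make the idea concrete, below is a minimal, illustrative sketch (not the authors' released code) of how low-pass GCN filtering can be combined with band-pass geometric scattering features built from lazy random-walk diffusion wavelets. The function names, choice of scales, weight shapes, and the toy cycle graph are assumptions made purely for illustration.
```python
import numpy as np

def gcn_filter(A, X):
    """Low-pass GCN propagation: D^{-1/2} (A + I) D^{-1/2} X (illustrative)."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt @ X

def scattering_features(A, X, scales=(1, 2, 3)):
    """Band-pass features |Psi_k X| from diffusion wavelets
    Psi_k = P^(2^(k-1)) - P^(2^k), with lazy random walk P = (I + A D^{-1}) / 2."""
    n = A.shape[0]
    D_inv = np.diag(1.0 / A.sum(axis=1))
    P = 0.5 * (np.eye(n) + A @ D_inv)
    feats = []
    for k in scales:
        Psi_k = (np.linalg.matrix_power(P, 2 ** (k - 1))
                 - np.linalg.matrix_power(P, 2 ** k))
        feats.append(np.abs(Psi_k @ X))  # absolute value acts as the scattering nonlinearity
    return np.concatenate(feats, axis=1)

def hybrid_layer(A, X, W_gcn, W_scat):
    """Concatenate low-pass (GCN) and band-pass (scattering) channels, then apply ReLU."""
    H = np.concatenate([gcn_filter(A, X) @ W_gcn,
                        scattering_features(A, X) @ W_scat], axis=1)
    return np.maximum(H, 0)

# Toy example: a 4-node cycle graph with 2-dimensional node features.
A = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2))
W_gcn = rng.standard_normal((2, 8))
W_scat = rng.standard_normal((6, 8))  # 3 scales x 2 input features
print(hybrid_layer(A, X, W_gcn, W_scat).shape)  # -> (4, 16)
```
Note that the residual graph convolution described in the abstract, which clears high-frequency noise from the combined features, is omitted from this sketch for brevity.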
Related papers
- Neighbor Overlay-Induced Graph Attention Network [5.792501481702088]
Graph neural networks (GNNs) have garnered significant attention due to their ability to represent graph data.
This study proposes a neighbor overlay-induced graph attention network (NO-GAT) built on two main ideas.
Empirical studies on graph benchmark datasets indicate that the proposed NO-GAT consistently outperforms state-of-the-art models.
arXiv Detail & Related papers (2024-08-16T15:01:28Z)
- Probability Passing for Graph Neural Networks: Graph Structure and Representations Joint Learning [8.392545965667288]
Graph Neural Networks (GNNs) have achieved notable success in the analysis of non-Euclidean data across a wide range of domains.
To solve this problem, Latent Graph Inference (LGI) is proposed to infer a task-specific latent structure by computing similarity or edge probability of node features.
We introduce a novel method called Probability Passing to refine the generated graph structure by aggregating edge probabilities of neighboring nodes.
arXiv Detail & Related papers (2024-07-15T13:01:47Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters in leveraging structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
- Graph Neural Networks for Graph Drawing [17.983238300054527]
We propose a novel framework for the development of Graph Neural Drawers (GND).
GNDs rely on neural computation for constructing efficient and complex maps.
We prove that this mechanism can be guided by loss functions computed by means of Feedforward Neural Networks.
arXiv Detail & Related papers (2021-09-21T09:58:02Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Geometric Scattering Attention Networks [14.558882688159297]
We introduce a new attention-based architecture to produce adaptive task-driven node representations.
We show the resulting geometric scattering attention network (GSAN) outperforms previous networks in semi-supervised node classification.
arXiv Detail & Related papers (2020-10-28T14:36:40Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.