Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid
Scattering Networks
- URL: http://arxiv.org/abs/2201.08932v1
- Date: Sat, 22 Jan 2022 00:47:41 GMT
- Title: Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid
Scattering Networks
- Authors: Frederik Wenkel, Yimeng Min, Matthew Hirn, Michael Perlmutter, Guy
Wolf
- Abstract summary: We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
- Score: 11.857894213975644
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Geometric deep learning (GDL) has made great strides towards generalizing the
design of structure-aware neural network architectures from traditional domains
to non-Euclidean ones, such as graphs. This gave rise to graph neural network
(GNN) models that can be applied to graph-structured datasets arising, for
example, in social networks, biochemistry, and material science. Graph
convolutional networks (GCNs) in particular, inspired by their Euclidean
counterparts, have been successful in processing graph data by extracting
structure-aware features. However, current GNN models (and GCNs in particular)
are known to be constrained by various phenomena that limit their expressive
power and ability to generalize to more complex graph datasets. Most models
essentially rely on low-pass filtering of graph signals via local averaging
operations, thus leading to oversmoothing. Here, we propose a hybrid GNN
framework that combines traditional GCN filters with band-pass filters defined
via the geometric scattering transform. We further introduce an attention
framework that allows the model to locally attend over the combined information
from different GNN filters at the node level. Our theoretical results establish
the complementary benefits of the scattering filters to leverage structural
information from the graph, while our experiments show the benefits of our
method on various learning tasks.
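Since the abstract describes the architecture only at a high level, the following is a minimal NumPy sketch of that idea, written for this summary page rather than taken from the paper: a low-pass GCN-style channel and a few band-pass diffusion-wavelet channels are computed side by side, and each node mixes them with softmax attention over the channel outputs. The lazy random-walk convention P = (I + A D^{-1})/2, the tanh nonlinearities, the single attention vector `a`, and all dimensions are illustrative assumptions, not the authors' choices.

```python
# Minimal sketch (NOT the authors' implementation) of a hybrid layer:
# one low-pass GCN-style channel plus several band-pass scattering channels,
# fused per node by attention over the channel outputs.
import numpy as np

def gcn_filter(A):
    """Low-pass GCN-style filter D^{-1/2} (A + I) D^{-1/2} (local averaging)."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * np.outer(d_inv_sqrt, d_inv_sqrt)

def diffusion_wavelets(A, J=3):
    """Band-pass wavelets Psi_j = P^(2^(j-1)) - P^(2^j) built from a lazy
    random-walk matrix P = (I + A D^{-1}) / 2 (one common convention;
    assumes no isolated nodes)."""
    n = A.shape[0]
    P = 0.5 * (np.eye(n) + A / A.sum(axis=0, keepdims=True))
    return [np.linalg.matrix_power(P, 2 ** (j - 1)) - np.linalg.matrix_power(P, 2 ** j)
            for j in range(1, J + 1)]

def hybrid_layer(A, X, W_low, W_band, a):
    """One hybrid layer: a low-pass channel and several band-pass channels,
    combined per node with softmax attention over the channel outputs."""
    channels = [np.tanh(gcn_filter(A) @ X @ W_low)]          # low-pass channel
    channels += [np.tanh(np.abs(Psi @ X) @ W_band)           # scattering-style
                 for Psi in diffusion_wavelets(A)]           # band-pass channels
    H = np.stack(channels, axis=1)              # (nodes, channels, features)
    scores = H @ a                              # per-node, per-channel scores
    alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)   # node-level attention weights
    return (alpha[..., None] * H).sum(axis=1)   # (nodes, features)

# Toy run on an 8-node ring graph (every node has degree 2, so no isolated nodes).
rng = np.random.default_rng(0)
n, f_in, f_out = 8, 5, 4
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
X = rng.normal(size=(n, f_in))
out = hybrid_layer(A, X,
                   W_low=rng.normal(size=(f_in, f_out)),
                   W_band=rng.normal(size=(f_in, f_out)),
                   a=rng.normal(size=(f_out,)))
print(out.shape)  # (8, 4)
```

Stacking such layers and learning W_low, W_band, and a by backpropagation would give a trainable variant; the paper's actual parameterization and attention mechanism may differ.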
Related papers
- Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- Spiking Graph Convolutional Networks [19.36064180392385]
SpikingGCN is an end-to-end framework that aims to integrate the embedding of GCNs with the biofidelity characteristics of SNNs.
We show that SpikingGCN on a neuromorphic chip can bring a clear advantage of energy efficiency into graph data analysis.
arXiv Detail & Related papers (2022-05-05T16:44:36Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph
Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that re-organises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters; a generic filter of this form is sketched after this list.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph
Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
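Several entries above refer to banks of graph convolutional filters and to the diffusion wavelets behind geometric scattering. As a reading aid, here is one standard formulation of these objects; it is an assumed, textbook-style form, not copied from any specific paper listed here.

```latex
% Polynomial graph filter on a shift operator S (adjacency or Laplacian).
% Relabeling the nodes by a permutation matrix \Pi relabels the output the
% same way (permutation equivariance):
\[
  H(S)\,x = \sum_{k=0}^{K} h_k\, S^{k} x,
  \qquad
  H\bigl(\Pi^{\top} S \Pi\bigr)\,\bigl(\Pi^{\top} x\bigr) = \Pi^{\top} H(S)\, x .
\]
% Diffusion wavelets of the kind used in geometric scattering are band-pass
% differences of dyadic powers of a lazy random-walk matrix, e.g.
% P = \tfrac{1}{2}\bigl(I + A D^{-1}\bigr):
\[
  \Psi_0 = I - P,
  \qquad
  \Psi_j = P^{2^{j-1}} - P^{2^{j}}, \quad 1 \le j \le J,
  \qquad
  \text{with features such as } \bigl|\Psi_{j'}\,|\Psi_j\, x|\,\bigr| .
\]
```

The first identity is what the filter-bank entries mean by permutation equivariance; the wavelets Psi_j are the band-pass operators that the scattering-based entries, and the abstract above, build on.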
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.