Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph
Wavelets
- URL: http://arxiv.org/abs/2108.01660v2
- Date: Wed, 4 Aug 2021 07:04:07 GMT
- Title: Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph
Wavelets
- Authors: Mingxing Xu, Wenrui Dai, Chenglin Li, Junni Zou, Hongkai Xiong and
Pascal Frossard
- Abstract summary: Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
- Score: 81.63035727821145
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spectral graph convolutional networks (SGCNs) have been attracting increasing
attention in graph representation learning partly due to their interpretability
through the prism of the established graph signal processing framework.
However, existing SGCNs are limited to implementing graph convolutions with
rigid transforms that cannot adapt to the signals residing on graphs or to the
tasks at hand. In this paper, we propose a novel class of spectral graph
convolutional networks that implement graph convolutions with adaptive graph
wavelets. Specifically, the adaptive graph wavelets are learned with neural
network-parameterized lifting structures, where structure-aware attention-based
lifting operations are developed to jointly consider graph structures and node
features. We propose to lift based on diffusion wavelets to alleviate the
structural information loss induced by partitioning non-bipartite graphs. By
design, the locality and sparsity of the resulting wavelet transform as well as
the scalability of the lifting structure for large and varying-size graphs are
guaranteed. We further derive a soft-thresholding filtering operation by
learning sparse graph representations in terms of the learned wavelets, which
improves the scalability and interpretability, and yields a localized, efficient
and scalable spectral graph convolution. To ensure that the learned graph
representations are invariant to node permutations, a layer is employed at the
input of the networks to reorder the nodes according to their local topology
information. We evaluate the proposed networks in both node-level and
graph-level representation learning tasks on benchmark citation and
bioinformatics graph datasets. Extensive experiments demonstrate the
superiority of the proposed networks over existing SGCNs in terms of accuracy,
efficiency and scalability.
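The abstract describes three mechanisms: attention-based predict/update lifting operations, soft-thresholding of the resulting wavelet (detail) coefficients, and an input layer that reorders nodes for permutation invariance. As a reading aid, the PyTorch sketch below illustrates one generic lifting step with attention-based predict and update operators followed by soft-thresholding of the details. It is a minimal illustration under simplifying assumptions, not the authors' implementation: it uses a classical even/odd node split and a dense adjacency matrix, whereas the paper lifts on diffusion wavelets precisely to avoid partitioning non-bipartite graphs; names such as LiftingStep, soft_threshold and tau are hypothetical.

```python
# Minimal, self-contained sketch of one attention-based lifting step on a graph
# signal (illustrative only; not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


def soft_threshold(x, tau):
    """Soft-thresholding: shrinks coefficients toward zero, promoting sparsity."""
    return torch.sign(x) * F.relu(torch.abs(x) - tau)


class LiftingStep(nn.Module):
    """One predict/update lifting step over a fixed even/odd node partition.

    Predict: estimate 'odd' node features from their 'even' neighbours; the
    residual forms the detail (wavelet) coefficients.
    Update: smooth the 'even' node features with the details to obtain the
    approximation (scaling) coefficients.
    Edge-restricted attention makes both operators depend jointly on the graph
    structure (adjacency mask) and the node features.
    """

    def __init__(self, dim):
        super().__init__()
        self.att_predict = nn.Linear(2 * dim, 1)    # edge attention for predict
        self.att_update = nn.Linear(2 * dim, 1)     # edge attention for update
        self.tau = nn.Parameter(torch.tensor(0.1))  # learnable threshold

    def _edge_attention(self, att, x_src, x_dst, mask):
        # Pairwise attention logits, restricted to existing edges (mask == 1).
        n_src, n_dst = x_src.size(0), x_dst.size(0)
        pairs = torch.cat(
            [x_src.unsqueeze(1).expand(n_src, n_dst, -1),
             x_dst.unsqueeze(0).expand(n_src, n_dst, -1)], dim=-1)
        logits = att(pairs).squeeze(-1)
        logits = logits.masked_fill(mask == 0, float("-inf"))
        # Rows with no neighbours produce NaN after softmax; map them to zero.
        return torch.nan_to_num(torch.softmax(logits, dim=-1))

    def forward(self, x, adj, even_idx, odd_idx):
        x_even, x_odd = x[even_idx], x[odd_idx]
        adj_oe = adj[odd_idx][:, even_idx]   # odd -> even connectivity
        adj_eo = adj[even_idx][:, odd_idx]   # even -> odd connectivity

        # Predict: detail = odd - attention-weighted average of even neighbours.
        w_p = self._edge_attention(self.att_predict, x_odd, x_even, adj_oe)
        detail = x_odd - w_p @ x_even

        # Sparsify the detail coefficients in the learned wavelet domain.
        detail = soft_threshold(detail, F.relu(self.tau))

        # Update: approximation = even + attention-weighted average of details.
        w_u = self._edge_attention(self.att_update, x_even, detail, adj_eo)
        approx = x_even + w_u @ detail
        return approx, detail


# Toy usage: a 6-node undirected graph with 4-dimensional node features.
torch.manual_seed(0)
x = torch.randn(6, 4)
adj = (torch.rand(6, 6) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()          # symmetrise
step = LiftingStep(dim=4)
approx, detail = step(x, adj,
                      even_idx=torch.tensor([0, 2, 4]),
                      odd_idx=torch.tensor([1, 3, 5]))
```

In the paper, the analogous predict/update operators act on diffusion-wavelet coefficients rather than a node partition, and the learned transform is kept local and sparse so that it scales to large and varying-size graphs.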
Related papers
- Improving Graph Neural Networks by Learning Continuous Edge Directions [0.0]
Graph Neural Networks (GNNs) traditionally employ a message-passing mechanism that resembles diffusion over undirected graphs.
Our key insight is to assign fuzzy edge directions to the edges of a graph so that features can preferentially flow in one direction between nodes.
We propose a general framework, called Continuous Edge Direction (CoED) GNN, for learning on graphs with fuzzy edges.
arXiv Detail & Related papers (2024-10-18T01:34:35Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
- Deconvolutional Networks on Graph Data [33.95030071792298]
We propose Graph Deconvolutional Network (GDN) and motivate the design of GDN via a combination of inverse filters in spectral domain and de-noising layers in wavelet domain.
We demonstrate the effectiveness of the proposed method on several tasks including graph feature imputation and graph structure generation.
arXiv Detail & Related papers (2021-10-29T04:02:06Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address this issue by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- How Framelets Enhance Graph Neural Networks [27.540282741523253]
This paper presents a new approach for assembling graph neural networks based on framelet transforms.
We propose shrinkage as a new activation for the framelet convolution, which thresholds the high-frequency information at different scales (see the shrinkage sketch after this list).
arXiv Detail & Related papers (2021-02-13T19:19:19Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)
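Several of the related papers above rely on thresholding wavelet- or framelet-domain coefficients; the shrinkage activation mentioned in the framelet entry is essentially soft-thresholding applied scale by scale. The PyTorch snippet below is a generic illustration of that idea, not code from any of the cited papers; the class name ScaleShrinkage and the per-scale learnable thresholds are illustrative assumptions.

```python
# Generic per-scale soft-threshold shrinkage (illustrative, not from the cited papers).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScaleShrinkage(nn.Module):
    """Soft-thresholds high-frequency coefficients with one learnable threshold
    per scale: small coefficients are set to zero, the rest are shrunk."""

    def __init__(self, num_scales):
        super().__init__()
        # One unconstrained parameter per scale, mapped to a positive threshold.
        self.raw_thresholds = nn.Parameter(torch.zeros(num_scales))

    def forward(self, coeffs):
        out = []
        for c, raw_t in zip(coeffs, self.raw_thresholds):
            t = F.softplus(raw_t)                        # positive threshold for this scale
            out.append(torch.sign(c) * F.relu(torch.abs(c) - t))
        return out


# Toy usage: three scales of coefficients for 5 nodes with 8 features each.
shrink = ScaleShrinkage(num_scales=3)
high_freq = [torch.randn(5, 8) for _ in range(3)]
sparsified = shrink(high_freq)
```

Keeping a separate threshold per scale lets coarse and fine frequency bands be sparsified to different degrees, which matches the behaviour described in the framelet entry.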
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.