Deconvolutional Networks on Graph Data
- URL: http://arxiv.org/abs/2110.15528v1
- Date: Fri, 29 Oct 2021 04:02:06 GMT
- Title: Deconvolutional Networks on Graph Data
- Authors: Jia Li, Jiajin Li, Yang Liu, Jianwei Yu, Yueting Li, Hong Cheng
- Abstract summary: We propose Graph Deconvolutional Network (GDN) and motivate the design of GDN via a combination of inverse filters in spectral domain and de-noising layers in wavelet domain.
We demonstrate the effectiveness of the proposed method on several tasks including graph feature imputation and graph structure generation.
- Score: 33.95030071792298
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we consider an inverse problem in the graph learning
domain: "given the graph representations smoothed by a Graph Convolutional
Network (GCN), how can we reconstruct the input graph signal?" We propose Graph
Deconvolutional Network (GDN) and motivate the design of GDN via a combination
of inverse filters in spectral domain and de-noising layers in wavelet domain,
as the inverse operation results in a high-frequency amplifier and may amplify
the noise. We demonstrate the effectiveness of the proposed method on several
tasks including graph feature imputation and graph structure generation.
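The inverse-filter idea in the abstract can be illustrated with a small numerical sketch. This is a toy under stated assumptions, not the paper's actual architecture: we assume a GCN-style low-pass response h(λ) = 1 − λ/2 on the spectrum of the symmetric normalized Laplacian of a 4-node path graph, and damp the naive inverse 1/h(λ) Tikhonov-style so the high-frequency amplification does not blow up where h(λ) ≈ 0.

```python
import numpy as np

# Toy undirected graph: 4-node path graph adjacency matrix.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt

# Spectral decomposition; eigenvalues lie in [0, 2].
lam, U = np.linalg.eigh(L)

x = np.array([1.0, -2.0, 3.0, 0.5])   # input graph signal

# Assumed GCN-style low-pass response h(lambda) = 1 - lambda/2.
h = 1.0 - lam / 2.0
x_smooth = U @ np.diag(h) @ U.T @ x   # "smoothed" representation

# Naive inverse 1/h(lambda) is a high-frequency amplifier; a small
# Tikhonov-style damping eps keeps near-zero responses from exploding noise.
eps = 1e-6
h_inv = h / (h**2 + eps)
x_rec = U @ np.diag(h_inv) @ U.T @ x_smooth
```

The component at λ = 2 (where h(λ) = 0) is unrecoverable and gets suppressed rather than amplified; everything else is reconstructed almost exactly. This regularized spectral inversion is one simple stand-in for the de-noising role the paper assigns to its wavelet-domain layers.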
Related papers
- Signed Graph Neural Networks: A Frequency Perspective [14.386571627652975]
Graph convolutional networks (GCNs) are designed for unsigned graphs containing only positive links.
We propose two different signed graph neural networks: one keeps only low-frequency information, while the other also retains high-frequency information.
arXiv Detail & Related papers (2022-08-15T16:42:18Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN)
GDNs can learn a distribution of graphs in a supervised fashion, can perform link prediction or edge-weight regression by adapting the loss function, and are inherently inductive.
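The unrolled proximal-gradient idea behind this (unrelated, identically named) GDN can be sketched in a few lines. This is a toy with hand-fixed mixing coefficients and a plain l1-regularized ISTA loop; the paper's architecture learns its per-layer parameters end-to-end, which we do not attempt here.

```python
import numpy as np

def soft_threshold(X, t):
    """Proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def unrolled_deconvolution(A_obs, h0, h1, n_layers=100, step=0.1, lam=0.01):
    """Recover a latent graph A from an observed first-order mixture
    A_obs ~= h0*I + h1*A by unrolling proximal gradient iterations."""
    A = np.zeros_like(A_obs)
    I = np.eye(A_obs.shape[0])
    for _ in range(n_layers):
        # Gradient of 0.5*||h0*I + h1*A - A_obs||^2 w.r.t. A is h1*residual.
        residual = h0 * I + h1 * A - A_obs
        A = soft_threshold(A - step * h1 * residual, step * lam)
    return A

# Latent 3-node triangle graph, observed through a first-order mixture.
A_true = np.array([[0., 1., 1.],
                   [1., 0., 1.],
                   [1., 1., 0.]])
A_obs = 0.5 * np.eye(3) + 0.8 * A_true
A_hat = unrolled_deconvolution(A_obs, h0=0.5, h1=0.8)
```

Truncating the loop at a fixed depth and letting the step sizes, thresholds, and mixing coefficients become trainable per iteration is what turns this classical solver into a parameterized neural network.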
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Adaptive Filters in Graph Convolutional Neural Networks [0.0]
Graph Neural Networks (GNNs) have attracted considerable interest because of their potential for processing graph-structured data.
This paper presents a novel method that adapts the behaviour of a ConvGNN to the input by performing spatial convolution on graphs.
arXiv Detail & Related papers (2021-05-21T14:36:39Z)
- Graph Autoencoders with Deconvolutional Networks [32.78113728062279]
Graph Deconvolutional Networks (GDNs) reconstruct graph signals from smoothed node representations.
We motivate the design of Graph Deconvolutional Networks via a combination of inverse filters in spectral domain and de-noising layers in wavelet domain.
Based on the proposed GDN, we propose a graph autoencoder framework that first encodes smoothed graph representations with GCN and then decodes accurate graph signals with GDN.
arXiv Detail & Related papers (2020-12-22T09:49:39Z)
- Unrolling of Deep Graph Total Variation for Image Denoising [106.93258903150702]
In this paper, we combine classical graph signal filtering with deep feature learning into a competitive hybrid design.
We employ interpretable analytical low-pass graph filters and use 80% fewer network parameters than the state-of-the-art DL denoising scheme DnCNN.
arXiv Detail & Related papers (2020-10-21T20:04:22Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
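The permutation-equivariance property claimed for graph convolutional filters is easy to verify numerically. The sketch below uses a toy symmetric shift operator and filter taps of our own choosing, and checks that relabeling the nodes before filtering equals relabeling the filter output:

```python
import numpy as np

rng = np.random.default_rng(0)

def graph_filter(S, x, h):
    """Polynomial graph filter y = sum_k h[k] * S^k @ x."""
    y = np.zeros_like(x)
    Sk = np.eye(S.shape[0])
    for hk in h:
        y += hk * (Sk @ x)
        Sk = Sk @ S        # next power of the shift operator
    return y

n = 5
A = rng.random((n, n))
S = (A + A.T) / 2           # symmetric graph shift operator (toy)
x = rng.standard_normal(n)  # graph signal
h = [0.5, 0.3, 0.2]         # filter taps (arbitrary)

P = np.eye(n)[rng.permutation(n)]   # random permutation matrix

# Filter on the relabeled graph with the relabeled signal...
y_perm = graph_filter(P @ S @ P.T, P @ x, h)
# ...versus relabeling the output of the filter on the original graph.
y = graph_filter(S, x, h)
```

The two results coincide because (P S Pᵀ)ᵏ P x = P Sᵏ x for every power k, so any polynomial in the shift operator commutes with node relabeling.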
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
- Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons, which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.