Graph Autoencoders with Deconvolutional Networks
- URL: http://arxiv.org/abs/2012.11898v1
- Date: Tue, 22 Dec 2020 09:49:39 GMT
- Title: Graph Autoencoders with Deconvolutional Networks
- Authors: Jia Li, Tomas Yu, Da-Cheng Juan, Arjun Gopalan, Hong Cheng, Andrew
Tomkins
- Abstract summary: Graph Deconvolutional Networks (GDNs) reconstruct graph signals from smoothed node representations.
We motivate the design of Graph Deconvolutional Networks via a combination of inverse filters in spectral domain and de-noising layers in wavelet domain.
Based on the proposed GDN, we propose a graph autoencoder framework that first encodes smoothed graph representations with GCN and then decodes accurate graph signals with GDN.
- Score: 32.78113728062279
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent studies have indicated that Graph Convolutional Networks (GCNs) act as
a low-pass filter in spectral domain and encode smoothed node
representations. In this paper, we consider their opposite, namely Graph
Deconvolutional Networks (GDNs) that reconstruct graph signals from smoothed
node representations. We motivate the design of Graph Deconvolutional Networks
via a combination of inverse filters in spectral domain and de-noising layers
in wavelet domain, as the inverse operation results in a high-pass
filter and may amplify the noise. Based on the proposed GDN, we further propose
a graph autoencoder framework that first encodes smoothed graph representations
with GCN and then decodes accurate graph signals with GDN. We demonstrate the
effectiveness of the proposed method on several tasks including unsupervised
graph-level representation learning, social recommendation and graph generation.
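To make the encode/decode pipeline concrete, below is a minimal NumPy sketch of the idea, not the authors' implementation: a one-step GCN-style low-pass encoder, followed by a decoder that inverts the smoothing filter with Tikhonov regularization and applies soft-threshold de-noising in the graph Fourier basis as a stand-in for the wavelet-domain layer. The filter form, the regularizer eps and the threshold tau are illustrative assumptions.

```python
# Minimal NumPy sketch of the abstract's encode/decode pipeline (an
# illustration, not the authors' implementation). Assumptions: a one-step
# (I - L) smoothing filter as the GCN encoder, Tikhonov-regularized inversion
# of that filter, and soft-thresholding of graph Fourier coefficients as a
# stand-in for the wavelet-domain de-noising layer.
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(A.shape[0]) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def gcn_encode(A, X):
    """Low-pass smoothing: one propagation step H = (I - L) X."""
    return (np.eye(A.shape[0]) - normalized_laplacian(A)) @ X

def gdn_decode(A, H, eps=0.1, tau=0.05):
    """Regularized inverse filtering plus spectral soft-threshold de-noising."""
    n = A.shape[0]
    L = normalized_laplacian(A)
    smooth = np.eye(n) - L
    # Inverting the low-pass filter is a high-pass operation that amplifies
    # noise, hence the eps * I regularizer.
    X_hat = np.linalg.solve(smooth.T @ smooth + eps * np.eye(n), smooth.T @ H)
    # De-noise by soft-thresholding coefficients in the graph Fourier basis.
    _, U = np.linalg.eigh(L)
    C = U.T @ X_hat
    return U @ (np.sign(C) * np.maximum(np.abs(C) - tau, 0.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    X = rng.normal(size=(4, 3))       # raw node signals
    H = gcn_encode(A, X)              # smoothed representations (encoder)
    X_rec = gdn_decode(A, H)          # reconstructed signals (decoder)
    print(float(np.linalg.norm(X - X_rec)))
```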
Related papers
- UniG-Encoder: A Universal Feature Encoder for Graph and Hypergraph Node
Classification [6.977634174845066]
A universal feature encoder for both graph and hypergraph representation learning is designed, called UniG-Encoder.
The architecture starts with a forward transformation of the topological relationships of connected nodes into edge or hyperedge features.
The encoded node embeddings are then derived from the reversed transformation, described by the transpose of the projection matrix.
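A rough sketch of this forward/reverse projection (hedged: the projection matrix is taken to be a simple averaging node-to-edge incidence matrix and a single ReLU layer acts on edge features; the paper's actual construction may differ):

```python
# Hedged sketch of a UniG-Encoder-style projection: nodes -> edge features via
# a projection matrix P, then node embeddings via the transpose P.T. P here is
# an assumed (averaging) incidence matrix, not the paper's exact choice.
import numpy as np

def unig_style_encode(edges, X, W):
    """edges: list of (u, v) index pairs; X: node features; W: edge-layer weights."""
    n, m = X.shape[0], len(edges)
    P = np.zeros((m, n))
    for e, (u, v) in enumerate(edges):
        P[e, u] = P[e, v] = 0.5            # each edge averages its endpoints
    E = np.maximum((P @ X) @ W, 0.0)       # forward transform + ReLU edge layer
    return P.T @ E                         # reverse transform via the transpose

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
    X, W = rng.normal(size=(4, 8)), rng.normal(size=(8, 16))
    print(unig_style_encode(edges, X, W).shape)   # (4, 16) node embeddings
```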
arXiv Detail & Related papers (2023-08-03T09:32:50Z)
- Signed Graph Neural Networks: A Frequency Perspective [14.386571627652975]
Graph convolutional networks (GCNs) are designed for unsigned graphs containing only positive links.
We propose two different signed graph neural networks, one that keeps only low-frequency information and one that also retains high-frequency information.
arXiv Detail & Related papers (2022-08-15T16:42:18Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
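For intuition, a hedged sketch of the kind of iteration such an architecture unrolls (the second-order mixture model, the soft-threshold prox step, and all constants below are illustrative assumptions, not the paper's model):

```python
# Hedged sketch: recover a sparse latent adjacency A from an observed matrix O
# assumed to satisfy O ≈ h1*A + h2*A@A, via truncated proximal gradient steps.
# A learned GDN would replace these fixed steps with parameterized layers.
import numpy as np

def soft_threshold(X, tau):
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def unrolled_deconvolution(O, h1=1.0, h2=0.5, steps=30, lr=0.05, tau=0.01):
    A = np.zeros_like(O)
    for _ in range(steps):
        R = h1 * A + h2 * (A @ A) - O                     # mixture residual
        grad = 2.0 * (h1 * R + h2 * (R @ A.T + A.T @ R))  # gradient of the fit
        A = soft_threshold(A - lr * grad, tau)            # prox (sparsity) step
        A = np.clip((A + A.T) / 2.0, 0.0, None)           # symmetric, non-negative
    return A

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A_true = np.triu((rng.random((6, 6)) < 0.3).astype(float), 1)
    A_true = A_true + A_true.T
    O = A_true + 0.5 * (A_true @ A_true)                  # observed "mixture"
    print(np.round(unrolled_deconvolution(O), 2))
```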
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Deconvolutional Networks on Graph Data [33.95030071792298]
We propose Graph Deconvolutional Network (GDN) and motivate the design of GDN via a combination of inverse filters in spectral domain and de-noising layers in wavelet domain.
We demonstrate the effectiveness of the proposed method on several tasks including graph feature imputation and graph structure generation.
arXiv Detail & Related papers (2021-10-29T04:02:06Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Beyond Low-pass Filtering: Graph Convolutional Networks with Automatic Filtering [61.315598419655224]
We propose Automatic Graph Convolutional Networks (AutoGCN) to capture the full spectrum of graph signals.
While it is based on graph spectral theory, our AutoGCN is also localized in space and has a spatial form.
arXiv Detail & Related papers (2021-07-10T04:11:25Z)
- BiGCN: A Bi-directional Low-Pass Filtering Graph Neural Network [35.97496022085212]
Many graph convolutional networks can be regarded as low-pass filters for graph signals.
We propose a new model, BiGCN, which represents a graph neural network as a bi-directional low-pass filter.
Our model outperforms previous graph neural networks in the tasks of node classification and link prediction on most of the benchmark datasets.
arXiv Detail & Related papers (2021-01-14T09:41:00Z)
- Inverse Graph Identification: Can We Identify Node Labels Given Graph Labels? [89.13567439679709]
Graph Identification (GI) has long been researched in graph learning and is essential in certain applications.
This paper defines a novel problem dubbed Inverse Graph Identification (IGI).
We propose a simple yet effective method that performs node-level message passing with a Graph Attention Network (GAT) under the protocol of GI.
arXiv Detail & Related papers (2020-07-12T12:06:17Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
- Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons which preserves the spectral properties of the graph.
arXiv Detail & Related papers (2020-03-03T21:04:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.