AEGCN: An Autoencoder-Constrained Graph Convolutional Network
- URL: http://arxiv.org/abs/2007.03424v3
- Date: Wed, 10 Feb 2021 09:40:28 GMT
- Title: AEGCN: An Autoencoder-Constrained Graph Convolutional Network
- Authors: Mingyuan Ma, Sen Na, Hongyu Wang
- Abstract summary: We propose a novel neural network architecture, called the autoencoder-constrained graph convolutional network.
The core of this model is a convolutional network operating directly on graphs, whose hidden layers are constrained by an autoencoder.
We show that adding autoencoder constraints significantly improves the performance of graph convolutional networks.
- Score: 5.023274927781062
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel neural network architecture, called the
autoencoder-constrained graph convolutional network, to solve the node
classification task on graph domains. As its name suggests, the core of this
model is a convolutional network operating directly on graphs, whose hidden
layers are constrained by an autoencoder. Compared with vanilla graph
convolutional networks, the autoencoder step is added to reduce the
information loss caused by Laplacian smoothing. We consider applying our model
to both homogeneous and heterogeneous graphs. For homogeneous graphs, the
autoencoder approximates the adjacency matrix of the input graph, taking the
hidden layer representations as the encoder and a one-layer graph
convolutional network as the decoder. For heterogeneous graphs, since there
are multiple adjacency matrices corresponding to different edge types, the
autoencoder instead approximates the feature matrix of the input graph, and
the encoder is changed to a specially designed two-layer multi-channel
pre-processing network. In both cases, the error incurred in the autoencoder
approximation enters the loss function as a penalty term. In extensive
experiments on citation networks and other heterogeneous graphs, we
demonstrate that adding autoencoder constraints significantly improves the
performance of graph convolutional networks. Further, we observe that our
technique can also be applied to graph attention networks to improve their
performance. This reveals the wide applicability of the proposed autoencoder
technique.
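To make the penalty term concrete, here is a minimal sketch of the
homogeneous-graph case in PyTorch: a two-layer GCN whose hidden representation
doubles as the autoencoder's encoder, and a one-layer GCN decoder that
reconstructs the adjacency matrix. The class, layer shapes, and the trade-off
weight alpha are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AEGCNSketch(nn.Module):
    """Hypothetical sketch of the homogeneous-graph AEGCN idea."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)     # first GCN layer
        self.w2 = nn.Linear(hid_dim, n_classes)  # second GCN layer (classifier)
        self.wd = nn.Linear(hid_dim, hid_dim)    # one-layer GCN decoder

    def forward(self, x, a_norm):
        # a_norm: normalized adjacency, e.g. D^-1/2 (A + I) D^-1/2
        h = F.relu(a_norm @ self.w1(x))   # hidden layer = autoencoder encoder
        logits = a_norm @ self.w2(h)      # classification branch
        z = a_norm @ self.wd(h)           # decoder branch (one GCN layer)
        a_hat = torch.sigmoid(z @ z.t())  # reconstructed adjacency matrix
        return logits, a_hat

def aegcn_loss(logits, labels, a_hat, a_true, alpha=0.5):
    # node-classification loss plus the autoencoder reconstruction error
    # as a penalty term; alpha is an assumed trade-off weight
    ce = F.cross_entropy(logits, labels)
    recon = F.binary_cross_entropy(a_hat, a_true)
    return ce + alpha * recon
```

In the heterogeneous case described above, the reconstruction target would be
the feature matrix rather than the adjacency matrix, and the encoder would be
replaced by a two-layer multi-channel pre-processing network.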
Related papers
- Learning Network Representations with Disentangled Graph Auto-Encoder [1.671868610555776]
We introduce the Disentangled Graph Auto-Encoder (DGA) and the Disentangled Variational Graph Auto-Encoder (DVGA).
DGA is a convolutional network with multi-channel message-passing layers serving as the encoder.
DVGA uses a factor-wise decoder that takes the characteristics of disentangled representations into account.
arXiv Detail & Related papers (2024-02-02T04:52:52Z)
- Deep Manifold Graph Auto-Encoder for Attributed Graph Embedding [51.75091298017941]
This paper proposes a novel Deep Manifold (Variational) Graph Auto-Encoder (DMVGAE/DMGAE) for attributed graph data.
The proposed method surpasses state-of-the-art baseline algorithms by a significant margin on different downstream tasks across popular datasets.
arXiv Detail & Related papers (2024-01-12T17:57:07Z)
- UniG-Encoder: A Universal Feature Encoder for Graph and Hypergraph Node Classification [6.977634174845066]
UniG-Encoder is a universal feature encoder designed for both graph and hypergraph representation learning.
The architecture starts with a forward transformation of the topological relationships of connected nodes into edge or hyperedge features.
The encoded node embeddings are then derived from the reversed transformation, described by the transpose of the projection matrix.
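As a toy illustration of the projection-and-transpose idea summarized above
(the incidence-style matrix P, its weights, and the stand-in encoder below are
assumptions, not the paper's design):

```python
import torch

# Toy sketch: a projection matrix P maps node features to (hyper)edge
# features (forward transformation); its transpose P^T maps processed edge
# features back to node embeddings (reversed transformation).
n_nodes, n_edges, dim = 5, 3, 8
P = torch.zeros(n_edges, n_nodes)  # incidence-style projection (assumed form)
P[0, [0, 1]] = 0.5                 # edge 0 joins nodes 0 and 1
P[1, [1, 2, 3]] = 1.0 / 3          # hyperedge 1 joins nodes 1, 2, 3
P[2, [3, 4]] = 0.5                 # edge 2 joins nodes 3 and 4

X = torch.randn(n_nodes, dim)      # input node features
edge_feats = torch.relu(P @ X)     # forward transformation + stand-in encoder
node_embeds = P.t() @ edge_feats   # reversed transformation via the transpose
```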
arXiv Detail & Related papers (2023-08-03T09:32:50Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
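The unrolling idea reads roughly as follows in a heavily simplified,
hypothetical sketch (the data-fit term, step sizes, and thresholds are
placeholders, not the GDN of the paper):

```python
import torch
import torch.nn as nn

class UnrolledProxGrad(nn.Module):
    # K truncated proximal gradient steps with learnable step sizes and
    # soft-threshold levels, applied to an observed adjacency matrix
    def __init__(self, k_iters=5):
        super().__init__()
        self.alpha = nn.Parameter(torch.full((k_iters,), 0.1))  # step sizes
        self.tau = nn.Parameter(torch.full((k_iters,), 0.05))   # thresholds

    def forward(self, a_obs):
        a = a_obs.clone()
        for alpha, tau in zip(self.alpha, self.tau):
            a = a - alpha * (a - a_obs)               # gradient step on a
                                                      # simple data-fit term
            a = torch.relu(a.abs() - tau) * a.sign()  # sparsity-promoting prox
        return a
```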
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Directed Graph Auto-Encoders [3.2873782624127843]
We introduce a new class of auto-encoders for directed graphs motivated by a direct extension of the Weisfeiler-Leman algorithm to pairs of node labels.
We demonstrate the ability of the proposed model to learn meaningful latent embeddings and achieve superior performance on the directed link prediction task.
arXiv Detail & Related papers (2022-02-25T01:19:47Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
A graphon is a nonparametric model that generates graphs of arbitrary size and can be easily induced from graphs.
We propose a novel framework called the graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
- Graph Autoencoders with Deconvolutional Networks [32.78113728062279]
Graph Deconvolutional Networks (GDNs) reconstruct graph signals from smoothed node representations.
We motivate the design of Graph Deconvolutional Networks via a combination of inverse filters in spectral domain and de-noising layers in wavelet domain.
Based on the proposed GDN, we propose a graph autoencoder framework that first encodes smoothed graph representations with GCN and then decodes accurate graph signals with GDN.
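A minimal sketch of that encode/decode pairing, assuming naive direct
inversion of the smoothing filter in place of the paper's spectral/wavelet
design:

```python
import torch

def sym_norm(a):
    # symmetrically normalized filter D^-1/2 (A + I) D^-1/2
    a_hat = a + torch.eye(a.shape[0])
    d_inv_sqrt = torch.diag(a_hat.sum(1).pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt

a = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])  # toy graph
x = torch.randn(3, 4)                # node signals
filt = sym_norm(a)
h = filt @ x                         # GCN-style smoothing (encoder side)
x_rec = torch.linalg.solve(filt, h)  # naive inverse filtering (decoder side)
```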
arXiv Detail & Related papers (2020-12-22T09:49:39Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cuts, we propose a new GNN variant named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Graph-Aware Transformer: Is Attention All Graphs Need? [5.240000443825077]
GRaph-Aware Transformer (GRAT) is the first Transformer-based model that can encode and decode whole graphs in an end-to-end fashion.
GRAT has shown very promising results, including state-of-the-art performance on four regression tasks in the QM9 benchmark.
arXiv Detail & Related papers (2020-06-09T12:13:56Z)
- Adaptive Graph Auto-Encoder for General Data Clustering [90.8576971748142]
Graph-based clustering plays an important role in the clustering literature.
Recent studies of graph convolutional neural networks have achieved impressive success on graph-structured data.
We propose a graph auto-encoder for general data clustering, which constructs the graph adaptively according to the generative perspective of graphs.
arXiv Detail & Related papers (2020-02-20T10:11:28Z)