Dirichlet Graph Variational Autoencoder
- URL: http://arxiv.org/abs/2010.04408v2
- Date: Wed, 18 Nov 2020 12:14:22 GMT
- Title: Dirichlet Graph Variational Autoencoder
- Authors: Jia Li, Jianwei Yu, Jiajin Li, Honglei Zhang, Kangfei Zhao, Yu Rong,
Hong Cheng, Junzhou Huang
- Abstract summary: We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new GNN variant, Heatts, to encode the input graph into cluster memberships.
- Score: 65.94744123832338
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) and Variational Autoencoders (VAEs) have been
widely used in modeling and generating graphs with latent factors. However,
there is no clear explanation of what these latent factors are and why they
perform well. In this work, we present Dirichlet Graph Variational Autoencoder
(DGVAE) with graph cluster memberships as latent factors. Our study connects
VAE-based graph generation and balanced graph cut, and provides a new way to
understand and improve the internal mechanism of VAE-based graph generation.
Specifically, we first interpret the reconstruction term of DGVAE as balanced
graph cut in a principled way. Furthermore, motivated by the low-pass
characteristics of balanced graph cut, we propose a new GNN variant named
Heatts to encode the input graph into cluster memberships. Heatts utilizes the
Taylor series for fast computation of heat kernels and has better low-pass
characteristics than Graph Convolutional Networks (GCNs). Through experiments on
graph generation and graph clustering, we demonstrate the effectiveness of our
proposed framework.
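To make the Heatts idea concrete: the heat kernel e^{-sL} of a normalized graph Laplacian L can be approximated by its truncated Taylor series, sum_{k=0}^{K} (-s)^k L^k / k!, applied directly to the node features. Below is a minimal NumPy sketch of this approximation (not the authors' implementation; the function name, scale s, and truncation order K are illustrative):

```python
import numpy as np

def heatts_features(A, X, s=1.0, K=5):
    """Approximate exp(-s * L) @ X with a K-term Taylor series, where L is
    the symmetric normalized Laplacian of the adjacency matrix A."""
    n = A.shape[0]
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d, 1.0) ** -0.5
    d_inv_sqrt[d == 0] = 0.0
    L = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    out = X.copy()                      # k = 0 term of the series
    term = X.copy()
    for k in range(1, K + 1):
        term = (-s / k) * (L @ term)    # term now holds (-s)^k L^k X / k!
        out = out + term
    return out

# Toy usage: a 4-node path graph with random node features.
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
X = np.random.default_rng(0).random((4, 3))
print(heatts_features(A, X).shape)      # (4, 3)
```

Accumulating term = (-s/k) L @ term avoids forming L^k explicitly, so each additional order costs only one matrix product.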
Related papers
- Theoretical Insights into Line Graph Transformation on Graph Learning [3.0574700762497744]
Line graph transformation has been widely studied in graph theory, where each node in a line graph corresponds to an edge in the original graph.
This has inspired a series of graph neural networks (GNNs) applied to transformed line graphs, which have proven effective in various graph representation learning tasks.
In this study, we focus on two types of graphs known to be challenging to the Weisfeiler-Leman (WL) tests: Cai-Fürer-Immerman (CFI) graphs and strongly regular graphs.
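As a quick illustration of the transformation itself, using networkx's nx.line_graph (the example graph is arbitrary):

```python
import networkx as nx

G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3)])
L = nx.line_graph(G)

# Each node of the line graph is an edge of the original graph.
print(sorted(L.nodes()))            # [(0, 1), (0, 2), (1, 2), (2, 3)]
# Two line-graph nodes are adjacent iff their edges share an endpoint in G.
print(L.has_edge((0, 1), (1, 2)))   # True
```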
arXiv Detail & Related papers (2024-10-21T16:04:50Z)
- Greener GRASS: Enhancing GNNs with Encoding, Rewiring, and Attention [12.409982249220812]
We introduce Graph Attention with Structures (GRASS), a novel GNN architecture, to enhance graph relative attention.
GRASS rewires the input graph by superimposing a random regular graph to achieve long-range information propagation.
It also employs a novel additive attention mechanism tailored for graph-structured data.
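A hedged sketch of the rewiring step described above, superimposing a random regular graph to shorten long-range paths (the degree d is illustrative; GRASS itself learns over the rewired graph with its attention mechanism):

```python
import networkx as nx

def superimpose_random_regular(G, d=3, seed=0):
    """Return a copy of G with the edges of a random d-regular graph added."""
    R = nx.random_regular_graph(d, G.number_of_nodes(), seed=seed)
    H = G.copy()
    mapping = dict(zip(R.nodes(), G.nodes()))   # map R's nodes onto G's nodes
    H.add_edges_from((mapping[u], mapping[v]) for u, v in R.edges())
    return H

G = nx.path_graph(8)                   # a long, poorly connected input graph
H = superimpose_random_regular(G, d=3)
print(nx.diameter(G), nx.diameter(H))  # diameter typically shrinks sharply
```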
arXiv Detail & Related papers (2024-07-08T06:21:56Z)
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
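The latent-diffusion component can be pictured with the standard closed-form forward-noising step of a DDPM, here applied to a vector standing in for a graph's latent code (purely illustrative; NGG's actual noise schedule and conditioning are specified in the paper):

```python
import numpy as np

def forward_diffuse(z0, t, betas, rng):
    """Sample z_t ~ q(z_t | z_0) = N(sqrt(abar_t) * z0, (1 - abar_t) * I)."""
    abar_t = np.cumprod(1.0 - betas)[t]
    noise = rng.standard_normal(z0.shape)
    return np.sqrt(abar_t) * z0 + np.sqrt(1.0 - abar_t) * noise

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)   # a common linear schedule
z0 = rng.standard_normal(16)            # latent code of one graph
z_mid = forward_diffuse(z0, t=500, betas=betas, rng=rng)
```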
arXiv Detail & Related papers (2024-03-03T15:28:47Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
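To illustrate the unrolled-deconvolution idea under a toy assumption that the observed graph is the mixture A_obs = A + c * A^2, a few truncated ISTA-style proximal gradient steps recover a sparse latent A (the constants c, step, lam, and K are illustrative; the real GDN learns its parameters end to end):

```python
import numpy as np

def soft_threshold(X, tau):
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def deconvolve(A_obs, c=0.1, step=0.1, lam=0.01, K=60):
    A = np.zeros_like(A_obs)
    for _ in range(K):                      # unrolled, truncated iterations
        R = A + c * A @ A - A_obs           # residual of the forward mixture
        grad = 2.0 * (R + c * (R @ A + A @ R))
        A = soft_threshold(A - step * grad, step * lam)
    return A

rng = np.random.default_rng(1)
A_true = np.triu((rng.random((10, 10)) < 0.2).astype(float), 1)
A_true = A_true + A_true.T                  # sparse symmetric latent graph
A_obs = A_true + 0.1 * A_true @ A_true      # observed convolutional mixture
print(np.abs(deconvolve(A_obs) - A_true).max())  # small, up to l1 shrinkage
```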
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e., kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
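For concreteness, a graph kernel in this sense is just an inner product between explicit structural feature maps; the toy example below uses a degree-histogram map (real kernels in this family include random walk and Weisfeiler-Leman kernels):

```python
import numpy as np
import networkx as nx

def degree_histogram_feature(G, max_degree=10):
    """Explicit feature map phi(G): counts of node degrees."""
    phi = np.zeros(max_degree + 1)
    for _, d in G.degree():
        phi[min(d, max_degree)] += 1
    return phi

def kernel(G1, G2):
    return float(degree_histogram_feature(G1) @ degree_histogram_feature(G2))

G1, G2 = nx.cycle_graph(5), nx.path_graph(5)
print(kernel(G1, G2))   # structural similarity, no node embeddings needed
```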
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- CCGG: A Deep Autoregressive Model for Class-Conditional Graph Generation [7.37333913697359]
We introduce the Class Conditioned Graph Generator (CCGG) to generate graphs with desired features.
CCGG outperforms existing conditional graph generation methods on various datasets.
It also manages to maintain the quality of the generated graphs in terms of distribution-based evaluation metrics.
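One common way to realize such class conditioning, sketched below, is to feed a learned class embedding into an autoregressive model at every generation step; the GRU backbone and dimensions here are illustrative stand-ins, not CCGG's exact architecture:

```python
import torch
import torch.nn as nn

class ConditionalGraphGenerator(nn.Module):
    def __init__(self, n_classes, max_nodes, hidden=64, class_dim=8):
        super().__init__()
        self.max_nodes = max_nodes
        self.embed = nn.Embedding(n_classes, class_dim)
        self.rnn = nn.GRUCell(max_nodes + class_dim, hidden)
        self.out = nn.Linear(hidden, max_nodes)   # edge logits for a new node

    def forward(self, label):
        c = self.embed(label)                     # (1, class_dim) conditioning
        h = torch.zeros(1, self.rnn.hidden_size)
        prev_row = torch.zeros(1, self.max_nodes)
        rows = []
        for _ in range(self.max_nodes):
            h = self.rnn(torch.cat([prev_row, c], dim=-1), h)
            prev_row = torch.bernoulli(torch.sigmoid(self.out(h)))
            rows.append(prev_row)
        # A real model would mask to the lower triangle and symmetrize.
        return torch.cat(rows, dim=0)             # (max_nodes, max_nodes)

gen = ConditionalGraphGenerator(n_classes=3, max_nodes=6)
adj = gen(torch.tensor([1]))                      # untrained sample, class 1
```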
arXiv Detail & Related papers (2021-10-07T21:24:07Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
A graphon is a nonparametric model that generates graphs of arbitrary size and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
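To make "graphon" concrete: it is a symmetric measurable function W: [0,1]^2 -> [0,1]; drawing latent positions u_i ~ U[0,1] and connecting i and j with probability W(u_i, u_j) yields graphs of any size. A minimal sampler with an illustrative two-block graphon:

```python
import numpy as np

def sample_from_graphon(W, n, rng):
    u = rng.random(n)                         # latent node positions
    P = W(u[:, None], u[None, :])             # pairwise edge probabilities
    A = (rng.random((n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T                            # symmetric, no self-loops

W = lambda x, y: np.where((x < 0.5) == (y < 0.5), 0.7, 0.1)  # 2-block graphon
rng = np.random.default_rng(0)
A10, A100 = sample_from_graphon(W, 10, rng), sample_from_graphon(W, 100, rng)
```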
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
- GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve strong performance on a variety of learning tasks over geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
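The quantity GraphSVX builds on can be approximated with the standard permutation-sampling Shapley estimator, sketched below for a generic set-valued score (this is the textbook estimator, not GraphSVX itself; `value` maps a retained subset of nodes/features to a model output):

```python
import random

def shapley_mc(value, players, n_samples=2000, seed=0):
    """Monte Carlo Shapley values via random permutations of the players."""
    rng = random.Random(seed)
    phi = {p: 0.0 for p in players}
    for _ in range(n_samples):
        order = players[:]
        rng.shuffle(order)
        coalition = set()
        for p in order:
            before = value(coalition)
            coalition.add(p)
            phi[p] += (value(coalition) - before) / n_samples
    return phi

# Toy score: 1 only when players 0 and 1 are both present.
value = lambda S: 1.0 if {0, 1} <= S else 0.0
print(shapley_mc(value, players=[0, 1, 2]))   # ~{0: 0.5, 1: 0.5, 2: 0.0}
```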
arXiv Detail & Related papers (2021-04-18T10:40:37Z)
- Adaptive Graph Auto-Encoder for General Data Clustering [90.8576971748142]
Graph-based clustering plays an important role in clustering research.
Recent studies on graph convolutional neural networks have achieved impressive success on graph-structured data.
We propose a graph auto-encoder for general data clustering that constructs the graph adaptively, guided by a generative perspective on graphs.
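The adaptive construction can be pictured as re-building a kNN graph from the current embeddings whenever they change; a simplified sketch (k and the data are illustrative):

```python
import numpy as np

def knn_graph(Z, k=5):
    """Binary kNN adjacency from row-wise embeddings Z."""
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)                 # exclude self-loops
    idx = np.argsort(D, axis=1)[:, :k]          # k nearest neighbours per node
    A = np.zeros_like(D)
    rows = np.repeat(np.arange(len(Z)), k)
    A[rows, idx.ravel()] = 1.0
    return np.maximum(A, A.T)                   # symmetrize

rng = np.random.default_rng(0)
Z = rng.standard_normal((50, 16))               # stand-in for learned embeddings
A = knn_graph(Z)                                # rebuilt each time Z is updated
```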
arXiv Detail & Related papers (2020-02-20T10:11:28Z)
- Graph Deconvolutional Generation [3.5138314002170192]
We focus on the modern equivalent of the Erdős-Rényi random graph model: the graph variational autoencoder (GVAE).
GVAE has difficulty matching the training distribution and relies on an expensive graph matching procedure.
We improve this class of models by building a message passing neural network into GVAE's encoder and decoder.
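A single message-passing layer of the kind the paper builds into the GVAE's encoder and decoder looks as follows in plain NumPy (weights and dimensions are illustrative):

```python
import numpy as np

def mpnn_layer(A, H, W_msg, W_upd):
    M = A @ (H @ W_msg)                 # sum of transformed neighbour messages
    return np.tanh(H @ W_upd + M)       # per-node state update

rng = np.random.default_rng(0)
n, d = 6, 8
A = np.triu((rng.random((n, n)) < 0.3).astype(float), 1)
A = A + A.T                             # undirected, no self-loops
H = rng.standard_normal((n, d))         # initial node states
W_msg, W_upd = rng.standard_normal((d, d)), rng.standard_normal((d, d))
H = mpnn_layer(A, H, W_msg, W_upd)      # one round of message passing
```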
arXiv Detail & Related papers (2020-02-14T04:37:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.