Generating the Graph Gestalt: Kernel-Regularized Graph Representation
Learning
- URL: http://arxiv.org/abs/2106.15239v1
- Date: Tue, 29 Jun 2021 10:48:28 GMT
- Authors: Kiarash Zahirnia, Ankita Sakhuja, Oliver Schulte, Parmis Nadaf, Ke Li,
Xia Hu
- Abstract summary: A complete scientific understanding of graph data should address both global and local structure.
We propose a joint model for both as complementary objectives in a graph VAE framework.
Our experiments demonstrate a significant improvement in the realism of the generated graph structures, typically by 1-2 orders of magnitude on graph structure metrics.
- Score: 47.506013386710954
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Recent work on graph generative models has made remarkable progress towards
generating increasingly realistic graphs, as measured by global graph features
such as degree distribution, density, and clustering coefficients. Deep
generative models have also made significant advances through better modelling
of the local correlations in the graph topology, which have been very useful
for predicting unobserved graph components, such as the existence of a link or
the class of a node, from nearby observed graph components. A complete
scientific understanding of graph data should address both global and local
structure. In this paper, we propose a joint model for both as complementary
objectives in a graph VAE framework. Global structure is captured by
incorporating graph kernels in a probabilistic model whose loss function is
closely related to the maximum mean discrepancy (MMD) between the global
structures of the reconstructed and the input graphs. The ELBO objective
derived from the model regularizes a standard local link reconstruction term
with an MMD term. Our experiments demonstrate a significant improvement in the
realism of the generated graph structures, typically by 1-2 orders of magnitude
on graph structure metrics, compared to leading graph VAE and GAN models. Local
link reconstruction improves as well in many cases.
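The MMD regularizer described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual kernel or loss; the Gaussian kernel over normalized degree histograms is an illustrative assumption standing in for the graph kernels the authors use, and `degree_histogram` is a hypothetical helper.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # RBF kernel between two graph-statistic vectors (illustrative choice)
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def mmd_squared(X, Y, sigma=1.0):
    """Biased estimate of squared MMD between two samples of
    graph-statistic vectors (lists of equal-length numpy arrays)."""
    k_xx = np.mean([gaussian_kernel(a, b, sigma) for a in X for b in X])
    k_yy = np.mean([gaussian_kernel(a, b, sigma) for a in Y for b in Y])
    k_xy = np.mean([gaussian_kernel(a, b, sigma) for a in X for b in Y])
    return k_xx + k_yy - 2 * k_xy

def degree_histogram(adj, max_degree=4):
    # Normalized degree histogram: one simple "global structure" statistic
    degrees = adj.sum(axis=1).astype(int)
    hist = np.bincount(degrees, minlength=max_degree + 1)[: max_degree + 1]
    return hist / hist.sum()
```

In the paper's setting, a term of this form would be added to the standard link-reconstruction loss in the ELBO, comparing statistics of reconstructed graphs against those of the input graphs; here it simply measures how far apart two degree distributions are.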
Related papers
- GraphGLOW: Universal and Generalizable Structure Learning for Graph
Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z)
- Structure-free Graph Condensation: From Large-scale Graphs to Condensed Graph-free Data [91.27527985415007]
Existing graph condensation methods rely on the joint optimization of nodes and structures in the condensed graph.
We advocate a new Structure-Free Graph Condensation paradigm, named SFGC, to distill a large-scale graph into a small-scale graph node set.
arXiv Detail & Related papers (2023-06-05T07:53:52Z)
- HiGen: Hierarchical Graph Generative Networks [2.3931689873603603]
Most real-world graphs exhibit a hierarchical structure, which is often overlooked by existing graph generation methods.
We propose a novel graph generative network that captures the hierarchical nature of graphs and successively generates the graph sub-structures in a coarse-to-fine fashion.
This modular approach enables scalable graph generation for large and complex graphs.
arXiv Detail & Related papers (2023-05-30T18:04:12Z)
- Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z)
- Graph Condensation via Receptive Field Distribution Matching [61.71711656856704]
This paper focuses on creating a small graph to represent the original graph, so that GNNs trained on the size-reduced graph can make accurate predictions.
We view the original graph as a distribution of receptive fields and aim to synthesize a small graph whose receptive fields share a similar distribution.
arXiv Detail & Related papers (2022-06-28T02:10:05Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
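The unrolling idea this summary refers to can be sketched in its generic form. This is not the GDN itself: it is plain ISTA for a sparse linear inverse problem, truncated to a fixed number of iterations, where each iteration plays the role of one network layer whose step size and threshold could be made learnable.

```python
import numpy as np

def soft_threshold(x, lam):
    # Proximal operator of the l1 norm
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def unrolled_ista(y, A, n_layers=50, lam=0.01):
    """Generic algorithm-unrolling sketch (not the GDN): a fixed,
    truncated number of ISTA iterations for y = A x with sparse x.
    Each loop pass corresponds to one layer of the unrolled network."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_layers):                # truncated, fixed depth
        x = soft_threshold(x + step * A.T @ (y - A @ x), step * lam)
    return x
```

Making `step` and `lam` trainable per layer, and replacing the linear maps with graph-structured ones, is the kind of move that turns such an iteration into a parameterized architecture.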
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Adaptive Graph Auto-Encoder for General Data Clustering [90.8576971748142]
Graph-based clustering plays an important role in the clustering area.
Recent studies about graph convolution neural networks have achieved impressive success on graph type data.
We propose a graph auto-encoder for general data clustering, which constructs the graph adaptively according to the generative perspective of graphs.
arXiv Detail & Related papers (2020-02-20T10:11:28Z)
- Graph Signal Processing -- Part III: Machine Learning on Graphs, from Graph Topology to Applications [19.29066508374268]
Part III of this monograph starts by addressing ways to learn graph topology.
A particular emphasis is on graph topology definition based on the correlation and precision matrices of the observed data.
For learning sparse graphs, the least absolute shrinkage and selection operator (LASSO) is employed.
arXiv Detail & Related papers (2020-01-02T13:14:27Z)
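The precision-matrix route to graph topology mentioned in the last summary can be sketched as follows. This is a simplified stand-in, not the monograph's LASSO estimator: it merely thresholds partial correlations derived from the inverse empirical covariance (a proper l1-penalized fit would use something like scikit-learn's `GraphicalLasso`), and the threshold and ridge values are illustrative.

```python
import numpy as np

def precision_graph(X, threshold=0.1):
    """Crude sketch: infer a graph over the columns of data matrix X
    by thresholding partial correlations from the precision matrix."""
    cov = np.cov(X, rowvar=False)                    # sample covariance
    # Small ridge term keeps the inversion well-conditioned
    prec = np.linalg.inv(cov + 1e-3 * np.eye(cov.shape[0]))
    # Partial correlations from the precision matrix
    d = np.sqrt(np.diag(prec))
    partial = -prec / np.outer(d, d)
    np.fill_diagonal(partial, 0.0)
    # Edge wherever the partial correlation magnitude exceeds the threshold
    return (np.abs(partial) > threshold).astype(int)
```

With data where one variable is a noisy copy of another and a third is independent, the thresholded partial correlations recover an edge only between the dependent pair.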
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.