AgraSSt: Approximate Graph Stein Statistics for Interpretable Assessment
of Implicit Graph Generators
- URL: http://arxiv.org/abs/2203.03673v4
- Date: Tue, 1 Aug 2023 05:01:34 GMT
- Title: AgraSSt: Approximate Graph Stein Statistics for Interpretable Assessment
of Implicit Graph Generators
- Authors: Wenkai Xu and Gesine Reinert
- Abstract summary: We propose and analyse a novel statistical procedure, coined AgraSSt, to assess the quality of graph generators.
In particular, AgraSSt can be used to determine whether a learnt graph generating process is capable of generating graphs that resemble a given input graph.
We provide empirical results on both synthetic input graphs with known graph generation procedures, and real-world input graphs that the state-of-the-art (deep) generative models for graphs are trained on.
- Score: 10.616967871198689
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose and analyse a novel statistical procedure, coined AgraSSt, to
assess the quality of graph generators that may not be available in explicit
form. In particular, AgraSSt can be used to determine whether a learnt graph
generating process is capable of generating graphs that resemble a given input
graph. Inspired by Stein operators for random graphs, the key idea of AgraSSt
is the construction of a kernel discrepancy based on an operator obtained from
the graph generator. AgraSSt can provide interpretable criticisms for a graph
generator training procedure and help identify reliable sample batches for
downstream tasks. Using Stein's method, we give theoretical guarantees for a
broad class of random graph models. We provide empirical results on both
synthetic input graphs with known graph generation procedures, and real-world
input graphs that the state-of-the-art (deep) generative models for graphs are
trained on.
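The operator-to-discrepancy idea in the abstract can be caricatured in a few lines. The sketch below is an illustrative assumption, not the authors' implementation: it replaces AgraSSt's kernelised function class with a single fixed test function (edge density) and estimates the generator's conditional edge probability by a crude marginal edge density from generated samples; `er_graph`, `estimate_q`, and `stein_statistic` are hypothetical helper names.

```python
import numpy as np

rng = np.random.default_rng(0)

def er_graph(n, p, rng):
    """Symmetric 0/1 adjacency matrix of an Erdos-Renyi graph, no self-loops."""
    a = np.triu((rng.random((n, n)) < p).astype(int), 1)
    return a + a.T

def edge_density(a):
    """Fraction of present edges among all vertex pairs."""
    return a[np.triu_indices_from(a, 1)].mean()

def estimate_q(samples):
    """Crude stand-in for the estimated conditional edge probability:
    the average edge density of graphs drawn from the generator."""
    return float(np.mean([edge_density(a) for a in samples]))

def stein_statistic(x, q_hat, f=edge_density):
    """Glauber-type approximate Stein operator applied to f, averaged over
    all vertex pairs s:
        A f(x) = q_hat * f(x^{s,1}) + (1 - q_hat) * f(x^{s,0}) - f(x),
    where x^{s,1} / x^{s,0} set edge s to present / absent."""
    n = x.shape[0]
    vals = []
    for i in range(n):
        for j in range(i + 1, n):
            x1 = x.copy(); x1[i, j] = x1[j, i] = 1
            x0 = x.copy(); x0[i, j] = x0[j, i] = 0
            vals.append(q_hat * f(x1) + (1.0 - q_hat) * f(x0) - f(x))
    return float(np.mean(vals))

# Toy check: criticise an Erdos-Renyi "generator" against its own samples.
n, p = 20, 0.3
samples = [er_graph(n, p, rng) for _ in range(50)]
q_hat = estimate_q(samples)
x = er_graph(n, p, rng)                 # the "input graph"
stat = stein_statistic(x, q_hat)        # near zero: generator matches the input
stat_off = stein_statistic(x, 0.8)      # deliberately wrong conditional probability
```

Under this reduction the statistic equals (q_hat - density(x)) / M with M = n(n-1)/2 vertex pairs, so it is close to zero when the estimated generator reproduces the input graph's edge density and grows as the estimated conditional probability departs from it; AgraSSt replaces the single test function with a kernelised family to obtain a proper discrepancy with theoretical guarantees.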
Related papers
- Knowledge Probing for Graph Representation Learning [12.960185655357495]
We propose a novel graph probing framework (GraphProbe) to investigate and interpret whether different families of graph learning methods encode different levels of knowledge in graph representation learning.
Based on the intrinsic properties of graphs, we design three probes to systematically investigate the graph representation learning process from different perspectives.
We construct a thorough evaluation benchmark with nine representative graph learning methods from random walk based approaches, basic graph neural networks and self-supervised graph methods, and probe them on six benchmark datasets for node classification, link prediction and graph classification.
arXiv Detail & Related papers (2024-08-07T16:27:45Z) - SteinGen: Generating Fidelitous and Diverse Graph Samples [11.582357781579997]
We introduce SteinGen to generate graphs from only one observed graph.
We show that SteinGen yields high distributional similarity (high fidelity) to the original data, combined with high sample diversity.
arXiv Detail & Related papers (2024-03-27T13:59:05Z) - Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z) - GraphRCG: Self-Conditioned Graph Generation [78.69810678803248]
We propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions.
Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
arXiv Detail & Related papers (2024-03-02T02:28:20Z) - Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z) - A Multi-scale Graph Signature for Persistence Diagrams based on Return
Probabilities of Random Walks [1.745838188269503]
We explore the use of a family of multi-scale graph signatures to enhance the robustness of topological features.
We propose a deep learning architecture to handle this set input.
Experiments on benchmark graph classification datasets demonstrate that our proposed architecture outperforms other persistent homology-based methods.
arXiv Detail & Related papers (2022-09-28T17:30:27Z) - Synthetic Graph Generation to Benchmark Graph Learning [7.914804101579097]
Graph learning algorithms have attained state-of-the-art performance on many graph analysis tasks.
One reason may be the very small number of datasets used in practice to benchmark the performance of graph learning algorithms.
We propose to generate synthetic graphs, and study the behaviour of graph learning algorithms in a controlled scenario.
arXiv Detail & Related papers (2022-04-04T10:48:32Z) - Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
A graphon is a nonparametric model that generates graphs of arbitrary size and can easily be induced from observed graphs.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
arXiv Detail & Related papers (2021-05-29T08:11:40Z) - Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data, exploiting their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
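Dyadic independence, as in the last entry above, means each potential edge becomes an independent Bernoulli draw once the block structure is fixed. A minimal sketch of such a distribution, assuming a hypothetical two-block example (the memberships `z` and probability matrix `B` are illustrative, not fitted values from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_dyadic_independent(z, B, rng):
    """Sample an undirected graph in which edge (i, j) is an independent
    Bernoulli(B[z[i], z[j]]) draw -- the edge-independent form that a
    block-style approximation reduces an ERG to."""
    P = B[np.ix_(z, z)]                            # per-pair edge probabilities
    a = np.triu((rng.random(P.shape) < P).astype(int), 1)
    return a + a.T

# Illustrative two-block example: dense within blocks, sparse between.
z = np.array([0] * 10 + [1] * 10)                  # block memberships
B = np.array([[0.60, 0.05],
              [0.05, 0.60]])                       # block-level edge probabilities
a = sample_dyadic_independent(z, B, rng)
```

Because edges are independent given the blocks, sampling here costs O(n^2) and can instead be done edge-by-edge for sparse graphs, which is the property that lets such approximations scale to graphs with millions of nodes.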
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and accepts no responsibility for any consequences of its use.