SteinGen: Generating Fidelitous and Diverse Graph Samples
- URL: http://arxiv.org/abs/2403.18578v2
- Date: Thu, 4 Apr 2024 22:38:02 GMT
- Title: SteinGen: Generating Fidelitous and Diverse Graph Samples
- Authors: Gesine Reinert, Wenkai Xu
- Abstract summary: We introduce SteinGen to generate graphs from only one observed graph.
We show that SteinGen yields high distributional similarity (high fidelity) to the original data, combined with high sample diversity.
- Score: 11.582357781579997
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generating graphs that preserve characteristic structures while promoting sample diversity can be challenging, especially when the number of graph observations is small. Here, we tackle the problem of graph generation from only one observed graph. The classical approach of graph generation from parametric models relies on the estimation of parameters, which can be inconsistent or expensive to compute due to intractable normalisation constants. Generative modelling based on machine learning techniques to generate high-quality graph samples avoids parameter estimation but usually requires abundant training samples. Our proposed generating procedure, SteinGen, which is phrased in the setting of graphs as realisations of exponential random graph models, combines ideas from Stein's method and MCMC by employing Markovian dynamics which are based on a Stein operator for the target model. SteinGen uses the Glauber dynamics associated with an estimated Stein operator to generate a sample, and re-estimates the Stein operator from the sample after every sampling step. We show that on a class of exponential random graph models this novel "estimation and re-estimation" generation strategy yields high distributional similarity (high fidelity) to the original data, combined with high sample diversity.
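The abstract outlines the core loop: run Glauber dynamics whose edge-update probabilities come from an estimated Stein operator, and re-estimate that operator from the current graph after every step. The Python sketch below illustrates only this estimate-and-re-estimate idea; it is not the authors' implementation. The conditioning statistic (number of common neighbours), the smoothed frequency estimator of the conditional edge probability, and the toy Erdős-Rényi "observed" graph are assumptions made for illustration.

```python
# Illustrative sketch of a Glauber-style "estimation and re-estimation" sampler
# in the spirit of SteinGen (NOT the authors' code). Assumptions: the edge
# probability is conditioned on the number of common neighbours and estimated
# by smoothed empirical frequencies from the current graph.
import numpy as np

rng = np.random.default_rng(0)

def common_neighbours(A, i, j):
    """Summary statistic used to condition the edge probability (assumption)."""
    return int(np.dot(A[i], A[j]))

def estimate_edge_probs(A):
    """Empirical P(edge present | statistic value), estimated from the current graph."""
    n = A.shape[0]
    counts, present = {}, {}
    for i in range(n):
        for j in range(i + 1, n):
            s = common_neighbours(A, i, j)
            counts[s] = counts.get(s, 0) + 1
            present[s] = present.get(s, 0) + int(A[i, j])
    # Laplace smoothing so unseen statistic values do not give probability 0 or 1
    return {s: (present[s] + 1) / (counts[s] + 2) for s in counts}

def steingen_like_sample(A_obs, n_steps=2000):
    """Start from the observed graph; one Glauber edge update per step,
    re-estimating the conditional edge probabilities after every step."""
    A = A_obs.copy()
    n = A.shape[0]
    for _ in range(n_steps):
        q = estimate_edge_probs(A)                 # (re-)estimation step
        i, j = rng.choice(n, size=2, replace=False)
        p = q.get(common_neighbours(A, i, j), 0.5)
        A[i, j] = A[j, i] = int(rng.random() < p)  # Glauber update of one potential edge
    return A

# Toy usage: an Erdős-Rényi graph stands in for the single observed graph.
A0 = (rng.random((30, 30)) < 0.15).astype(int)
A0 = np.triu(A0, 1); A0 = A0 + A0.T
sample = steingen_like_sample(A0)
```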
Related papers
- SaGess: Sampling Graph Denoising Diffusion Model for Scalable Graph Generation [7.66297856898883]
SaGess is able to generate large real-world networks by augmenting a diffusion model (DiGress) with a generalized divide-and-conquer framework.
SaGess outperforms most of the one-shot state-of-the-art graph generating methods by a significant factor.
arXiv Detail & Related papers (2023-06-29T10:02:39Z)
- Creating generalizable downstream graph models with random projections [22.690120515637854]
We investigate graph representation learning approaches that enable models to generalize across graphs.
We show that using random projections to estimate multiple powers of the transition matrix allows us to build a set of isomorphism-invariant features.
The resulting features can be used to recover enough information about the local neighborhood of a node to enable inference with relevance competitive to other approaches.
arXiv Detail & Related papers (2023-02-17T14:27:00Z)
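As a rough illustration of the idea above, the sketch below propagates a Gaussian random projection through the random-walk transition matrix P = D^{-1}A to obtain sketches of its powers. The pooling into sorted row norms is an illustrative stand-in, not the paper's feature construction.

```python
# Sketch: random projections of transition-matrix powers as graph features.
# The sorted row norms approximate the (label-independent) sorted row norms
# of P^k; the dimensions and pooling are illustrative assumptions.
import numpy as np

def transition_power_features(A, num_powers=3, dim=16, seed=0):
    """Sketch P, P^2, ..., P^K via a random projection, P = D^{-1} A."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    deg = A.sum(axis=1, keepdims=True)
    P = A / np.maximum(deg, 1)                        # row-stochastic transition matrix
    S = rng.standard_normal((n, dim)) / np.sqrt(dim)  # random projection
    sketches = []
    for _ in range(num_powers):
        S = P @ S   # one product per power (O(|E| * dim) with a sparse matrix)
        sketches.append(np.sort(np.linalg.norm(S, axis=1)))
    return np.concatenate(sketches)                   # graph-level feature vector

A = (np.random.default_rng(1).random((50, 50)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T
features = transition_power_features(A)
```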
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Node Copying: A Random Graph Model for Effective Graph Sampling [35.957719744856696]
We introduce the node copying model for constructing a distribution over graphs.
We show the usefulness of the copying model in three tasks.
We employ our proposed model to mitigate the effect of adversarial attacks on the graph topology.
arXiv Detail & Related papers (2022-08-04T04:04:49Z)
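A minimal sketch of a node-copying style sampler, assuming a simple variant in which each node independently replaces its neighbourhood with that of a uniformly chosen node. The copy probability, the uniform choice of donor and the symmetrisation are assumptions, not the model as specified in the paper.

```python
# Sketch of a node-copying style graph sampler (illustrative assumptions only).
import numpy as np

def node_copying_sample(A, copy_prob=0.5, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    B = A.copy()
    for i in range(n):
        if rng.random() < copy_prob:
            donor = rng.integers(n)   # node whose observed neighbourhood is copied
            B[i, :] = A[donor, :]
            B[i, i] = 0               # no self-loops
    return np.maximum(B, B.T)         # symmetrise for an undirected graph
```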
- AgraSSt: Approximate Graph Stein Statistics for Interpretable Assessment of Implicit Graph Generators [10.616967871198689]
We propose and analyse a novel statistical procedure, coined AgraSSt, to assess the quality of graph generators.
In particular, AgraSSt can be used to determine whether a learnt graph generating process is capable of generating graphs that resemble a given input graph.
We provide empirical results on both synthetic input graphs with known graph generation procedures, and real-world input graphs that the state-of-the-art (deep) generative models for graphs are trained on.
arXiv Detail & Related papers (2022-03-07T19:12:40Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
A graphon is a nonparametric model that generates graphs of arbitrary size and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
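The sketch below illustrates the two ingredients mentioned above, assuming a piecewise-constant basis on a grid: a linear factorisation that decodes latent weights into a graphon, and Bernoulli sampling of a graph from the decoded graphon. It is not the paper's autoencoder architecture.

```python
# Sketch: decode a low-rank (linear factorisation) graphon and sample a graph
# from it. The grid basis and the clipping to [0, 1] are illustrative choices.
import numpy as np

def decode_graphon(Z, F):
    """Z: (K,) latent weights; F: (K, M) basis functions on an M-point grid.
    Returns an M x M graphon matrix W(x, y) = sum_k Z_k F_k(x) F_k(y)."""
    W = F.T @ np.diag(Z) @ F
    return np.clip(W, 0.0, 1.0)

def sample_graph_from_graphon(W, n, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.integers(W.shape[0], size=n)   # latent positions on the grid
    probs = W[np.ix_(u, u)]
    A = (rng.random((n, n)) < probs).astype(int)
    A = np.triu(A, 1)
    return A + A.T                         # undirected, no self-loops

K, M = 4, 100
rng = np.random.default_rng(2)
F = np.abs(rng.standard_normal((K, M))) / np.sqrt(M)
Z = rng.random(K)
A = sample_graph_from_graphon(decode_graphon(Z, F), n=80)
```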
- Scaling up graph homomorphism for classification via sampling [1.662966122370634]
We study the use of graph homomorphism density features as a scalable alternative to homomorphism numbers.
We propose a high-performance implementation of a simple sampling algorithm which computes additive approximations of homomorphism densities.
arXiv Detail & Related papers (2021-04-08T20:25:37Z)
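A minimal sketch of the sampling idea: the homomorphism density t(H, G) is the probability that a uniformly random map from V(H) to V(G) sends every edge of H to an edge of G, so a Monte Carlo average of that indicator gives an additive approximation (with sample size governed by a Hoeffding bound). The triangle pattern and sample size below are arbitrary examples, not the paper's implementation.

```python
# Sketch: Monte Carlo additive approximation of the homomorphism density t(H, G).
import numpy as np

def hom_density(H_edges, num_H_nodes, A, num_samples=20000, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    hits = 0
    for _ in range(num_samples):
        phi = rng.integers(n, size=num_H_nodes)        # random (not necessarily injective) map
        if all(A[phi[u], phi[v]] for (u, v) in H_edges):
            hits += 1                                  # phi is a homomorphism
    return hits / num_samples

triangle = [(0, 1), (1, 2), (2, 0)]
A = (np.random.default_rng(3).random((200, 200)) < 0.05).astype(int)
A = np.triu(A, 1); A = A + A.T
density = hom_density(triangle, 3, A)
```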
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions which evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
arXiv Detail & Related papers (2020-03-02T03:06:14Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
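Once an ERG has been approximated by a dyadic-independence distribution, every potential edge is an independent Bernoulli variable, which is what makes sampling scalable. The sketch below assumes a block edge-probability matrix that has already been fitted (the paper's approximation step is not shown) and uses a dense toy sampler; a sparse sampler would be needed at the scale of millions of nodes.

```python
# Sketch: sampling from an edge-independent (dyadic-independence) approximation
# with block structure. The block probabilities are assumed to be given.
import numpy as np

def sample_block_independent(block_of_node, B, seed=0):
    """block_of_node: (n,) block label per node; B: (K, K) edge probabilities."""
    rng = np.random.default_rng(seed)
    probs = B[np.ix_(block_of_node, block_of_node)]   # n x n edge probabilities
    A = (rng.random(probs.shape) < probs).astype(int)
    A = np.triu(A, 1)
    return A + A.T                                    # undirected, no self-loops

blocks = np.repeat(np.arange(3), 100)                 # 300 nodes in 3 blocks
B = np.array([[0.10, 0.02, 0.01],
              [0.02, 0.08, 0.02],
              [0.01, 0.02, 0.12]])
A = sample_block_independent(blocks, B)
```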
This list is automatically generated from the titles and abstracts of the papers on this site.