An Accurate Graph Generative Model with Tunable Features
- URL: http://arxiv.org/abs/2309.01158v1
- Date: Sun, 3 Sep 2023 12:34:15 GMT
- Title: An Accurate Graph Generative Model with Tunable Features
- Authors: Takahiro Yokoyama, Yoshiki Sato, Sho Tsugawa, Kohei Watabe
- Abstract summary: We propose a method to improve the accuracy of GraphTune by adding a new mechanism to feed back errors of graph features.
Experiments on a real-world graph dataset show that the features of the generated graphs are tuned more accurately than with conventional models.
- Score: 0.8192907805418583
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A graph is a very common and powerful data structure used for modeling
communication and social networks. Models that generate graphs with arbitrary
features are important basic technologies for repeated simulations of networks
and for predicting topology changes. Although existing generative models for
graphs are useful for providing graphs similar to real-world graphs, graph
generation models with tunable features have been less explored in the field.
Previously, we proposed GraphTune, a generative model for graphs that
continuously tunes specific graph features of generated graphs while
maintaining most of the features of a given graph dataset. However, the
tuning accuracy of
graph features in GraphTune has not been sufficient for practical applications.
In this paper, we propose a method to improve the accuracy of GraphTune by
adding a new mechanism that feeds back errors in the graph features of
generated graphs, and by training the model and the feedback mechanism
alternately and independently. Experiments on a real-world graph dataset
showed that the features of the generated graphs are tuned more accurately
than with conventional models.
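The abstract describes the feedback mechanism only at a high level. The following is a minimal sketch, under our own simplifying assumptions, of what alternating, independent training with feature-error feedback can look like: a toy generator emits a soft dense adjacency matrix conditioned on a target feature value, a separate estimator learns to predict that feature, and the estimator's error on generated graphs is fed back as the generator's loss. The dense adjacency representation, the module sizes, and the choice of average degree as the tuned feature are illustrative assumptions; the actual GraphTune model is LSTM- and CVAE-based and is not reproduced here.

```python
# Hedged sketch only: not the authors' code or architecture.
import torch
import torch.nn as nn

N = 16          # nodes per toy graph (illustrative)
LATENT = 8      # latent dimension (illustrative)

class Generator(nn.Module):
    """Maps (latent z, target feature value c) to a soft adjacency matrix."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT + 1, 128), nn.ReLU(),
            nn.Linear(128, N * N), nn.Sigmoid())

    def forward(self, z, c):
        a = self.net(torch.cat([z, c], dim=-1)).view(-1, N, N)
        a = (a + a.transpose(1, 2)) / 2              # symmetrise
        return a * (1.0 - torch.eye(N))              # drop self-loops

class FeatureEstimator(nn.Module):
    """Predicts the tuned scalar feature from a (soft) adjacency matrix."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N * N, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, adj):
        return self.net(adj.flatten(1))

def average_degree(adj):
    """Ground truth for the feature the estimator learns to predict."""
    return adj.sum(dim=(1, 2)).unsqueeze(-1) / N

gen, est = Generator(), FeatureEstimator()
opt_gen = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_est = torch.optim.Adam(est.parameters(), lr=1e-3)

for step in range(200):
    z = torch.randn(32, LATENT)
    target = torch.rand(32, 1) * N                   # desired average degree

    if step % 2 == 0:
        # Phase A: train only the estimator, on detached generated graphs.
        adj = gen(z, target).detach()
        loss = nn.functional.mse_loss(est(adj), average_degree(adj))
        opt_est.zero_grad(); loss.backward(); opt_est.step()
    else:
        # Phase B: train only the generator; the estimator's error against
        # the requested feature value is fed back as the generator's loss.
        adj = gen(z, target)
        loss = nn.functional.mse_loss(est(adj), target)
        opt_gen.zero_grad(); loss.backward(); opt_gen.step()
```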
Related papers
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z)
- GraphRCG: Self-Conditioned Graph Generation [78.69810678803248]
We propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions.
Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
arXiv Detail & Related papers (2024-03-02T02:28:20Z)
- GraphMaker: Can Diffusion Models Generate Large Attributed Graphs? [7.330479039715941]
Large-scale graphs with node attributes are increasingly common in various real-world applications.
Traditional graph generation methods are limited in their capacity to handle these complex structures.
This paper introduces a novel diffusion model, GraphMaker, specifically designed for generating large attributed graphs.
arXiv Detail & Related papers (2023-10-20T22:12:46Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- GraphTune: A Learning-based Graph Generative Model with Tunable Structural Features [3.3248768737711045]
We propose a generative model that allows us to tune the value of a global-level structural feature as a condition.
Our model, called GraphTune, makes it possible to tune the value of any structural feature of generated graphs.
arXiv Detail & Related papers (2022-01-27T13:14:53Z)
- Graph2Graph Learning with Conditional Autoregressive Models [8.203106789678397]
We present a conditional autoregressive model for graph-to-graph learning.
We illustrate its representational capabilities via experiments on challenging subgraph predictions from graph algorithmics.
arXiv Detail & Related papers (2021-06-06T20:28:07Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
A graphon is a nonparametric model that generates graphs of arbitrary size and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
- A Tunable Model for Graph Generation Using LSTM and Conditional VAE [1.399948157377307]
We propose a generative model that can tune specific features while learning the structural features of a graph from data.
Using a dataset of graphs with various features generated by a model, we confirm that our model can generate graphs with the specified features (a minimal conditional-VAE sketch of this kind of feature conditioning appears after this list).
arXiv Detail & Related papers (2021-04-15T06:47:14Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new variant of GNN, named Heatts, to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
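Both GraphTune and the LSTM/conditional-VAE model listed above supply a desired feature value as a condition to a generative model. As a rough illustration of that conditioning pattern (referenced in the corresponding entry above), the sketch below concatenates a scalar feature value to both the encoder input and the latent code of a small conditional VAE. The flattened dense adjacency representation, the layer sizes, and average degree as the conditioned feature are our own assumptions and do not reproduce the sequence-based LSTM architectures used in those papers.

```python
# Hedged sketch only: illustrates feature conditioning, not the cited models.
import torch
import torch.nn as nn

N, LATENT = 16, 8
IN = N * N       # graphs handled as flattened dense adjacency matrices

class ConditionalGraphVAE(nn.Module):
    """A scalar feature value c conditions both the encoder and the decoder."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(IN + 1, 128), nn.ReLU())
        self.mu = nn.Linear(128, LATENT)
        self.logvar = nn.Linear(128, LATENT)
        self.dec = nn.Sequential(
            nn.Linear(LATENT + 1, 128), nn.ReLU(),
            nn.Linear(128, IN), nn.Sigmoid())

    def forward(self, adj_flat, c):
        h = self.enc(torch.cat([adj_flat, c], dim=-1))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterise
        return self.dec(torch.cat([z, c], dim=-1)), mu, logvar

    def generate(self, c):
        """Sample graphs whose tuned feature is steered towards the value c."""
        z = torch.randn(c.shape[0], LATENT)
        return self.dec(torch.cat([z, c], dim=-1)).view(-1, N, N)

model = ConditionalGraphVAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One toy training step on random graphs labelled with their average degree.
adj = (torch.rand(32, N, N) > 0.8).float()
adj = torch.triu(adj, diagonal=1)
adj = adj + adj.transpose(1, 2)
c = adj.sum(dim=(1, 2)).unsqueeze(-1) / N             # condition = average degree
recon, mu, logvar = model(adj.flatten(1), c)
rec = nn.functional.binary_cross_entropy(recon, adj.flatten(1))
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
opt.zero_grad()
(rec + kl).backward()
opt.step()

# At generation time, the same condition slot is used to request a feature value.
sample = model.generate(torch.full((4, 1), 6.0))       # ask for average degree ~ 6
```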
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.