A Tunable Model for Graph Generation Using LSTM and Conditional VAE
- URL: http://arxiv.org/abs/2104.09304v1
- Date: Thu, 15 Apr 2021 06:47:14 GMT
- Title: A Tunable Model for Graph Generation Using LSTM and Conditional VAE
- Authors: Shohei Nakazawa, Yoshiki Sato, Kenji Nakagawa, Sho Tsugawa, Kohei Watabe
- Abstract summary: We propose a generative model that can tune specific features while learning structural features of a graph from data.
Using a dataset of graphs with various features generated by a stochastic model, we confirm that our model can generate graphs with specified features.
- Score: 1.399948157377307
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the development of graph applications, generative models for graphs
have become increasingly important. Classically, stochastic models that generate
graphs with pre-defined edge and node probabilities have been studied. More
recently, machine-learning models that reproduce the structural features of
graphs by learning from actual graph data have been proposed. However, while
these conventional learning-based models can learn structural features from
data, they cannot tune those features to generate graphs with specified
characteristics. In this paper, we propose a generative model that can tune
specific features while learning the structural features of a graph from data.
Using a dataset of graphs with various features generated by a stochastic
model, we confirm that our model can generate graphs with specified features.
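The core idea in the abstract — learning structural features from data while exposing a tunable feature value as a generation-time condition — can be sketched with a toy conditional decoder. The code below is a minimal illustration, not the authors' implementation: it uses plain NumPy, stands in a single random linear map for the trained LSTM decoder of the conditional VAE, and all names, dimensions, and the thresholding rule are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8   # size of the latent code z
COND_DIM = 1     # one tunable structural feature, e.g. a target average degree
N_NODES = 6      # nodes in each generated graph

# A toy "trained" decoder: maps the concatenation [z; c] to edge logits of the
# upper-triangular adjacency.  In the paper this role is played by an LSTM
# decoder inside a conditional VAE; here it is a random linear map.
N_EDGES = N_NODES * (N_NODES - 1) // 2
W = rng.normal(scale=0.5, size=(N_EDGES, LATENT_DIM + COND_DIM))

def decode(z, cond):
    """Decode latent code z under condition cond into an adjacency matrix."""
    logits = W @ np.concatenate([z, cond])
    probs = 1.0 / (1.0 + np.exp(-logits))        # sigmoid edge probabilities
    edges = (probs > 0.5).astype(int)            # hard threshold for the toy
    adj = np.zeros((N_NODES, N_NODES), dtype=int)
    adj[np.triu_indices(N_NODES, k=1)] = edges   # fill the upper triangle
    return adj + adj.T                           # symmetrize (undirected)

# The same latent code decoded under two settings of the tunable feature
# generally yields different graphs: the condition shifts the edge logits.
z = rng.normal(size=LATENT_DIM)
graph_lo = decode(z, np.array([-2.0]))
graph_hi = decode(z, np.array([+2.0]))
```

The point of the sketch is only the conditioning mechanism: the feature value enters the decoder alongside the latent code, so varying it at generation time steers the output without retraining.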
Related papers
- Parametric Graph Representations in the Era of Foundation Models: A Survey and Position [69.48708136448694] (arXiv, 2024-10-16)
  Graphs have been widely used in the past decades of big data and AI to model comprehensive relational data.
  Identifying meaningful graph laws can significantly enhance the effectiveness of various applications.
- Advancing Graph Generation through Beta Diffusion [49.49740940068255] (arXiv, 2024-06-13)
  Graph Beta Diffusion (GBD) is a generative model specifically designed to handle the diverse nature of graph data.
  The authors propose a modulation technique that enhances the realism of generated graphs by stabilizing critical graph topology.
- GraphMaker: Can Diffusion Models Generate Large Attributed Graphs? [7.330479039715941] (arXiv, 2023-10-20)
  Large-scale graphs with node attributes are increasingly common in various real-world applications.
  Traditional graph generation methods are limited in their capacity to handle these complex structures.
  This paper introduces a novel diffusion model, GraphMaker, specifically designed for generating large attributed graphs.
- An Accurate Graph Generative Model with Tunable Features [0.8192907805418583] (arXiv, 2023-09-03)
  The authors propose a method to improve the accuracy of GraphTune by adding a new mechanism that feeds back errors in graph features.
  Experiments on a real-world graph dataset showed that the features of the generated graphs are tuned more accurately than with conventional models.
- Graph Generation with Diffusion Mixture [57.78958552860948] (arXiv, 2023-02-07)
  Generating graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
  The authors propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742] (arXiv, 2022-12-01)
  This work considers the problem of modelling high-dimensional distributions and generating new examples of data with a complex relational feature structure coherent with a graph skeleton.
  The proposed model generates data features constrained by the specific graph structure of each data point by splitting the task into two phases:
  in the first phase it models the distribution of features associated with the nodes of the given graph; in the second it adds the edge features conditionally on the node features.
- Latent Graph Inference using Product Manifolds [0.0] (arXiv, 2022-11-26)
  The authors generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
  The novel approach is tested on a wide range of datasets and outperforms the original dDGM model.
- GraphTune: A Learning-based Graph Generative Model with Tunable Structural Features [3.3248768737711045] (arXiv, 2022-01-27)
  The authors propose a generative model that allows the value of a global-level structural feature to be tuned as a condition.
  Their model, called GraphTune, makes it possible to tune the value of any structural feature of generated graphs.
- GraphOpt: Learning Optimization Models of Graph Formation [72.75384705298303] (arXiv, 2020-07-07)
  This paper proposes an end-to-end framework that learns an implicit model of graph structure formation and discovers an underlying optimization mechanism.
  The learned objective can serve as an explanation for the observed graph properties, thereby lending itself to transfer across different graphs within a domain.
  GraphOpt poses link formation in graphs as a sequential decision-making process and solves it using a maximum entropy inverse reinforcement learning algorithm.
- Non-Parametric Graph Learning for Bayesian Graph Neural Networks [35.88239188555398] (arXiv, 2020-06-23)
  The authors propose a novel non-parametric graph model for constructing the posterior distribution of graph adjacency matrices.
  They demonstrate the advantages of this model in three different problem settings: node classification, link prediction, and recommendation.
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606] (arXiv, 2020-03-02)
  This work proposes a permutation-invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
  In particular, the authors design a permutation-equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
  For graph generation, this learning approach achieves results better than or comparable to existing models on benchmark datasets.
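The two-phase procedure described in the GrannGAN entry — annotate the nodes of a fixed graph skeleton first, then sample each edge's features conditioned on the features of its endpoints — can be illustrated with a toy sampler. This is a hedged sketch, not GrannGAN's adversarial model: both phases here are simple Gaussians, and every name and distribution is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_annotations(adj, node_dim=3, edge_dim=2):
    """Toy two-phase annotation of a fixed graph skeleton.

    Phase 1 samples a feature vector per node; phase 2 samples each edge's
    features conditionally on its two endpoints.  In GrannGAN both phases
    are learned adversarially; here they are hand-written Gaussians.
    """
    n = adj.shape[0]
    node_feats = rng.normal(size=(n, node_dim))              # phase 1
    edge_feats = {}
    for i, j in zip(*np.triu_indices(n, k=1)):
        if adj[i, j]:
            # condition the edge's mean on its endpoints (toy conditioning)
            mean = node_feats[i, :edge_dim] + node_feats[j, :edge_dim]
            edge_feats[(int(i), int(j))] = rng.normal(loc=mean)
    return node_feats, edge_feats

# A small skeleton: a triangle on nodes 0-2 with a pendant node 3.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]])
nodes, edges = sample_annotations(adj)
```

Splitting generation this way keeps the skeleton fixed while the annotation distributions carry all the learned structure, which is the design choice the GrannGAN summary describes.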
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.