LayerDAG: A Layerwise Autoregressive Diffusion Model for Directed Acyclic Graph Generation
- URL: http://arxiv.org/abs/2411.02322v1
- Date: Mon, 04 Nov 2024 17:47:15 GMT
- Title: LayerDAG: A Layerwise Autoregressive Diffusion Model for Directed Acyclic Graph Generation
- Authors: Mufei Li, Viraj Shitole, Eli Chien, Changhai Man, Zhaodong Wang, Srinivas Sridharan, Ying Zhang, Tushar Krishna, Pan Li
- Abstract summary: This paper introduces LayerDAG, an autoregressive diffusion model, to generate realistic directed acyclic graphs (DAGs).
By interpreting the partial order of nodes as a sequence of bipartite graphs, LayerDAG decouples the strong node dependencies into manageable units that can be processed sequentially.
Experiments on both synthetic and real-world flow graphs from various computing platforms show that LayerDAG generates valid DAGs with superior statistical properties and benchmarking performance.
- Score: 17.94316378710172
- License:
- Abstract: Directed acyclic graphs (DAGs) serve as crucial data representations in domains such as hardware synthesis and compiler/program optimization for computing systems. DAG generative models facilitate the creation of synthetic DAGs, which can be used for benchmarking computing systems while preserving intellectual property. However, generating realistic DAGs is challenging due to their inherent directional and logical dependencies. This paper introduces LayerDAG, an autoregressive diffusion model, to address these challenges. LayerDAG decouples the strong node dependencies into manageable units that can be processed sequentially. By interpreting the partial order of nodes as a sequence of bipartite graphs, LayerDAG leverages autoregressive generation to model directional dependencies and employs diffusion models to capture logical dependencies within each bipartite graph. Comparative analyses demonstrate that LayerDAG outperforms existing DAG generative models in both expressiveness and generalization, particularly for generating large-scale DAGs with up to 400 nodes, a critical scenario for system benchmarking. Extensive experiments on both synthetic and real-world flow graphs from various computing platforms show that LayerDAG generates valid DAGs with superior statistical properties and benchmarking performance. The synthetic DAGs generated by LayerDAG enhance the training of ML-based surrogate models, resulting in improved accuracy in predicting performance metrics of real-world DAGs across diverse computing platforms.
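The layerwise view described in the abstract can be illustrated with a short sketch (not the paper's implementation; function and variable names are illustrative): nodes are grouped into topological layers by their longest-path distance from the source nodes, and each new layer, together with the edges arriving from previously placed nodes, forms one bipartite graph in the generation sequence.

```python
from collections import defaultdict

def topological_layers(num_nodes, edges):
    """Group DAG nodes by longest-path depth from the source nodes."""
    preds = defaultdict(list)
    for u, v in edges:  # directed edge u -> v
        preds[v].append(u)

    depth = {}
    def node_depth(v):
        if v not in depth:
            depth[v] = 0 if not preds[v] else 1 + max(node_depth(u) for u in preds[v])
        return depth[v]

    layers = defaultdict(list)
    for v in range(num_nodes):
        layers[node_depth(v)].append(v)
    return [layers[d] for d in sorted(layers)]

def bipartite_sequence(num_nodes, edges):
    """Yield (previous_nodes, new_layer, incoming_edges) triples, one per new layer."""
    layers = topological_layers(num_nodes, edges)
    layer_of = {v: i for i, layer in enumerate(layers) for v in layer}
    for i in range(1, len(layers)):
        prev_nodes = [v for layer in layers[:i] for v in layer]
        incoming = [(u, v) for u, v in edges if layer_of[v] == i]
        yield prev_nodes, layers[i], incoming

if __name__ == "__main__":
    # Toy 5-node DAG: nodes 0 and 1 are sources, node 4 is the sink.
    edges = [(0, 2), (1, 2), (1, 3), (2, 4), (3, 4)]
    for prev_nodes, new_layer, incoming in bipartite_sequence(5, edges):
        print("new layer:", new_layer, "edges into it:", incoming)
```

In this framing, LayerDAG generates layers autoregressively, with a diffusion model producing the node attributes and the incoming bipartite edges of each new layer.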
Related papers
- SeaDAG: Semi-autoregressive Diffusion for Conditional Directed Acyclic Graph Generation [83.52157311471693]
We introduce SeaDAG, a semi-autoregressive diffusion model for conditional generation of Directed Acyclic Graphs (DAGs).
Unlike conventional autoregressive generation that lacks a global graph structure view, our method maintains a complete graph structure at each diffusion step.
We explicitly train the model to learn graph conditioning with a condition loss, which enhances the diffusion model's capacity to generate realistic DAGs.
arXiv Detail & Related papers (2024-10-21T15:47:03Z)
- Scalable Weibull Graph Attention Autoencoder for Modeling Document Networks [50.42343781348247]
We develop a graph Poisson factor analysis (GPFA) which provides analytic conditional posteriors to improve the inference accuracy.
We also extend GPFA to a multi-stochastic-layer version named graph Poisson gamma belief network (GPGBN) to capture the hierarchical document relationships at multiple semantic levels.
Our models can extract high-quality hierarchical latent document representations and achieve promising performance on various graph analytic tasks.
arXiv Detail & Related papers (2024-10-13T02:22:14Z)
- Language Models are Graph Learners [70.14063765424012]
Language Models (LMs) are challenging the dominance of domain-specific models, including Graph Neural Networks (GNNs) and Graph Transformers (GTs).
We propose a novel approach that empowers off-the-shelf LMs to achieve performance comparable to state-of-the-art GNNs on node classification tasks.
arXiv Detail & Related papers (2024-10-03T08:27:54Z)
- Convolutional Learning on Directed Acyclic Graphs [10.282099295800322]
We develop a novel convolutional architecture tailored for learning from data defined over directed acyclic graphs (DAGs).
We develop a novel convolutional graph neural network that integrates learnable DAG filters to account for the partial ordering induced by the graph topology.
arXiv Detail & Related papers (2024-05-05T21:30:18Z)
- Global Minima, Recoverability Thresholds, and Higher-Order Structure in GNNs [0.0]
We analyze the performance of graph neural network (GNN) architectures from the perspective of random graph theory.
We show how both specific higher-order structures in synthetic data and the mix of empirical structures in real data have dramatic effects on GNN performance.
arXiv Detail & Related papers (2023-10-11T17:16:33Z)
- Hyperbolic Graph Diffusion Model [24.049660417511074]
We propose a novel graph generation method called Hyperbolic Graph Diffusion Model (HGDM).
HGDM consists of an auto-encoder to encode nodes into successive hyperbolic embeddings, and a DM that operates in the hyperbolic latent space.
Experiments show that HGDM achieves better performance in generic graph and molecule generation benchmarks, with a 48% improvement in the quality of graph generation with highly hierarchical structures.
arXiv Detail & Related papers (2023-06-13T08:22:18Z)
- H-GCN: A Graph Convolutional Network Accelerator on Versal ACAP Architecture [13.149863422504332]
H-GCN partitions each graph into three subgraphs based on its inherent heterogeneity, and processes them using the programmable logic (PL) and AI engine (AIE), respectively.
Compared with state-of-the-art GNN accelerators, H-GCN achieves, on average, speedups of 1.1-2.3X.
arXiv Detail & Related papers (2022-06-28T03:37:31Z)
- BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery [97.79015388276483]
A structural equation model (SEM) is an effective framework to reason over causal relationships represented via a directed acyclic graph (DAG).
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
arXiv Detail & Related papers (2021-12-06T03:35:21Z)
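As context for the BCD Nets entry above, here is a minimal sketch of the data model such methods assume: a linear-Gaussian structural equation model whose structure is a DAG, where each variable is a linear function of its DAG parents plus Gaussian noise. The weighted adjacency matrix, noise scales, and function name below are toy choices for illustration, not anything estimated by BCD Nets.

```python
import numpy as np

def sample_linear_gaussian_sem(W, noise_std, n_samples, seed=0):
    """Sample X with x_j = sum_i W[i, j] * x_i + eps_j, eps_j ~ N(0, noise_std[j]^2).
    W is a weighted DAG adjacency matrix with variables already listed in a
    topological order, so W is strictly upper triangular."""
    rng = np.random.default_rng(seed)
    d = W.shape[0]
    X = np.zeros((n_samples, d))
    for j in range(d):  # parents of variable j are already filled in
        eps = rng.normal(0.0, noise_std[j], size=n_samples)
        X[:, j] = X @ W[:, j] + eps
    return X

if __name__ == "__main__":
    # Toy 3-variable chain x0 -> x1 -> x2 with unit edge weights.
    W = np.array([[0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [0.0, 0.0, 0.0]])
    X = sample_linear_gaussian_sem(W, noise_std=[1.0, 0.5, 0.5], n_samples=10000)
    print("empirical std of each variable:", X.std(axis=0))
```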
- Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation [56.73834525802723]
Lightweight Dynamic Graph Convolutional Networks (LDGCNs) are proposed.
LDGCNs capture richer non-local interactions by synthesizing higher order information from the input graphs.
We develop two novel parameter saving strategies based on the group graph convolutions and weight tied convolutions to reduce memory usage and model complexity.
arXiv Detail & Related papers (2020-10-09T06:03:46Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolutional networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
This list is automatically generated from the titles and abstracts of the papers on this site.