LGDC: Latent Graph Diffusion via Spectrum-Preserving Coarsening
- URL: http://arxiv.org/abs/2512.01190v1
- Date: Mon, 01 Dec 2025 02:10:24 GMT
- Title: LGDC: Latent Graph Diffusion via Spectrum-Preserving Coarsening
- Authors: Nagham Osman, Keyue Jiang, Davide Buffelli, Xiaowen Dong, Laura Toni
- Abstract summary: Graph generation is a critical task across scientific domains. Existing methods fall broadly into two categories: autoregressive models, which iteratively expand graphs, and one-shot models, such as diffusion, which generate the full graph at once. We propose LGDC (latent graph diffusion via spectrum-preserving coarsening), a hybrid framework that combines the strengths of both approaches.
- Score: 11.920007650086662
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph generation is a critical task across scientific domains. Existing methods fall broadly into two categories: autoregressive models, which iteratively expand graphs, and one-shot models, such as diffusion, which generate the full graph at once. In this work, we provide an analysis of these two paradigms and reveal a key trade-off: autoregressive models stand out in capturing fine-grained local structures, such as degree and clustering properties, whereas one-shot models excel at modeling global patterns, such as spectral distributions. Building on this, we propose LGDC (latent graph diffusion via spectrum-preserving coarsening), a hybrid framework that combines the strengths of both approaches. LGDC employs a spectrum-preserving coarsening-decoarsening scheme to map bidirectionally between graphs and a latent space, where diffusion efficiently generates latent graphs before expansion restores detail. This design captures both local and global properties with improved efficiency. Empirically, LGDC matches autoregressive models on locally structured datasets (Tree) and diffusion models on globally structured ones (Planar, Community-20), validating the benefits of hybrid generation.
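The abstract does not specify LGDC's exact coarsening operator, but the idea of spectrum-preserving coarsening can be sketched in the standard Loukas-style form: nodes are grouped into clusters, a normalized partition indicator matrix P lifts between the two scales, and the coarse Laplacian L_c = P^T L P approximately preserves the low end of the spectrum (diffusion would then run in this smaller space). A minimal numpy illustration, with the toy graph and function names being hypothetical:

```python
import numpy as np

def coarsen_laplacian(A, partition):
    """Illustrative spectrum-preserving coarsening of a graph.

    A         : (n, n) symmetric adjacency matrix
    partition : length-n integer array mapping each node to a cluster id
    Returns the coarse Laplacian L_c = P^T L P, where P is the
    column-normalized partition indicator matrix.
    """
    n = A.shape[0]
    k = int(partition.max()) + 1
    P = np.zeros((n, k))
    P[np.arange(n), partition] = 1.0
    P /= np.sqrt(P.sum(axis=0))            # orthonormal columns
    L = np.diag(A.sum(axis=1)) - A         # combinatorial Laplacian
    return P.T @ L @ P

# Toy graph: two triangles joined by a single bridge edge (6 nodes).
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

# Coarsen each triangle into one super-node.
part = np.array([0, 0, 0, 1, 1, 1])
L_c = coarsen_laplacian(A, part)

# The zero eigenvalue (connectedness) survives coarsening exactly.
evals_full = np.linalg.eigvalsh(np.diag(A.sum(axis=1)) - A)
evals_coarse = np.linalg.eigvalsh(L_c)
print(evals_coarse)   # smallest eigenvalue is ~0, as in the full graph
```

In a latent-diffusion pipeline of the kind the abstract describes, a generative model would be trained over such coarse graphs, and a learned decoarsening (expansion) step would restore node-level detail.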
Related papers
- Do Graph Diffusion Models Accurately Capture and Generate Substructure Distributions? [28.19526635775658]
Diffusion models do not possess universal expressivity to accurately model the distribution scores of complex graph data. Our work addresses this limitation by focusing on the frequency of specific substructures as a key characteristic of target graph distributions. We establish a theoretical connection between the expressivity of Graph Neural Networks (GNNs) and the overall performance of graph diffusion models.
arXiv Detail & Related papers (2025-02-04T17:04:16Z) - SeaDAG: Semi-autoregressive Diffusion for Conditional Directed Acyclic Graph Generation [83.52157311471693]
We introduce SeaDAG, a semi-autoregressive diffusion model for the conditional generation of Directed Acyclic Graphs (DAGs).
Unlike conventional autoregressive generation that lacks a global graph structure view, our method maintains a complete graph structure at each diffusion step.
We explicitly train the model to learn graph conditioning with a condition loss, which enhances the diffusion model's capacity to generate realistic DAGs.
arXiv Detail & Related papers (2024-10-21T15:47:03Z) - IFH: a Diffusion Framework for Flexible Design of Graph Generative Models [53.219279193440734]
Graph generative models can be classified into two prominent families: one-shot models, which generate a graph in one go, and sequential models, which generate a graph by successive additions of nodes and edges.
This paper proposes a graph generative model, called Insert-Fill-Halt (IFH), that supports the specification of a sequentiality degree.
arXiv Detail & Related papers (2024-08-23T16:24:40Z) - Advancing Graph Generation through Beta Diffusion [49.49740940068255]
Graph Beta Diffusion (GBD) is a generative model specifically designed to handle the diverse nature of graph data.
We propose a modulation technique that enhances the realism of generated graphs by stabilizing critical graph topology.
arXiv Detail & Related papers (2024-06-13T17:42:57Z) - GraphRCG: Self-Conditioned Graph Generation [78.69810678803248]
We propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions.
Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
arXiv Detail & Related papers (2024-03-02T02:28:20Z) - Hyperbolic Graph Diffusion Model [24.049660417511074]
We propose a novel graph generation method called the Hyperbolic Graph Diffusion Model (HGDM).
HGDM consists of an auto-encoder that encodes nodes into successive hyperbolic embeddings, and a diffusion model (DM) that operates in the hyperbolic latent space.
Experiments show that HGDM achieves better performance in generic graph and molecule generation benchmarks, with a 48% improvement in the quality of graph generation with highly hierarchical structures.
arXiv Detail & Related papers (2023-06-13T08:22:18Z) - Generative Diffusion Models on Graphs: Methods and Applications [50.44334458963234]
Diffusion models, as a novel generative paradigm, have achieved remarkable success in various image generation tasks.
Graph generation is a crucial computational task on graphs with numerous real-world applications.
arXiv Detail & Related papers (2023-02-06T06:58:17Z) - Fast Graph Generative Model via Spectral Diffusion [38.31052833073743]
We argue that running full-rank diffusion SDEs on the whole space hinders diffusion models from learning graph topology generation.
We propose an efficient yet effective Graph Spectral Diffusion Model (GSDM), which is driven by low-rank diffusion SDEs on the graph spectrum space.
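The GSDM summary above hinges on representing a graph by a low-rank portion of its spectrum rather than the full adjacency matrix. A small numpy sketch of that representation (the graph, rank k, and variable names are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 20, 4

# Random symmetric adjacency matrix for a toy undirected graph.
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Eigendecomposition; keep the k largest-magnitude eigenpairs.
# This truncated spectrum is the kind of low-dimensional space a
# spectral diffusion model would operate in.
evals, evecs = np.linalg.eigh(A)
idx = np.argsort(np.abs(evals))[-k:]
A_lowrank = evecs[:, idx] @ np.diag(evals[idx]) @ evecs[:, idx].T

# Reconstruction error shows how much structure the low-rank
# spectral representation retains.
err = np.linalg.norm(A - A_lowrank) / np.linalg.norm(A)
print(f"relative reconstruction error with k={k}: {err:.2f}")
```

Diffusing over k eigenvalue/eigenvector coordinates instead of n^2 adjacency entries is what makes such spectral approaches efficient for larger graphs.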
arXiv Detail & Related papers (2022-11-16T12:56:32Z) - Generating the Graph Gestalt: Kernel-Regularized Graph Representation Learning [47.506013386710954]
A complete scientific understanding of graph data should address both global and local structure.
We propose a joint model for both as complementary objectives in a graph VAE framework.
Our experiments demonstrate a significant improvement in the realism of the generated graph structures, typically by 1-2 orders of magnitude on graph structure metrics.
arXiv Detail & Related papers (2021-06-29T10:48:28Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.