Adaptive Edge Learning for Density-Aware Graph Generation
- URL: http://arxiv.org/abs/2601.23052v1
- Date: Fri, 30 Jan 2026 15:01:50 GMT
- Title: Adaptive Edge Learning for Density-Aware Graph Generation
- Authors: Seyedeh Ava Razi Razavi, James Sargant, Sheridan Houghten, Renata Dividino
- Abstract summary: We propose a density-aware conditional graph generation framework using Wasserstein GANs (WGAN). A differentiable edge predictor determines pairwise relationships directly from node embeddings. A density-aware selection mechanism adaptively controls edge density to match class-specific sparsity distributions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generating realistic graph-structured data is challenging due to discrete structures, variable sizes, and class-specific connectivity patterns that resist conventional generative modelling. While recent graph generation methods employ generative adversarial network (GAN) frameworks to handle permutation invariance and irregular topologies, they typically rely on random edge sampling with fixed probabilities, limiting their capacity to capture complex structural dependencies between nodes. We propose a density-aware conditional graph generation framework using Wasserstein GANs (WGAN) that replaces random sampling with a learnable distance-based edge predictor. Our approach embeds nodes into a latent space where proximity correlates with edge likelihood, enabling the generator to learn meaningful connectivity patterns. A differentiable edge predictor determines pairwise relationships directly from node embeddings, while a density-aware selection mechanism adaptively controls edge density to match class-specific sparsity distributions observed in real graphs. We train the model using a WGAN with gradient penalty, employing a GCN-based critic to ensure generated graphs exhibit realistic topology and align with target class distributions. Experiments on benchmark datasets demonstrate that our method produces graphs with superior structural coherence and class-consistent connectivity compared to existing baselines. The learned edge predictor captures complex relational patterns beyond simple heuristics, generating graphs whose density and topology closely match real structural distributions. Our results show improved training stability and controllable synthesis, making the framework effective for realistic graph generation and data augmentation. Source code is publicly available at https://github.com/ava-12/Density_Aware_WGAN.git.
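The paper's edge predictor and density-aware selection are learned, differentiable modules trained inside the WGAN-GP loop; as a rough intuition for the two ideas, here is a minimal non-learned sketch, with illustrative function names that are not from the paper: pairwise edge scores fall with embedding distance, and the top-scoring pairs are kept so the edge count matches a target density.

```python
import math
import random

def edge_scores(embeddings):
    """Score each node pair by sigmoid(-distance), so that nodes whose
    embeddings are close in the latent space get scores near 1."""
    n = len(embeddings)
    scores = {}
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(embeddings[i], embeddings[j])
            scores[(i, j)] = 1.0 / (1.0 + math.exp(d))  # sigmoid(-d)
    return scores

def density_aware_select(scores, n_nodes, target_density):
    """Keep the k highest-scoring pairs, with k chosen so the resulting
    edge density matches a (e.g. class-specific) target density."""
    max_edges = n_nodes * (n_nodes - 1) // 2
    k = round(target_density * max_edges)
    ranked = sorted(scores, key=scores.get, reverse=True)
    return set(ranked[:k])

random.seed(0)
emb = [[random.gauss(0, 1) for _ in range(4)] for _ in range(8)]
edges = density_aware_select(edge_scores(emb), 8, 0.25)  # 7 of 28 possible edges
```

In the actual framework the score function is parameterized and trained end-to-end against a GCN-based critic, and the hard top-k step is replaced by a differentiable selection; this sketch only shows the selection logic.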
Related papers
- Graph Signal Generative Diffusion Models [74.75869068073577]
We introduce U-shaped encoder-decoder graph neural networks (U-GNNs) for graph signal generation using denoising diffusion processes. The architecture learns node features at different resolutions with skip connections between the encoder and decoder paths. We demonstrate the effectiveness of the diffusion model in probabilistic forecasting of stock prices.
arXiv Detail & Related papers (2025-09-21T21:57:27Z) - Bures-Wasserstein Flow Matching for Graph Generation [10.21877898746333]
We present a theoretically grounded framework for probability path construction in graph generative models. We then leverage the optimal transport displacement between MRF objects to design a smooth probability path. BWFlow is a flow-matching framework for graph generation that utilizes the derived optimal probability path.
arXiv Detail & Related papers (2025-06-16T21:36:56Z) - Adaptive Homophily Clustering: Structure Homophily Graph Learning with Adaptive Filter for Hyperspectral Image [21.709368882043897]
Hyperspectral image (HSI) clustering has been a fundamental but challenging task with zero training labels. In this paper, a homophily structure graph learning with an adaptive filter clustering method (AHSGC) for HSI is proposed. Our AHSGC achieves high clustering accuracy, low computational complexity, and strong robustness.
arXiv Detail & Related papers (2025-01-03T01:54:16Z) - Edge Probability Graph Models Beyond Edge Independency: Concepts, Analyses, and Algorithms [40.154779118370875]
Desirable random graph models (RGMs) should reproduce common patterns in real-world graphs. A common class of RGMs (e.g., Erdos-Renyi and Kronecker) outputs edge probabilities. With edge independency, RGMs provably cannot produce high subgraph densities and high output variability simultaneously. We explore RGMs beyond edge independence that can better reproduce common patterns while maintaining high tractability and variability.
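The edge-independent RGM class that the summary above critiques can be sketched in a few lines: each node pair gets an independent coin flip against its entry in an edge-probability matrix, with Erdos-Renyi as the special case of a constant probability. Function names here are illustrative, not from the paper.

```python
import random

def sample_edge_independent(p, rng):
    """Sample a graph with one independent Bernoulli trial per node pair,
    given a symmetric edge-probability matrix p."""
    n = len(p)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p[i][j]}

# Erdos-Renyi: the same probability for every pair
n, prob = 6, 0.5
p = [[prob] * n for _ in range(n)]
g = sample_edge_independent(p, random.Random(42))
```

Because every pair is sampled independently, repeated draws cannot concentrate edges into dense subgraphs beyond what the marginal probabilities allow, which is the tension between subgraph density and variability the paper analyzes.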
arXiv Detail & Related papers (2024-05-26T23:48:30Z) - Sparse Training of Discrete Diffusion Models for Graph Generation [45.103518022696996]
We introduce SparseDiff, a novel diffusion model based on the observation that almost all large graphs are sparse.
By selecting a subset of edges, SparseDiff effectively leverages sparse graph representations both during the noising process and within the denoising network.
Our model demonstrates state-of-the-art performance across multiple metrics on both small and large datasets.
arXiv Detail & Related papers (2023-11-03T16:50:26Z) - Graph Out-of-Distribution Generalization with Controllable Data
Augmentation [51.17476258673232]
Graph Neural Network (GNN) has demonstrated extraordinary performance in classifying graph properties.
Due to the selection bias of training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z) - Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z) - Unveiling the Sampling Density in Non-Uniform Geometric Graphs [69.93864101024639]
We consider graphs as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
In a social network, communities can be modeled as densely sampled areas, and hubs as nodes with larger neighborhood radius.
We develop methods to estimate the unknown sampling density in a self-supervised fashion.
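The geometric-graph construction described above is simple to state in code: sample node positions from a metric space and connect every pair closer than the neighborhood radius. This is a generic sketch of the model, not the paper's estimation method.

```python
import math
import random

def geometric_graph(points, radius):
    """Connect every pair of sampled points whose distance is below
    the neighborhood radius."""
    n = len(points)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if math.dist(points[i], points[j]) < radius}

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(10)]
g = geometric_graph(pts, 0.3)
```

Denser sampling regions produce more edges per node, which is why the unknown sampling density is recoverable from the observed graph structure.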
arXiv Detail & Related papers (2022-10-15T08:01:08Z) - A Regularized Wasserstein Framework for Graph Kernels [32.558913310384476]
We propose a learning framework for graph kernels grounded on regularizing optimal transport.
This framework provides a novel optimal transport distance metric, namely Regularized Wasserstein (RW) discrepancy.
We have empirically validated our method using 12 datasets against 16 state-of-the-art baselines.
arXiv Detail & Related papers (2021-10-06T07:54:04Z) - Training Robust Graph Neural Networks with Topology Adaptive Edge Dropping [116.26579152942162]
Graph neural networks (GNNs) are processing architectures that exploit graph structural information to model representations from network data.
Despite their success, GNNs suffer from sub-optimal generalization performance given limited training data.
This paper proposes Topology Adaptive Edge Dropping to improve generalization performance and learn robust GNN models.
arXiv Detail & Related papers (2021-06-05T13:20:36Z) - A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes as enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z) - Learning non-Gaussian graphical models via Hessian scores and triangular transport [6.308539010172309]
We propose an algorithm for learning the Markov structure of continuous and non-Gaussian distributions.
Our algorithm SING estimates the density using a deterministic coupling, induced by a triangular transport map, and iteratively exploits sparse structure in the map to reveal sparsity in the graph.
arXiv Detail & Related papers (2021-01-08T16:42:42Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.