Bures-Wasserstein Flow Matching for Graph Generation
- URL: http://arxiv.org/abs/2506.14020v3
- Date: Fri, 10 Oct 2025 15:13:41 GMT
- Title: Bures-Wasserstein Flow Matching for Graph Generation
- Authors: Keyue Jiang, Jiahao Cui, Xiaowen Dong, Laura Toni
- Abstract summary: We present a theoretically grounded framework for probability path construction in graph generative models. We then leverage the optimal transport displacement between MRF objects to design a smooth probability path. BWFlow is a flow-matching framework for graph generation that utilizes the derived optimal probability path.
- Score: 10.21877898746333
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Graph generation has emerged as a critical task in fields ranging from drug discovery to circuit design. Contemporary approaches, notably diffusion and flow-based models, have achieved strong graph generative performance by constructing a probability path that interpolates between reference and data distributions. However, these methods typically model the evolution of individual nodes and edges independently and use linear interpolations to build the path. This disentangled interpolation breaks the interconnected patterns of graphs, making the constructed probability path irregular and non-smooth, which causes poor training dynamics and faulty sampling convergence. To address this limitation, this paper first presents a theoretically grounded framework for probability path construction in graph generative models. Specifically, we model the joint evolution of the nodes and edges by representing graphs as connected systems parameterized by Markov random fields (MRFs). We then leverage the optimal transport displacement between MRF objects to design a smooth probability path that ensures the co-evolution of graph components. Based on this, we introduce BWFlow, a flow-matching framework for graph generation that utilizes the derived optimal probability path to inform the design of the training and sampling algorithms. Experimental evaluations on plain graph generation and molecule generation validate the effectiveness of BWFlow with competitive performance, better training convergence, and efficient sampling.
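The optimal-transport displacement underlying this construction has a closed form for Gaussian distributions, which is the building block of the Bures-Wasserstein geometry the paper's title refers to. The sketch below is a minimal NumPy/SciPy illustration of the displacement (McCann) interpolation between two Gaussians, not the paper's exact Gaussian-MRF parameterization; `bw_interpolate` is a hypothetical helper name.

```python
import numpy as np
from scipy.linalg import sqrtm

def bw_interpolate(m0, S0, m1, S1, t):
    """Displacement (McCann) interpolation between N(m0, S0) and N(m1, S1)
    along the Bures-Wasserstein geodesic; returns the mean and covariance
    of the intermediate Gaussian at time t in [0, 1]."""
    S0_half = np.real(sqrtm(S0))
    S0_half_inv = np.linalg.inv(S0_half)
    # Optimal transport map T pushing N(m0, S0) forward to N(m1, S1):
    # T = S0^{-1/2} (S0^{1/2} S1 S0^{1/2})^{1/2} S0^{-1/2}
    T = S0_half_inv @ np.real(sqrtm(S0_half @ S1 @ S0_half)) @ S0_half_inv
    A = (1 - t) * np.eye(len(m0)) + t * T
    mt = (1 - t) * m0 + t * m1   # means interpolate linearly
    St = A @ S0 @ A.T            # covariances follow the BW geodesic
    return mt, St
```

At t = 0 the map A is the identity and the pair (m0, S0) is recovered; at t = 1 one can verify T S0 T = S1, so the path lands exactly on the target Gaussian, which is the smoothness property a naive per-entry linear interpolation of covariances does not have.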
Related papers
- GeodesicNVS: Probability Density Geodesic Flow Matching for Novel View Synthesis [54.39598154430305]
We propose a Data-to-Data Flow Matching framework that learns deterministic transformations directly between paired views. PDG-FM constrains flow trajectories using geodesic interpolants derived from probability density metrics of pretrained diffusion models. These results highlight the advantages of incorporating data-dependent geometric regularization into deterministic flow matching for consistent novel view generation.
arXiv Detail & Related papers (2026-03-01T09:30:11Z) - Flowette: Flow Matching with Graphette Priors for Graph Generation [8.684988468368454]
Flowette is a continuous flow matching framework that employs a graph neural network based transformer to learn a velocity field defined over graph representations with node and edge attributes. To incorporate domain-driven structural priors, we introduce graphettes, a new probabilistic family of graph structure models that generalize graphons via controlled structural edits for motifs like rings, stars, and trees. Flowette demonstrates consistent improvements, highlighting the effectiveness of combining structural priors with flow-based training for modeling complex graph distributions.
arXiv Detail & Related papers (2026-02-27T00:22:21Z) - Adaptive Edge Learning for Density-Aware Graph Generation [0.0]
We propose a density-aware conditional graph generation framework using Wasserstein GANs (WGANs). A differentiable edge predictor determines pairwise relationships directly from node embeddings. A density-aware selection mechanism adaptively controls edge density to match class-specific sparsity distributions.
arXiv Detail & Related papers (2026-01-30T15:01:50Z) - Graph Signal Generative Diffusion Models [74.75869068073577]
We introduce U-shaped encoder-decoder graph neural networks (U-GNNs) for graph signal generation using denoising diffusion processes. The architecture learns node features at different resolutions, with skip connections between the encoder and decoder paths. We demonstrate the effectiveness of the diffusion model in probabilistic forecasting of stock prices.
arXiv Detail & Related papers (2025-09-21T21:57:27Z) - A Deep Generative Model for the Simulation of Discrete Karst Networks [0.0]
We use graph generative models to represent karst networks as graphs. Nodes retain spatial information and properties, while edges signify connections between nodes. We test our approach using real-world karst networks and compare generated subgraphs with actual subgraphs from the database.
arXiv Detail & Related papers (2025-06-11T15:10:41Z) - Improving Molecular Graph Generation with Flow Matching and Optimal Transport [8.2504828891983]
GGFlow is a discrete flow matching generative model incorporating optimal transport for molecular graphs.
It incorporates an edge-augmented graph transformer to enable direct communication among chemical bonds.
GGFlow demonstrates superior performance on both unconditional and conditional molecule generation tasks.
arXiv Detail & Related papers (2024-11-08T16:27:27Z) - Wasserstein Flow Matching: Generative modeling over families of distributions [13.620905707751747]
We propose Wasserstein flow matching (WFM), which lifts flow matching onto families of distributions using the Wasserstein geometry. Notably, WFM is the first algorithm capable of generating distributions in high dimensions, whether represented analytically (as Gaussians) or empirically (as point clouds).
arXiv Detail & Related papers (2024-11-01T15:55:07Z) - Graph Transformer GANs with Graph Masked Modeling for Architectural Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z) - Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z) - Conditional Diffusion Based on Discrete Graph Structures for Molecular Graph Generation [32.66694406638287]
We propose a Conditional Diffusion model based on discrete Graph Structures (CDGS) for molecular graph generation.
Specifically, we construct a forward graph diffusion process on both graph structures and inherent features through stochastic differential equations (SDEs).
We present a specialized hybrid graph noise prediction model that extracts the global context and the local node-edge dependency from intermediate graph states.
arXiv Detail & Related papers (2023-01-01T15:24:15Z) - GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first it models the distribution of features associated with the nodes of the given graph, in the second it complements the edge features conditionally on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z) - Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z) - Flow Matching for Generative Modeling [44.66897082688762]
Flow Matching is a simulation-free approach for training Continuous Normalizing Flows (CNFs).
We find that employing FM with diffusion paths results in a more robust and stable alternative for training diffusion models.
Training CNFs using Flow Matching on ImageNet leads to state-of-the-art performance in terms of both likelihood and sample quality.
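The simulation-free recipe described in this entry reduces to a simple regression: sample a point on a prescribed probability path and regress a network onto the path's velocity. A minimal sketch of the linear (independent-coupling) variant follows; `cfm_targets` and `cfm_loss` are illustrative names, and in practice the velocity prediction comes from a neural network v_theta(x_t, t).

```python
import numpy as np

def cfm_targets(x0, x1, t):
    """Linear probability path used in vanilla conditional flow matching:
    x_t = (1 - t) * x0 + t * x1, with velocity target u_t = x1 - x0."""
    xt = (1 - t) * x0 + t * x1
    ut = x1 - x0
    return xt, ut

def cfm_loss(v_pred, ut):
    """Flow-matching regression loss ||v_theta(x_t, t) - u_t||^2."""
    return np.mean((v_pred - ut) ** 2)
```

During training, x0 is drawn from the reference (noise) distribution, x1 from the data, and t uniformly from [0, 1]; at sampling time the learned velocity field is integrated from t = 0 to t = 1, e.g. with an Euler step x <- x + dt * v_theta(x, t).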
arXiv Detail & Related papers (2022-10-06T08:32:20Z) - Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z) - Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z) - Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximate framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.