Prior-Informed Flow Matching for Graph Reconstruction
- URL: http://arxiv.org/abs/2601.22107v1
- Date: Thu, 29 Jan 2026 18:38:02 GMT
- Title: Prior-Informed Flow Matching for Graph Reconstruction
- Authors: Harvey Chen, Nicolas Zilberstein, Santiago Segarra
- Abstract summary: We introduce Prior-Informed Flow Matching (PIFM), a conditional flow model for graph reconstruction. PIFM bridges the gap by integrating embedding-based priors with continuous-time flow matching. Experiments on different datasets demonstrate that PIFM consistently enhances classical embeddings, outperforming them and state-of-the-art generative baselines in reconstruction accuracy.
- Score: 38.96153306484745
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Prior-Informed Flow Matching (PIFM), a conditional flow model for graph reconstruction. Reconstructing graphs from partial observations remains a key challenge; classical embedding methods often lack global consistency, while modern generative models struggle to incorporate structural priors. PIFM bridges this gap by integrating embedding-based priors with continuous-time flow matching. Grounded in a permutation equivariant version of the distortion-perception theory, our method first uses a prior, such as graphons or GraphSAGE/node2vec, to form an informed initial estimate of the adjacency matrix based on local information. It then applies rectified flow matching to refine this estimate, transporting it toward the true distribution of clean graphs and learning a global coupling. Experiments on different datasets demonstrate that PIFM consistently enhances classical embeddings, outperforming them and state-of-the-art generative baselines in reconstruction accuracy.
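The abstract describes a two-stage recipe: build a prior-informed initial estimate of the adjacency matrix from local information, then refine it with rectified flow matching toward the distribution of clean graphs. The sketch below is a minimal, hypothetical illustration of that recipe in PyTorch; the VelocityNet MLP, the toy prior, and the training loop are placeholders chosen for exposition, not the authors' PIFM implementation.
```python
# Minimal sketch of prior-informed rectified flow matching over adjacency
# matrices (hypothetical illustration, not the authors' implementation).
import torch
import torch.nn as nn

class VelocityNet(nn.Module):
    """Tiny MLP velocity field v_theta(A_t, t) on flattened adjacency matrices."""
    def __init__(self, n_nodes: int, hidden: int = 256):
        super().__init__()
        d = n_nodes * n_nodes
        self.net = nn.Sequential(
            nn.Linear(d + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, d),
        )
        self.n = n_nodes

    def forward(self, a_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        x = torch.cat([a_t.flatten(1), t[:, None]], dim=1)
        return self.net(x).view(-1, self.n, self.n)

def rectified_flow_loss(model, a_prior, a_clean):
    """Rectified-flow objective: regress the constant velocity (A_clean - A_prior)
    along the straight path A_t = (1 - t) * A_prior + t * A_clean."""
    t = torch.rand(a_clean.shape[0], device=a_clean.device)
    a_t = (1 - t)[:, None, None] * a_prior + t[:, None, None] * a_clean
    target = a_clean - a_prior
    return ((model(a_t, t) - target) ** 2).mean()

@torch.no_grad()
def refine(model, a_prior, steps: int = 50):
    """Euler integration from the prior estimate toward a clean adjacency."""
    a = a_prior.clone()
    for k in range(steps):
        t = torch.full((a.shape[0],), k / steps, device=a.device)
        a = a + model(a, t) / steps
    return a.clamp(0, 1)  # edge probabilities; threshold for a hard graph

# Toy usage: in PIFM the prior would come from graphons or GraphSAGE/node2vec
# edge scores; here a noisy copy of the clean graph stands in for it.
if __name__ == "__main__":
    n, batch = 16, 8
    model = VelocityNet(n)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    a_clean = (torch.rand(batch, n, n) > 0.7).float()
    a_clean = torch.triu(a_clean, 1)
    a_clean = a_clean + a_clean.transpose(1, 2)
    a_prior = 0.7 * a_clean + 0.3 * torch.rand_like(a_clean)  # stand-in prior
    for _ in range(200):
        opt.zero_grad()
        loss = rectified_flow_loss(model, a_prior, a_clean)
        loss.backward()
        opt.step()
    print(refine(model, a_prior).shape)
```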
Related papers
- GeodesicNVS: Probability Density Geodesic Flow Matching for Novel View Synthesis [54.39598154430305]
We propose a Data-to-Data Flow Matching framework that learns deterministic transformations directly between paired views.
PDG-FM constrains flow trajectories using geodesic interpolants derived from probability density metrics of pretrained diffusion models.
These results highlight the advantages of incorporating data-dependent geometric regularization into deterministic flow matching for consistent novel view generation.
arXiv Detail & Related papers (2026-03-01T09:30:11Z)
- Flowette: Flow Matching with Graphette Priors for Graph Generation [8.684988468368454]
Flowette is a continuous flow matching framework that employs a graph neural network-based transformer to learn a velocity field defined over graph representations with node and edge attributes.
To incorporate domain-driven structural priors, we introduce graphettes, a new probabilistic family of graph structure models that generalize graphons via controlled structural edits for motifs like rings, stars and trees.
Flowette demonstrates consistent improvements, highlighting the effectiveness of combining structural priors with flow-based training for modeling complex graph distributions.
arXiv Detail & Related papers (2026-02-27T00:22:21Z)
- Variational Bayesian Flow Network for Graph Generation [54.94088904387278]
We propose the Variational Bayesian Flow Network (VBFN) for graph generation.
VBFN performs variational lifting to a tractable joint Gaussian variational belief family governed by structured precisions.
On synthetic and molecular graph datasets, VBFN improves fidelity and diversity, and surpasses baseline methods.
arXiv Detail & Related papers (2026-01-30T03:59:38Z)
- On the Closed-Form of Flow Matching: Generalization Does Not Arise from Target Stochasticity [9.174891098192951]
We investigate why diffusion and flow matching techniques generalize so effectively.
We show that in high-dimensional settings, the generalization and closed-form versions of the flow matching loss yield nearly equivalent losses.
arXiv Detail & Related papers (2025-06-04T08:50:32Z)
- Solving Inverse Problems with FLAIR [68.87167940623318]
We present FLAIR, a training-free variational framework that leverages flow-based generative models as a prior for inverse problems.
Results on standard imaging benchmarks demonstrate that FLAIR consistently outperforms existing diffusion- and flow-based methods in terms of reconstruction quality and sample diversity.
arXiv Detail & Related papers (2025-06-03T09:29:47Z)
- Theoretical Foundation of Flow-Based Time Series Generation: Provable Approximation, Generalization, and Efficiency [7.578147116161996]
This paper presents the first theoretical framework for time series generation from the perspective of flow-based generative models.
Assuming a general data model, the fit of flow-based generative models is shown to converge to arbitrarily small error under the universal approximation capability of the Diffusion Transformer (DiT).
arXiv Detail & Related papers (2025-03-18T09:53:48Z)
- Sub-graph Based Diffusion Model for Link Prediction [43.15741675617231]
Denoising Diffusion Probabilistic Models (DDPMs) represent a contemporary class of generative models with exceptional qualities.
We build a novel generative model for link prediction using a dedicated design to decompose the likelihood estimation process via Bayes' formula.
Our proposed method presents numerous advantages: (1) transferability across datasets without retraining, (2) promising generalization on limited training data, and (3) robustness against graph adversarial attacks.
arXiv Detail & Related papers (2024-09-13T02:23:55Z)
- Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message-passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
arXiv Detail & Related papers (2024-02-13T14:13:17Z)
- Supercharging Graph Transformers with Advective Diffusion [28.40109111316014]
This paper proposes the Advective Diffusion Transformer (AdvDIFFormer), a physics-inspired graph Transformer model designed to address generalization under topological shifts.
We show that AdvDIFFormer has provable capability for controlling generalization error with topological shifts.
Empirically, the model demonstrates superiority in various predictive tasks across information networks, molecular screening and protein interactions.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Bayesian Structure Learning with Generative Flow Networks [85.84396514570373]
In Bayesian structure learning, we are interested in inferring a distribution over directed acyclic graphs (DAGs) from data.
Recently, a class of probabilistic models, called Generative Flow Networks (GFlowNets), have been introduced as a general framework for generative modeling.
We show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs.
arXiv Detail & Related papers (2022-02-28T15:53:10Z)
- Graphical Normalizing Flows [11.23030807455021]
Normalizing flows model complex probability distributions by transforming a base distribution through a series of invertible neural networks.
State-of-the-art architectures rely on coupling and autoregressive transformations to lift invertible functions from scalars to vectors (see the sketch after this entry).
We propose the graphical normalizing flow, a new invertible transformation with either a prescribed or a learnable graphical structure.
arXiv Detail & Related papers (2020-06-03T21:50:29Z)
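The entry above notes that coupling transformations are one way to turn scalar bijections into vector-valued invertible maps. Below is a minimal RealNVP-style affine coupling layer as a generic illustration of that idea; the AffineCoupling class and its hyperparameters are assumptions for exposition, not the architecture proposed in Graphical Normalizing Flows.
```python
# Generic affine coupling layer sketch (hypothetical illustration only).
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Splits the input, conditions an affine map of one half on the other."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        y2 = x2 * torch.exp(s) + t      # invertible given x1
        log_det = s.sum(dim=1)          # log|det Jacobian| of the coupling
        return torch.cat([x1, y2], dim=1), log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s, t = self.net(y1).chunk(2, dim=1)
        x2 = (y2 - t) * torch.exp(-s)
        return torch.cat([y1, x2], dim=1)

if __name__ == "__main__":
    layer = AffineCoupling(dim=8)
    x = torch.randn(4, 8)
    y, log_det = layer(x)
    print(torch.allclose(layer.inverse(y), x, atol=1e-5), log_det.shape)
```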