FlowSymm: Physics Aware, Symmetry Preserving Graph Attention for Network Flow Completion
- URL: http://arxiv.org/abs/2601.22317v1
- Date: Thu, 29 Jan 2026 20:56:58 GMT
- Title: FlowSymm: Physics Aware, Symmetry Preserving Graph Attention for Network Flow Completion
- Authors: Ege Demirci, Francesco Bullo, Ananthram Swami, Ambuj Singh
- Abstract summary: FlowSymm is a novel architecture that combines a group-action on divergence-free flows and a graph-attention encoder to learn feature-conditioned weights over these symmetry-preserving actions. It outperforms state-of-the-art baselines in RMSE, MAE and correlation metrics.
- Score: 14.721288177297017
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recovering missing flows on the edges of a network, while exactly respecting local conservation laws, is a fundamental inverse problem that arises in many systems such as transportation, energy, and mobility. We introduce FlowSymm, a novel architecture that combines (i) a group-action on divergence-free flows, (ii) a graph-attention encoder to learn feature-conditioned weights over these symmetry-preserving actions, and (iii) a lightweight Tikhonov refinement solved via implicit bilevel optimization. The method first anchors the given observation on a minimum-norm divergence-free completion. We then compute an orthonormal basis for all admissible group actions that leave the observed flows invariant and parameterize the valid solution subspace, which shows an Abelian group structure under vector addition. A stack of GATv2 layers then encodes the graph and its edge features into per-edge embeddings, which are pooled over the missing edges and produce per-basis attention weights. This attention-guided process selects a set of physics-aware group actions that preserve the observed flows. Finally, a scalar Tikhonov penalty refines the missing entries via a convex least-squares solver, with gradients propagated implicitly through Cholesky factorization. Across three real-world flow benchmarks (traffic, power, bike), FlowSymm outperforms state-of-the-art baselines in RMSE, MAE and correlation metrics.
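To make the abstract's pipeline concrete, here is a minimal NumPy sketch of its two purely linear-algebraic stages: the minimum-norm divergence-free anchor and the null-space ("group action") parameterization with a Tikhonov-regularized, Cholesky-solved refinement. This is an illustrative reconstruction, not the authors' code; the GATv2 attention stage is not modeled, and the hypothetical `f_hat` argument stands in for its per-edge prediction.

```python
import numpy as np

def complete_flows(B, f_obs, obs_idx, mis_idx, f_hat=None, lam=1e-2):
    """Divergence-free flow completion (illustrative sketch).

    B       : (n_nodes, n_edges) signed incidence matrix, so B @ f = 0
              expresses flow conservation at every node.
    f_obs   : observed flows on the edges in obs_idx.
    f_hat   : optional prior for the missing edges (a stand-in for the
              attention-guided prediction, which this sketch omits).
    """
    mis_idx = np.asarray(mis_idx)
    B_m = B[:, mis_idx]
    b = -B[:, obs_idx] @ f_obs            # conservation fixes B_m @ x = b
    # Step 1: minimum-norm divergence-free completion (the anchor);
    # lstsq returns the minimum-norm solution of the underdetermined system.
    x, *_ = np.linalg.lstsq(B_m, b, rcond=None)
    # Step 2: orthonormal basis of null(B_m). Each column is an admissible
    # action that perturbs x without violating conservation or touching
    # the observed flows; the basis is closed under vector addition.
    _, s, Vt = np.linalg.svd(B_m)         # full_matrices=True by default
    rank = int(np.sum(s > 1e-10 * s.max())) if s.size else 0
    N = Vt[rank:].T                       # shape (|mis|, k)
    # Step 3: Tikhonov-regularized coefficients over the basis, pulling
    # the completion toward the prior f_hat. The normal equations are
    # symmetric positive definite, hence solvable via Cholesky.
    if N.shape[1] > 0 and f_hat is not None:
        A = N.T @ N + lam * np.eye(N.shape[1])
        c = np.linalg.cholesky(A)
        w = np.linalg.solve(c.T, np.linalg.solve(c, N.T @ (f_hat - x)))
        x = x + N @ w
    f = np.zeros(B.shape[1])
    f[obs_idx] = f_obs
    f[mis_idx] = x
    return f
```

For example, on a two-node graph with two parallel edges and one observed return edge carrying flow 3, the anchor splits the missing flow evenly (1.5 each), and a prior nudges the split along the one-dimensional null space while conservation stays exact.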
Related papers
- Entropy-Controlled Flow Matching [0.08460698440162889]
We propose a constrained variational principle over continuity-equation paths enforcing a global entropy-rate budget d/dt H(mu_t) >= -lambda. We obtain certificate-style mode-coverage and density-floor guarantees under Lipschitz assumptions, and construct near-optimal counterexamples for unconstrained flow matching.
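In standard notation (a reconstruction from the snippet, assuming $v_t$ denotes the velocity field of the path), the stated budget reads:

$$
\partial_t \mu_t + \nabla \cdot (\mu_t v_t) = 0, \qquad \frac{d}{dt} H(\mu_t) \ge -\lambda,
$$

where $H(\mu_t) = -\int \mu_t \log \mu_t \, dx$ is the differential entropy, so the constraint caps how fast the flow may concentrate mass.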
arXiv Detail & Related papers (2026-02-25T06:07:01Z) - Riemannian Flow Matching for Disentangled Graph Domain Adaptation [51.98961391065951]
Graph Domain Adaptation (GDA) typically uses adversarial learning to align graph embeddings in Euclidean space. DisRFM is a geometry-aware GDA framework that unifies embedding and flow-based transport.
arXiv Detail & Related papers (2026-01-31T11:05:35Z) - Variational Bayesian Flow Network for Graph Generation [54.94088904387278]
We propose Variational Bayesian Flow Network (VBFN) for graph generation. VBFN performs variational lifting to a tractable joint Gaussian variational belief family governed by structured precisions. On synthetic and molecular graph datasets, VBFN improves fidelity and diversity, and surpasses baseline methods.
arXiv Detail & Related papers (2026-01-30T03:59:38Z) - A scalable flow-based approach to mitigate topological freezing [34.54607280864912]
We present a flow-based strategy to remove topological artifacts from Markov Chain Monte Carlo simulations. The strategy is based on a Stochastic Normalizing Flow (SNF) that alternates non-equilibrium Monte Carlo updates with localized, stout-equivariant defect layers. We show that defect SNFs achieve better performance than non-equilibrium methods at comparable cost.
arXiv Detail & Related papers (2026-01-28T15:40:46Z) - Terminally constrained flow-based generative models from an optimal control perspective [32.87833798690545]
Terminal Optimal Control with Flow-based models (TOCFlow) is a geometry-aware sampling-time guidance method for pre-trained flows. We show that as the control penalty increases, the controlled process recovers the reference distribution, while as the penalty vanishes, the terminal law converges to a generalized Wasserstein projection onto the constraint manifold. We evaluate TOCFlow on three high-dimensional scientific tasks spanning equality, inequality, and global statistical constraints.
arXiv Detail & Related papers (2026-01-14T13:32:15Z) - Scaling of Stochastic Normalizing Flows in $\mathrm{SU}(3)$ lattice gauge theory [44.99833362998488]
Non-equilibrium Markov Chain Monte Carlo simulations provide a well-understood framework based on Jarzynski's equality to sample from a target probability distribution. Out-of-equilibrium evolutions share the same framework as flow-based approaches, and they can be naturally combined into a novel architecture called Stochastic Normalizing Flows (SNFs). We present the first implementation of SNFs for $\mathrm{SU}(3)$ lattice gauge theory in 4 dimensions, defined by introducing gauge-equivariant layers between out-of-equilibrium Monte Carlo updates.
arXiv Detail & Related papers (2024-11-29T19:01:05Z) - Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, a novel approach to denoising 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z) - GAFlow: Incorporating Gaussian Attention into Optical Flow [62.646389181507764]
We push Gaussian Attention (GA) into the optical flow models to accentuate local properties during representation learning.
We introduce a novel Gaussian-Constrained Layer (GCL) which can be easily plugged into existing Transformer blocks.
For reliable motion analysis, we provide a new Gaussian-Guided Attention Module (GGAM).
arXiv Detail & Related papers (2023-09-28T07:46:01Z) - Quantum State Assignment Flows [3.7886425043810905]
This paper introduces assignment flows whose state spaces represent data associated with the layers of an underlying weighted graph. This novel approach to data representation and analysis, including entangled representations of data across the graph, is presented.
arXiv Detail & Related papers (2023-06-30T18:29:14Z) - GMFlow: Learning Optical Flow via Global Matching [124.57850500778277]
We propose GMFlow, a framework for learning optical flow estimation.
It consists of three main components: a customized Transformer for feature enhancement, a correlation and softmax layer for global feature matching, and a self-attention layer for flow propagation.
Our new framework outperforms 32-iteration RAFT on the challenging Sintel benchmark.
arXiv Detail & Related papers (2021-11-26T18:59:56Z) - Assignment Flows for Data Labeling on Graphs: Convergence and Stability [69.68068088508505]
This paper establishes conditions on the weight parameters that guarantee convergence of the continuous-time assignment flow to integral assignments (labelings).
Several counter-examples illustrate that violating the conditions may entail unfavorable behavior of the assignment flow regarding contextual data classification.
arXiv Detail & Related papers (2020-02-26T15:45:38Z) - A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation.
Compared with existing schemes, the Wasserstein gradient flow is a smoother, near-optimal numerical scheme for approximating real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.