Learning the Structure of Connection Graphs
- URL: http://arxiv.org/abs/2510.11245v1
- Date: Mon, 13 Oct 2025 10:33:31 GMT
- Title: Learning the Structure of Connection Graphs
- Authors: Leonardo Di Nino, Gabriele D'Acunto, Sergio Barbarossa, Paolo Di Lorenzo
- Abstract summary: Connection graphs (CGs) extend traditional graph models by coupling network topology with transformations, enabling the representation of global geometric consistency. We propose a principled framework based on maximum pseudo-likelihood under a consistency assumption, which enforces spectral properties linking the connection Laplacian to the underlying Laplacian. We introduce the Structured Connection Graph Learning (SCGL) algorithm, a block-optimization procedure that jointly infers network topology, edge weights, and geometric structure.
- Score: 13.687470962704744
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Connection graphs (CGs) extend traditional graph models by coupling network topology with orthogonal transformations, enabling the representation of global geometric consistency. They play a key role in applications such as synchronization, Riemannian signal processing, and neural sheaf diffusion. In this work, we address the inverse problem of learning CGs directly from observed signals. We propose a principled framework based on maximum pseudo-likelihood under a consistency assumption, which enforces spectral properties linking the connection Laplacian to the underlying combinatorial Laplacian. Based on this formulation, we introduce the Structured Connection Graph Learning (SCGL) algorithm, a block-optimization procedure over Riemannian manifolds that jointly infers network topology, edge weights, and geometric structure. Our experiments show that SCGL consistently outperforms existing baselines in both topological recovery and geometric fidelity, while remaining computationally efficient.
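To make these objects concrete, here is a minimal numpy sketch — illustrative only, not the SCGL algorithm — of a connection Laplacian on a toy consistent CG and the spectral link to the combinatorial Laplacian that the consistency assumption enforces. The triangle graph, weights, and node frames are arbitrary choices for illustration:

```python
import numpy as np

def rot(t):
    """2-D rotation by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

def connection_laplacian(n, d, edges):
    """Block connection Laplacian: diagonal blocks are (weighted degree) * I_d,
    off-diagonal blocks are -w_ij * O_ij for the orthogonal edge transport O_ij."""
    L = np.zeros((n * d, n * d))
    for (i, j), (w, O) in edges.items():
        L[i*d:(i+1)*d, i*d:(i+1)*d] += w * np.eye(d)
        L[j*d:(j+1)*d, j*d:(j+1)*d] += w * np.eye(d)
        L[i*d:(i+1)*d, j*d:(j+1)*d] = -w * O
        L[j*d:(j+1)*d, i*d:(i+1)*d] = -w * O.T
    return L

# Toy consistent CG on a weighted triangle: pick a 2-D frame R_i per node and
# set each transport to O_ij = R_i R_j^T, so transports compose to the
# identity around every cycle (the consistency assumption).
n, d = 3, 2
R = [rot(t) for t in (0.3, -1.1, 2.0)]
W = np.array([[0, 1.0, 2.0], [1.0, 0, 0.5], [2.0, 0.5, 0]])
edges = {(i, j): (W[i, j], R[i] @ R[j].T)
         for i in range(n) for j in range(i + 1, n) if W[i, j] > 0}
Lc = connection_laplacian(n, d, edges)

# Spectral link: under consistency, Lc is similar to L ⊗ I_d, so its
# eigenvalues are the combinatorial ones, each with multiplicity d.
L = np.diag(W.sum(1)) - W
assert np.allclose(np.linalg.eigvalsh(Lc), np.repeat(np.linalg.eigvalsh(L), d))
```

The final assertion is exactly the consistency property the paper exploits: breaking cycle-consistency of the transports breaks the eigenvalue match.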
Related papers
- The Neural Differential Manifold: An Architecture with Explicit Geometric Structure [8.201374511929538]
This paper introduces the Neural Differential Manifold (NDM), a novel neural network architecture that explicitly incorporates geometric structure into its fundamental design. We analyze the theoretical advantages of this approach, including its potential for more efficient optimization, enhanced continual learning, and applications in scientific discovery and controllable generative modeling.
arXiv Detail & Related papers (2025-10-29T02:24:27Z)
- Discrete Functional Geometry of ReLU Networks via ReLU Transition Graphs [0.0]
We extend the ReLU Transition Graph (RTG) framework into a comprehensive graph-theoretic model for understanding deep ReLU networks. In this model, each node represents a linear activation region, and edges connect regions that differ by a single ReLU activation flip.
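The node/edge construction above can be illustrated with a toy one-hidden-layer ReLU net (not the paper's model; layer sizes and sampling range are arbitrary assumptions): sample the input plane, record the distinct activation patterns (one per visited linear region), and connect patterns that differ by exactly one flip:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Tiny ReLU layer: 4 hidden units over a 2-D input. Each unit's on/off
# state picks a side of a hyperplane; a full activation pattern indexes
# one linear region of the network.
W = rng.standard_normal((4, 2))
b = rng.standard_normal(4)

# Sample the input plane and collect the distinct activation patterns.
xs = rng.uniform(-3, 3, size=(5000, 2))
on = (xs @ W.T + b > 0).astype(int)
patterns = {tuple(row) for row in on}

# RTG edges: patterns differing in exactly one ReLU activation flip.
edges = [(p, q) for p, q in combinations(sorted(patterns), 2)
         if sum(a != c for a, c in zip(p, q)) == 1]

print(len(patterns), "regions,", len(edges), "transition edges")
```

With 4 hyperplanes in the plane, at most 11 regions exist, so the sampled pattern set is small enough to inspect by hand.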
arXiv Detail & Related papers (2025-09-03T06:38:22Z)
- A Remedy for Over-Squashing in Graph Learning via Forman-Ricci Curvature based Graph-to-Hypergraph Structural Lifting [0.0]
We propose a structural lifting strategy using Forman-Ricci curvature, which defines an edge-based network characteristic. Curvature reveals local and global properties of a graph, such as a network's backbones. Our approach provides a remedy to the problem of information distortion in message passing across long distances and graph bottlenecks.
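The edge-based characteristic mentioned above can be shown in its simplest combinatorial form (unweighted graph, no triangle/2-cell contributions — the paper's lifting works with a richer variant): for an edge (u, v), F(u, v) = 4 − deg(u) − deg(v), so strongly negative edges flag bottlenecks and backbone edges.

```python
# Simplest combinatorial Forman-Ricci curvature of an edge:
#   F(u, v) = 4 - deg(u) - deg(v)
def forman_curvature(adj, u, v):
    deg = {x: len(neigh) for x, neigh in adj.items()}
    return 4 - deg[u] - deg[v]

# Path graph 0-1-2-3: the interior edge is flatter than the end edges.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(forman_curvature(adj, 0, 1))  # 4 - 1 - 2 = 1
print(forman_curvature(adj, 1, 2))  # 4 - 2 - 2 = 0
```

On larger graphs, edges between two hubs get the most negative values, which is exactly where over-squashing concentrates.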
arXiv Detail & Related papers (2025-08-15T10:46:27Z)
- Adaptive Riemannian Graph Neural Networks [29.859977834688625]
We introduce a novel framework that learns a continuous and anisotropic metric tensor field over the graph. It allows each node to determine its optimal local geometry, enabling the model to fluidly adapt to the graph's structural landscape. Our method demonstrates superior performance on both homophilic and heterophilic benchmarks.
arXiv Detail & Related papers (2025-08-04T16:55:02Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- Graph Spectral Embedding using the Geodesic Betweeness Centrality [76.27138343125985]
We introduce the Graph Sylvester Embedding (GSE), an unsupervised graph representation of local similarity, connectivity, and global structure.
GSE uses the solution of the Sylvester equation to capture both network structure and neighborhood proximity in a single representation.
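As a sketch of the linear-algebraic core only — the Sylvester solve itself, not GSE's embedding — here is a small vectorization-based solver for A X + X B = C, using the row-major identity vec(A X B) = (A ⊗ Bᵀ) vec(X). For anything beyond toy sizes, `scipy.linalg.solve_sylvester` is the efficient choice:

```python
import numpy as np

def solve_sylvester(A, B, C):
    """Solve A X + X B = C by Kronecker vectorization (fine for small
    matrices; unique solution when A and -B share no eigenvalue)."""
    n, m = A.shape[0], B.shape[0]
    M = np.kron(A, np.eye(m)) + np.kron(np.eye(n), B.T)
    return np.linalg.solve(M, C.reshape(n * m)).reshape(n, m)

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((4, 4))
C = rng.standard_normal((3, 4))
X = solve_sylvester(A, B, C)
assert np.allclose(A @ X + X @ B, C)  # residual check of the solve
```

The O(n³m³) dense solve here is the naive route; Bartels-Stewart (what SciPy uses) brings this down to O(n³ + m³).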
arXiv Detail & Related papers (2022-05-07T04:11:23Z)
- Learning Connectivity with Graph Convolutional Networks for Skeleton-based Action Recognition [14.924672048447338]
We introduce a novel framework for graph convolutional networks that learns the topological properties of graphs.
The design principle of our method is based on the optimization of a constrained objective function.
Experiments conducted on the challenging task of skeleton-based action recognition show the superiority of the proposed method.
arXiv Detail & Related papers (2021-12-06T19:43:26Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantee.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- Grid-to-Graph: Flexible Spatial Relational Inductive Biases for Reinforcement Learning [8.169818701603313]
We show that we can incorporate relational inductive biases, encoded in the form of relational graphs, into agents.
We propose Grid-to-Graph (GTG), a mapping from grid structures to relational graphs that carry useful inductive biases.
We show that GTG produces agents that can jointly reason over observations and environment encoded dynamics in knowledge bases.
arXiv Detail & Related papers (2021-02-08T14:15:13Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs).
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
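The affine skip connection from the last entry above can be sketched as a graph convolution summed with a fully connected affine map of the input. This is an illustrative sketch, not the paper's exact operator: the GCN-style symmetric normalization, layer sizes, and ReLU are assumptions made here for concreteness:

```python
import numpy as np

def affine_skip_gcn_layer(A, X, W_conv, W_skip, b):
    """One graph-convolution layer with an affine skip connection:
    output = ReLU( A_norm X W_conv + (X W_skip + b) ), where the second
    term is a fully connected affine transform applied nodewise."""
    A_hat = A + np.eye(len(A))                      # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(1)))
    A_norm = d_inv_sqrt @ A_hat @ d_inv_sqrt        # symmetric normalization
    return np.maximum(A_norm @ X @ W_conv + (X @ W_skip + b), 0.0)

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)  # 3-node path graph
X = rng.standard_normal((3, 5))                          # 5 features per node
H = affine_skip_gcn_layer(A, X,
                          rng.standard_normal((5, 8)),   # conv weights
                          rng.standard_normal((5, 8)),   # skip weights
                          np.zeros(8))                   # skip bias
assert H.shape == (3, 8)
```

Because the skip path bypasses neighborhood aggregation, each node retains an affine image of its own features regardless of how the convolution mixes the neighborhood.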
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.