Triangular tensor networks, pencils of matrices and beyond
- URL: http://arxiv.org/abs/2602.15114v1
- Date: Mon, 16 Feb 2026 19:00:04 GMT
- Title: Triangular tensor networks, pencils of matrices and beyond
- Authors: Alessandra Bernardi, Fulvio Gesmundo,
- Abstract summary: We study tensor network varieties associated with the triangular graph, with a focus on the case where one of the physical dimensions is 2. We provide a complete characterization of these varieties in terms of the Kronecker invariants of pencils.
- Score: 45.88028371034407
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study tensor network varieties associated with the triangular graph, with a focus on the case where one of the physical dimensions is 2. This allows us to interpret the tensors as pencils of matrices. We provide a complete characterization of these varieties in terms of the Kronecker invariants of pencils. We determine their dimension, identifying the cases for which the dimension is smaller than the expected parameter count. We provide necessary conditions for membership in these varieties, in terms of the geometry of classical determinantal varieties, coincident root loci and plane cubic curves. We address some extensions to arbitrary graphs.
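The key observation in the abstract — that a tensor with one physical dimension equal to 2 can be interpreted as a pencil of matrices — can be illustrated with a minimal NumPy sketch. This is an illustrative example only, not code from the paper: the two slices of a 2 × m × n tensor along the 2-dimensional mode give matrices A and B, and the pencil is the family s·A + t·B.

```python
import numpy as np

# A tensor T in C^2 (x) C^m (x) C^n with one physical dimension equal to 2
# can be sliced along the 2-dimensional mode into two m x n matrices,
# A = T[0] and B = T[1]; the associated matrix pencil is P(s, t) = s*A + t*B.
# (Sketch with arbitrary random entries, not data from the paper.)

rng = np.random.default_rng(0)
m, n = 3, 4
T = rng.standard_normal((2, m, n))

A, B = T[0], T[1]

def pencil(s, t):
    """Evaluate the matrix pencil s*A + t*B at the point (s, t)."""
    return s * A + t * B

# For a generic tensor, the pencil has full rank min(m, n) at a generic
# point (s, t); the loci where the rank drops are what invariants such as
# the Kronecker normal form of the pencil keep track of.
generic_rank = np.linalg.matrix_rank(pencil(1.0, 2.0))
print(generic_rank)
```

Evaluating the pencil at (1, 0) and (0, 1) recovers the two slices A and B, which is the sense in which the tensor and the pencil carry the same data.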
Related papers
- Geometry of fibers of the multiplication map of deep linear neural networks [0.0]
We study the geometry of the set of quivers of composable matrices which multiply to a fixed matrix. Our solution is presented in three forms: a Poincaré series in equivariant cohomology, a quadratic integer program, and an explicit formula.
arXiv Detail & Related papers (2024-11-29T18:36:03Z) - On the Geometry and Optimization of Polynomial Convolutional Networks [2.9816332334719773]
We study convolutional neural networks with monomial activation functions. We compute the dimension and the degree of the neuromanifold, which measure the expressivity of the model. For a generic large dataset, we derive an explicit formula that quantifies the number of critical points arising in the optimization of a regression loss.
arXiv Detail & Related papers (2024-10-01T14:13:05Z) - Geometry of Lightning Self-Attention: Identifiability and Dimension [2.9816332334719773]
We study the identifiability of deep attention by providing a description of the generic fibers of the parametrization for an arbitrary number of layers. For a single-layer model, we characterize the singular and boundary points. Finally, we formulate a conjectural extension of our results to normalized self-attention networks, prove it for a single layer, and numerically verify it in the deep case.
arXiv Detail & Related papers (2024-08-30T12:00:36Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Machine learning detects terminal singularities [49.1574468325115]
Q-Fano varieties are positively curved shapes which have Q-factorial terminal singularities.
Despite their importance, the classification of Q-Fano varieties remains unknown.
In this paper we demonstrate that machine learning can be used to understand this classification.
arXiv Detail & Related papers (2023-10-31T13:51:24Z) - Superposed Random Spin Tensor Networks and their Holographic Properties [0.0]
We study boundary-to-boundary holography in a class of spin network states defined by analogy to projected entangled pair states (PEPS).
We consider superpositions of states corresponding to well-defined, discrete geometries on a graph.
arXiv Detail & Related papers (2022-05-19T12:24:57Z) - Dimension of Tensor Network varieties [68.8204255655161]
We determine an upper bound on the dimension of the tensor network variety.
A refined upper bound is given in cases relevant for applications, such as varieties of matrix product states and projected entangled pair states.
arXiv Detail & Related papers (2021-01-08T18:24:50Z) - Primal-Dual Mesh Convolutional Neural Networks [62.165239866312334]
We adapt a primal-dual framework drawn from the graph-neural-network literature to triangle meshes.
Our method takes features for both edges and faces of a 3D mesh as input and dynamically aggregates them.
We provide theoretical insights into our approach using tools from the mesh-simplification literature.
arXiv Detail & Related papers (2020-10-23T14:49:02Z) - Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.