Navigation through Non-Compact Symmetric Spaces: a mathematical perspective on Cartan Neural Networks
- URL: http://arxiv.org/abs/2507.16871v1
- Date: Tue, 22 Jul 2025 07:34:53 GMT
- Title: Navigation through Non-Compact Symmetric Spaces: a mathematical perspective on Cartan Neural Networks
- Authors: Pietro Giuseppe Fré, Federico Milanesio, Guido Sanguinetti, Matteo Santoro
- Abstract summary: An initial implementation of these concepts has been presented in a twin paper under the moniker of Cartan Neural Networks. The current paper expands on the mathematical structures underpinning Cartan Neural Networks, detailing the geometric properties of the layers. Together, these papers constitute a first step towards a fully geometrically interpretable theory of neural networks exploiting group-theoretic structures.
- Score: 0.3749861135832073
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent work has identified non-compact symmetric spaces U/H as a promising class of homogeneous manifolds to develop a geometrically consistent theory of neural networks. An initial implementation of these concepts has been presented in a twin paper under the moniker of Cartan Neural Networks, showing both the feasibility and the performance of these geometric concepts in a machine learning context. The current paper expands on the mathematical structures underpinning Cartan Neural Networks, detailing the geometric properties of the layers and how the maps between layers interact with such structures to make Cartan Neural Networks covariant and geometrically interpretable. Together, these twin papers constitute a first step towards a fully geometrically interpretable theory of neural networks exploiting group-theoretic structures.
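As a purely illustrative aside (not code from the paper), the space of symmetric positive-definite (SPD) matrices, GL(n,R)/O(n), is a textbook non-compact symmetric space U/H. The short NumPy/SciPy sketch below checks numerically that the affine-invariant geodesic distance on this space is unchanged by the transitive group action P ↦ g P gᵀ; this is the kind of covariance property that, per the abstract, Cartan Neural Networks require of the maps between layers.

```python
# Purely illustrative sketch (not the authors' implementation): the SPD
# manifold GL(n,R)/O(n) is a standard non-compact symmetric space U/H.
# We check that the affine-invariant distance is preserved by the
# transitive group action P -> g P g^T.
import numpy as np
from scipy.linalg import expm, eigvalsh

def random_spd(n, rng):
    """Sample P = exp(S) with S symmetric, i.e. a point of the SPD manifold."""
    a = rng.standard_normal((n, n))
    return expm(0.5 * (a + a.T))

def group_action(g, p):
    """Transitive GL(n,R) action on SPD matrices."""
    return g @ p @ g.T

def geodesic_distance(p, q):
    """Affine-invariant distance: sqrt(sum_i log(lambda_i)^2), where the
    lambda_i are the (real, positive) generalized eigenvalues of (q, p)."""
    lam = eigvalsh(q, p)
    return float(np.sqrt(np.sum(np.log(lam) ** 2)))

rng = np.random.default_rng(0)
p, q = random_spd(3, rng), random_spd(3, rng)
g = expm(rng.standard_normal((3, 3)))   # a generic invertible group element

print(geodesic_distance(p, q))                                    # some value d
print(geodesic_distance(group_action(g, p), group_action(g, q)))  # same value d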
Related papers
- Representation Learning of Geometric Trees [9.280083998326285]
We introduce a new representation learning framework tailored for geometric trees.
It features a unique message passing neural network that is both provably able to recover geometric structure and rotation-translation invariant.
We validate our method's effectiveness on eight real-world datasets, demonstrating its capability to represent geometric trees.
arXiv Detail & Related papers (2024-08-16T15:16:35Z)
- Tropical Expressivity of Neural Networks [0.0]
We use tropical geometry to characterize and study various architectural aspects of neural networks.
We present a new algorithm that computes the exact number of their linear regions.
arXiv Detail & Related papers (2024-05-30T15:45:03Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- A singular Riemannian Geometry Approach to Deep Neural Networks III. Piecewise Differentiable Layers and Random Walks on $n$-dimensional Classes [49.32130498861987]
Two recent works introduced a geometric framework to study neural networks.
We study the case of non-differentiable activation functions, such as ReLU.
We illustrate our findings with some numerical experiments on classification of images and thermodynamic problems.
arXiv Detail & Related papers (2024-04-09T08:11:46Z)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- Riemannian Residual Neural Networks [58.925132597945634]
We show how to extend the residual neural network (ResNet) to general Riemannian manifolds (see the illustrative sketch after this list).
ResNets have become ubiquitous in machine learning due to their beneficial learning properties, excellent empirical results, and easy-to-incorporate nature when building varied neural networks.
arXiv Detail & Related papers (2023-10-16T02:12:32Z)
- A Structural Approach to the Design of Domain Specific Neural Network Architectures [0.0]
This thesis aims to provide a theoretical evaluation of geometric deep learning.
It compiles theoretical results that characterize the properties of invariant neural networks with respect to learning performance.
arXiv Detail & Related papers (2023-01-23T11:50:57Z)
- A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes [78.120734120667]
We build the preimage of a point in the output manifold in the input space.
We focus for simplicity on the case of neural networks mapping n-dimensional real spaces to (n - 1)-dimensional real spaces.
arXiv Detail & Related papers (2021-12-17T11:47:45Z)
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on the case of maps implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
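Related to the Riemannian Residual Neural Networks entry above (see the forward reference there): a residual update on a manifold can be written as x_{k+1} = exp_{x_k}(v_k), where v_k is a learned tangent vector at x_k. The sketch below is my own illustration on the hyperboloid model of hyperbolic space; the tanh feature map and all function names are arbitrary choices, not taken from that paper or from the Cartan Neural Network construction.

```python
# Minimal sketch of a "Riemannian residual" update on the hyperboloid model of
# hyperbolic space: x_{k+1} = exp_x( f(x) ), with f(x) projected onto the
# tangent space T_x. Illustration only; not the architecture of any cited paper.
import numpy as np

def minkowski_dot(u, v):
    """Lorentzian inner product <u, v> = -u_0 v_0 + sum_i u_i v_i."""
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def project_to_tangent(x, u):
    """Project an ambient vector u onto the tangent space at x (<x, v> = 0)."""
    return u + minkowski_dot(x, u) * x

def exp_map(x, v, eps=1e-9):
    """Riemannian exponential map on the hyperboloid."""
    norm_v = np.sqrt(max(minkowski_dot(v, v), 0.0))
    if norm_v < eps:
        return x
    return np.cosh(norm_v) * x + np.sinh(norm_v) * v / norm_v

def residual_layer(x, W, b):
    """One residual block: an arbitrary feature map, projected and exponentiated."""
    u = np.tanh(W @ x + b)          # hypothetical feature map in ambient coordinates
    v = project_to_tangent(x, u)    # make it a valid tangent vector at x
    return exp_map(x, v)            # move along the geodesic it defines

rng = np.random.default_rng(0)
dim = 4                             # ambient dimension (hyperbolic dimension 3)
x = np.zeros(dim); x[0] = 1.0       # base point of the hyperboloid
for _ in range(3):                  # stack a few residual blocks
    W, b = rng.standard_normal((dim, dim)), rng.standard_normal(dim)
    x = residual_layer(x, W, b)
print("on-manifold check <x,x> =", minkowski_dot(x, x))   # should stay close to -1
```

The final print confirms that the iterates remain (numerically) on the manifold, which is the basic consistency property any geometrically interpretable layer of this kind must preserve.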