The Geometry of Machine Learning Models
- URL: http://arxiv.org/abs/2508.02080v1
- Date: Mon, 04 Aug 2025 05:45:52 GMT
- Title: The Geometry of Machine Learning Models
- Authors: Pawel Gajer, Jacques Ravel
- Abstract summary: This paper presents a framework for analyzing machine learning models through the geometry of their induced partitions. For neural networks, we introduce a differential forms approach that tracks geometric structure through layers via pullback operations. While focused on mathematical foundations, this geometric perspective offers new approaches to model interpretation, regularization, and diagnostic tools for understanding learning dynamics.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a mathematical framework for analyzing machine learning models through the geometry of their induced partitions. By representing partitions as Riemannian simplicial complexes, we capture not only adjacency relationships but also geometric properties including cell volumes, volumes of faces where cells meet, and dihedral angles between adjacent cells. For neural networks, we introduce a differential forms approach that tracks geometric structure through layers via pullback operations, making computations tractable by focusing on data-containing cells. The framework enables geometric regularization that directly penalizes problematic spatial configurations and provides new tools for model refinement through extended Laplacians and simplicial splines. We also explore how data distribution induces effective geometric curvature in model partitions, developing discrete curvature measures for vertices that quantify local geometric complexity and statistical Ricci curvature for edges that captures pairwise relationships between cells. While focused on mathematical foundations, this geometric perspective offers new approaches to model interpretation, regularization, and diagnostic tools for understanding learning dynamics.
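The paper's core objects, partitions induced by a model together with per-cell geometric quantities such as volumes, can be illustrated with a toy computation. The sketch below is not the paper's implementation: the single random ReLU layer, the uniform data distribution on the unit square, and the Monte Carlo sample budget are all assumptions. It enumerates the activation-pattern cells of the layer and estimates the volume of each data-containing cell, mirroring the paper's focus on cells that actually contain data.

```python
import random

random.seed(0)

# A toy ReLU feature map R^2 -> R^3: three random hyperplanes partition the plane.
W = [[random.gauss(0, 1) for _ in range(2)] for _ in range(3)]
b = [random.gauss(0, 1) for _ in range(3)]

def activation_pattern(x):
    """Sign pattern of the pre-activations: identifies the linear cell containing x."""
    return tuple(1 if sum(w_i * x_i for w_i, x_i in zip(w, x)) + bi > 0 else 0
                 for w, bi in zip(W, b))

# Monte Carlo estimate of cell volumes, restricted to data-containing cells.
n = 20000
counts = {}
for _ in range(n):
    x = (random.uniform(0, 1), random.uniform(0, 1))
    p = activation_pattern(x)
    counts[p] = counts.get(p, 0) + 1

volumes = {p: c / n for p, c in counts.items()}
print(len(volumes), "nonempty cells out of", 2 ** len(b), "possible patterns")
print("total estimated volume:", round(sum(volumes.values()), 6))
```

Because three lines in general position cut the plane into at most seven regions, only a fraction of the 2^3 = 8 activation patterns correspond to nonempty cells, which is the tractability observation the abstract makes.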
Related papers
- Geometric Operator Learning with Optimal Transport [77.16909146519227]
We propose integrating optimal transport (OT) into operator learning for partial differential equations (PDEs) on complex geometries. For 3D simulations focused on surfaces, our OT-based neural operator embeds the surface geometry into a 2D parameterized latent space. Experiments with the Reynolds-averaged Navier-Stokes (RANS) equations on the ShapeNet-Car and DrivAerNet-Car datasets show that our method achieves higher accuracy while reducing computational cost.
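The OT-based embedding idea can be illustrated in its simplest setting. The sketch below is not the paper's neural operator; it only shows the one-dimensional optimal transport map (sorted matching, which is the Monge map for quadratic cost on the line), the basic primitive such OT-based parameterizations build on. The function name and sample values are illustrative assumptions.

```python
def monge_map_1d(source, target):
    """1D optimal transport between equal-size samples: sort both sides
    and match in order (optimal for quadratic cost on the line)."""
    order = sorted(range(len(source)), key=lambda i: source[i])
    target_sorted = sorted(target)
    mapping = [0.0] * len(source)
    for rank, i in enumerate(order):
        mapping[i] = target_sorted[rank]
    return mapping

src = [0.9, 0.1, 0.5]
tgt = [10.0, 20.0, 30.0]
print(monge_map_1d(src, tgt))  # [30.0, 10.0, 20.0]
```

The smallest source value (0.1) is sent to the smallest target value (10.0), and so on; higher-dimensional OT generalizes this monotone matching.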
arXiv Detail & Related papers (2025-07-26T21:28:25Z) - Geometric Embedding Alignment via Curvature Matching in Transfer Learning [4.739852004969771]
We introduce a novel approach that integrates multiple models into a unified transfer learning framework. By aligning the Ricci curvature of the latent spaces of individual models, we construct an interrelated architecture. This framework enables the effective aggregation of knowledge from diverse sources, thereby improving performance on target tasks.
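The curvature-alignment idea can be sketched with a simpler, purely combinatorial curvature than the one the paper uses. Below, Forman-Ricci curvature in its simplest unweighted form (F(u, v) = 4 - deg(u) - deg(v)) stands in for a curvature notion on latent-space graphs, and a crude alignment score compares sorted curvature profiles; both helper names and the toy graphs are assumptions, not the paper's method.

```python
def forman_curvature(edges, n_nodes):
    """Forman-Ricci curvature of each edge in an unweighted graph:
    F(u, v) = 4 - deg(u) - deg(v)  (simplest combinatorial form)."""
    deg = [0] * n_nodes
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return [4 - deg[u] - deg[v] for u, v in edges]

def curvature_alignment_loss(curv_a, curv_b):
    """Mean squared difference between sorted curvature profiles:
    a crude score for how geometrically similar two graphs are."""
    a, b = sorted(curv_a), sorted(curv_b)
    m = min(len(a), len(b))
    return sum((x - y) ** 2 for x, y in zip(a[:m], b[:m])) / m

# Two toy latent-space graphs: a 4-cycle (flat) and a 4-path (curved endpoints).
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
path = [(0, 1), (1, 2), (2, 3)]
print(forman_curvature(cycle, 4))  # [0, 0, 0, 0]: every edge has 4 - 2 - 2 = 0
print(curvature_alignment_loss(forman_curvature(cycle, 4),
                               forman_curvature(path, 4)))
```

Minimizing such a score over model parameters would pull the two latent geometries toward matching curvature profiles, which is the spirit of the alignment described above.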
arXiv Detail & Related papers (2025-06-16T00:54:22Z) - SVarM: Linear Support Varifold Machines for Classification and Regression on Geometric Data [4.212663349859165]
This work proposes SVarM to exploit varifold representations of shapes as measures and their duality with test functions. We develop classification and regression models on shape datasets by introducing a neural network-based representation of the trainable test function.
arXiv Detail & Related papers (2025-06-01T21:55:15Z) - Flexible Mesh Segmentation via Reeb Graph Representation of Geometrical and Topological Features [0.0]
This paper presents a new mesh segmentation method that integrates geometrical and topological features through a flexible Reeb graph representation. The algorithm consists of three phases: construction of the Reeb graph using the improved topological skeleton approach, topological simplification of the graph by cancelling critical points while preserving essential features, and generation of contiguous segments via an adaptive region-growing process.
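The first phase of any Reeb graph construction rests on classifying the critical points of a scalar function on the shape. The minimal sketch below (not the paper's improved topological skeleton algorithm; the function name, the path-graph example, and the height values are assumptions) classifies each vertex of a graph as a minimum, maximum, or regular point by comparing its function value to its neighbors' values.

```python
def classify_critical(values, adjacency):
    """Classify each vertex of a graph by a scalar function: 'min' if all
    neighbors have larger values, 'max' if all have smaller values, else
    'regular'. These critical points become the nodes of a Reeb graph."""
    kinds = []
    for v, nbrs in enumerate(adjacency):
        lower = sum(1 for u in nbrs if values[u] < values[v])
        upper = sum(1 for u in nbrs if values[u] > values[v])
        if lower == 0:
            kinds.append("min")
        elif upper == 0:
            kinds.append("max")
        else:
            kinds.append("regular")
    return kinds

# Height function on a 5-vertex path graph: one interior maximum at vertex 2.
values = [0.0, 1.0, 2.0, 1.0, 0.0]
adjacency = [[1], [0, 2], [1, 3], [2, 4], [3]]
print(classify_critical(values, adjacency))
# ['min', 'regular', 'max', 'regular', 'min']
```

On a real mesh the same comparison runs over the one-ring neighborhood of each vertex, and the simplification phase then cancels low-persistence min/max pairs.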
arXiv Detail & Related papers (2024-12-05T23:04:45Z) - Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z) - A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications [71.809127869349]
This paper formalizes the geometric graph as a data structure, on top of which we provide a unified view of existing models from the geometric message-passing perspective. We also summarize applications and related datasets to facilitate later research on methodology development and experimental evaluation.
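Geometric message passing differs from ordinary message passing in that node coordinates enter the update in a symmetry-respecting way. The toy sketch below (not any specific model from the survey; the Gaussian distance weighting, scalar features, and function name are assumptions) shows the simplest invariant flavor: messages depend on positions only through pairwise distances, so the output is unchanged by rotations and translations.

```python
import math

def invariant_message_passing(positions, features, edges):
    """One round of distance-based (E(3)-invariant) message passing:
    each node aggregates neighbor features weighted by a Gaussian of the
    pairwise distance. Using only distances guarantees invariance."""
    new = []
    for i in range(len(positions)):
        agg = features[i]
        for u, v in edges:
            if i in (u, v):
                j = v if u == i else u
                d2 = sum((a - b) ** 2 for a, b in zip(positions[i], positions[j]))
                agg += math.exp(-d2) * features[j]
        new.append(agg)
    return new

pos = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
feat = [1.0, 2.0, 3.0]
edges = [(0, 1), (0, 2)]
out = invariant_message_passing(pos, feat, edges)
# Translating every position by the same vector leaves the output unchanged.
shifted = [(x + 5, y - 2, z + 1) for x, y, z in pos]
assert invariant_message_passing(shifted, feat, edges) == out
print(out)
```

Equivariant models in the survey go further and also update vector-valued features, but the invariance check above is the common baseline property.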
arXiv Detail & Related papers (2024-03-01T12:13:04Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - Towards a mathematical understanding of learning from few examples with nonlinear feature maps [68.8204255655161]
We consider the problem of data classification where the training set consists of just a few data points.
We reveal key relationships between the geometry of an AI model's feature space, the structure of the underlying data distributions, and the model's generalisation capabilities.
arXiv Detail & Related papers (2022-11-07T14:52:58Z) - Machine Learning Statistical Gravity from Multi-Region Entanglement Entropy [0.0]
The Ryu-Takayanagi formula connects quantum entanglement and geometry.
We propose a microscopic model by superimposing entanglement features of an ensemble of random tensor networks of different bond dimensions.
We show that mutual information can be mediated effectively by geometric fluctuations.
arXiv Detail & Related papers (2021-10-03T22:46:41Z) - GeoMol: Torsional Geometric Generation of Molecular 3D Conformer Ensembles [60.12186997181117]
Predicting a molecule's 3D conformer ensemble from its molecular graph plays a key role in cheminformatics and drug discovery.
Existing generative models have several drawbacks, including a lack of modeling of important molecular geometry elements.
We propose GeoMol, an end-to-end, non-autoregressive and SE(3)-invariant machine learning approach to generate 3D conformers.
arXiv Detail & Related papers (2021-06-08T14:17:59Z) - Hermitian Symmetric Spaces for Graph Embeddings [0.0]
We learn continuous representations of graphs in spaces of symmetric matrices over C.
These spaces offer a rich geometry that simultaneously admits hyperbolic and Euclidean subspaces.
The proposed models are able to automatically adapt to very dissimilar arrangements without any a priori estimates of graph features.
arXiv Detail & Related papers (2021-05-11T18:14:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.