Geometric statistics with subspace structure preservation for SPD matrices
- URL: http://arxiv.org/abs/2407.03382v1
- Date: Tue, 2 Jul 2024 22:22:36 GMT
- Title: Geometric statistics with subspace structure preservation for SPD matrices
- Authors: Cyrus Mostajeran, Nathaël Da Costa, Graham Van Goffrier, Rodolphe Sepulchre
- Abstract summary: We present a framework for the processing of SPD-valued data that preserves subspace structures.
This is achieved through the use of the Thompson geometry of the semidefinite cone.
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: We present a geometric framework for the processing of SPD-valued data that preserves subspace structures and is based on the efficient computation of extreme generalized eigenvalues. This is achieved through the use of the Thompson geometry of the semidefinite cone. We explore a particular geodesic space structure in detail and establish several properties associated with it. Finally, we review a novel inductive mean of SPD matrices based on this geometry.
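The Thompson metric on the SPD cone central to this abstract is computed from the extreme generalized eigenvalues of a matrix pencil, which is what makes the framework scalable. A minimal sketch of that computation (not the authors' implementation; `thompson_distance` is an illustrative name, and SPD inputs are assumed):

```python
import numpy as np

def thompson_distance(A, B):
    """Thompson metric d(A, B) = max(log l_max, -log l_min), where l_max and
    l_min are the extreme generalized eigenvalues of the pencil (B, A),
    i.e. the extreme eigenvalues of A^{-1} B (real and positive for SPD A, B)."""
    lam = np.linalg.eigvals(np.linalg.solve(A, B)).real
    return float(max(np.log(lam.max()), -np.log(lam.min())))

# Example: for A = I and B = diag(4, 1/4), the distance is log 4.
A = np.eye(2)
B = np.diag([4.0, 0.25])
print(thompson_distance(A, B))  # -> log 4 ≈ 1.3862943611198906
```

Note that only the two extreme eigenvalues are needed, so for large matrices an iterative extreme-eigenvalue solver can replace the full spectrum computed here.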
Related papers
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices we propose Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
- Product Geometries on Cholesky Manifolds with Applications to SPD Manifolds [65.04845593770727]
We present two new metrics on the Symmetric Positive Definite (SPD) manifold via the Cholesky manifold.
Our metrics are easy to use, computationally efficient, and numerically stable.
arXiv Detail & Related papers (2024-07-02T18:46:13Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Structure-Preserving Transformers for Sequences of SPD Matrices [6.404789669795639]
Transformer-based auto-attention mechanisms have been successfully applied to the analysis of a variety of context-reliant data types.
In this paper, we present such a mechanism, designed to classify sequences of Symmetric Positive Definite matrices.
We apply our method to automatic sleep staging on time series of EEG-derived covariance matrices from a standard dataset, obtaining high levels of stage-wise performance.
arXiv Detail & Related papers (2023-09-14T10:23:43Z)
- Differential geometry with extreme eigenvalues in the positive semidefinite cone [1.9116784879310025]
We present a route to a scalable geometric framework for the analysis and processing of SPD-valued data based on the efficient computation of extreme generalized eigenvalues.
We define a novel iterative mean of SPD matrices based on this geometry and prove its existence and uniqueness for a given finite collection of points.
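The iterative (inductive) mean mentioned above folds points in one at a time along geodesics. A minimal sketch of that scheme, using the standard affine-invariant geodesic as a stand-in for the paper's geodesics (function names are illustrative, not from the paper):

```python
import numpy as np

def spd_power(S, t):
    # Matrix power of an SPD matrix via its eigendecomposition.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def geodesic(A, B, t):
    # Affine-invariant geodesic A #_t B (a stand-in for the paper's geodesics):
    # A^{1/2} (A^{-1/2} B A^{-1/2})^t A^{1/2}.
    Ah, Aih = spd_power(A, 0.5), spd_power(A, -0.5)
    return Ah @ spd_power(Aih @ B @ Aih, t) @ Ah

def inductive_mean(mats):
    # M_1 = X_1; then M_k is the point at parameter t = 1/k on the
    # geodesic from M_{k-1} toward X_k.
    M = mats[0]
    for k, X in enumerate(mats[1:], start=2):
        M = geodesic(M, X, 1.0 / k)
    return M
```

For commuting (e.g. diagonal) matrices this reduces to the entrywise geometric mean, which is a quick sanity check: `inductive_mean([diag(1, 1), diag(4, 9)])` gives `diag(2, 3)`.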
arXiv Detail & Related papers (2023-04-14T18:37:49Z)
- Adaptive Log-Euclidean Metrics for SPD Matrix Learning [73.12655932115881]
We propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM).
The experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks.
arXiv Detail & Related papers (2023-03-26T18:31:52Z)
- Parametrizing Product Shape Manifolds by Composite Networks [5.772786223242281]
We show that it is possible to learn an efficient neural network approximation for shape spaces with a special product structure.
Our proposed architecture leverages this structure by separately learning approximations for the low-dimensional factors and a subsequent combination.
arXiv Detail & Related papers (2023-02-28T15:31:23Z)
- Riemannian Local Mechanism for SPD Neural Networks [43.789561494266316]
We argue that it is of utmost importance to ensure the preservation of local geometric information in SPD networks.
We first analyse the convolution operator commonly used for capturing local information in Euclidean deep networks.
Based on this analysis, we define the local information in the SPD manifold and design a multi-scale submanifold block for mining local geometry.
arXiv Detail & Related papers (2022-01-25T07:39:25Z)
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on the case of maps between manifolds implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Inductive Geometric Matrix Midranges [1.2891210250935146]
We propose a geometric method for unsupervised clustering of SPD data based on the Thompson metric.
We demonstrate the incorporation of the Thompson metric and inductive midrange into X-means and K-means++ clustering algorithms.
arXiv Detail & Related papers (2020-06-02T10:18:31Z)
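The last entry above plugs the Thompson metric into K-means-style clustering. A minimal sketch of the assignment step only (the paper's inductive-midrange center update is not reproduced here; `assign_clusters` is an illustrative name):

```python
import numpy as np

def thompson_distance(A, B):
    # d(A, B) = max(log l_max, -log l_min) over the generalized
    # eigenvalues of the pencil (B, A), assuming A and B are SPD.
    lam = np.linalg.eigvals(np.linalg.solve(A, B)).real
    return float(max(np.log(lam.max()), -np.log(lam.min())))

def assign_clusters(points, centers):
    # K-means-style assignment step: each SPD matrix goes to the
    # center nearest in the Thompson metric.
    return [min(range(len(centers)),
                key=lambda j: thompson_distance(X, centers[j]))
            for X in points]

points = [np.eye(2), 1.1 * np.eye(2), 9.0 * np.eye(2), np.diag([8.0, 10.0])]
centers = [np.eye(2), 9.0 * np.eye(2)]
print(assign_clusters(points, centers))  # -> [0, 0, 1, 1]
```

A full Thompson-metric K-means would alternate this step with recomputing each cluster's center, for which the cited paper's inductive midrange is designed.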
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.