Differential geometry with extreme eigenvalues in the positive
semidefinite cone
- URL: http://arxiv.org/abs/2304.07347v2
- Date: Thu, 8 Feb 2024 12:29:42 GMT
- Title: Differential geometry with extreme eigenvalues in the positive
semidefinite cone
- Authors: Cyrus Mostajeran, Nathaël Da Costa, Graham Van Goffrier, Rodolphe
Sepulchre
- Abstract summary: We present a route to a scalable geometric framework for the analysis and processing of SPD-valued data based on the efficient computation of extreme generalized eigenvalues.
We define a novel iterative mean of SPD matrices based on this geometry and prove its existence and uniqueness for a given finite collection of points.
- Score: 1.9116784879310025
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Differential geometric approaches to the analysis and processing of data in
the form of symmetric positive definite (SPD) matrices have had notable
successful applications to numerous fields including computer vision, medical
imaging, and machine learning. The dominant geometric paradigm for such
applications has consisted of a few Riemannian geometries associated with
spectral computations that are costly at high scale and in high dimensions. We
present a route to a scalable geometric framework for the analysis and
processing of SPD-valued data based on the efficient computation of extreme
generalized eigenvalues through the Hilbert and Thompson geometries of the
semidefinite cone. We explore a particular geodesic space structure based on
Thompson geometry in detail and establish several properties associated with
this structure. Furthermore, we define a novel iterative mean of SPD matrices
based on this geometry and prove its existence and uniqueness for a given
finite collection of points. Finally, we state and prove a number of desirable
properties that are satisfied by this mean.
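The key computational idea in the abstract is that the Thompson and Hilbert metrics on the positive semidefinite cone reduce to extreme generalized eigenvalue computations: for SPD matrices A and B, the Thompson distance is the largest absolute log-generalized-eigenvalue of the pencil (A, B), and the Hilbert (projective) distance is the spread between the largest and smallest log-generalized-eigenvalues. The sketch below illustrates these standard formulas only; it is not the paper's algorithm, and the function name `thompson_hilbert` is illustrative.

```python
import numpy as np
from scipy.linalg import eigh


def thompson_hilbert(A, B):
    """Thompson and Hilbert distances between SPD matrices A and B.

    Uses the generalized eigenvalues lam of A v = lam B v:
      d_T(A, B) = max_i |log lam_i|          (Thompson metric)
      d_H(A, B) = log lam_max - log lam_min  (Hilbert projective metric)
    """
    # Generalized symmetric-definite eigenproblem; only the extreme
    # eigenvalues are actually needed for the two distances.
    lam = eigh(A, B, eigvals_only=True)
    logs = np.log(lam)
    d_T = max(abs(logs.max()), abs(logs.min()))
    d_H = logs.max() - logs.min()
    return d_T, d_H
```

Note that d_H is invariant under separate positive rescalings of A and B, which is why the Hilbert metric is a projective metric on rays of the cone rather than a metric on SPD matrices themselves.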
Related papers
- Geometric statistics with subspace structure preservation for SPD matrices [1.749935196721634]
We present a framework for the processing of SPD-valued data that preserves subspace structures.
This is achieved through the use of the Thompson geometry of the semidefinite cone.
arXiv Detail & Related papers (2024-07-02T22:22:36Z)
- A Survey of Geometric Graph Neural Networks: Data Structures, Models and Applications [67.33002207179923]
This paper presents a survey of data structures, models, and applications related to geometric GNNs.
We provide a unified view of existing models from the geometric message passing perspective.
We also summarize the applications as well as the related datasets to facilitate later research for methodology development and experimental evaluation.
arXiv Detail & Related papers (2024-03-01T12:13:04Z)
- Adaptive Surface Normal Constraint for Geometric Estimation from Monocular Images [56.86175251327466]
We introduce a novel approach to learn geometries such as depth and surface normal from images while incorporating geometric context.
Our approach extracts geometric context that encodes the geometric variations present in the input image and correlates depth estimation with geometric constraints.
Our method unifies depth and surface normal estimations within a cohesive framework, which enables the generation of high-quality 3D geometry from images.
arXiv Detail & Related papers (2024-02-08T17:57:59Z)
- A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling of atomic systems represent them as geometric graphs with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Parametrizing Product Shape Manifolds by Composite Networks [5.772786223242281]
We show that it is possible to learn an efficient neural network approximation for shape spaces with a special product structure.
Our proposed architecture leverages this structure by separately learning approximations for the low-dimensional factors and a subsequent combination.
arXiv Detail & Related papers (2023-02-28T15:31:23Z)
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on maps implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
- Learning with symmetric positive definite matrices via generalized Bures-Wasserstein geometry [40.23168342389821]
We propose a novel generalization of the Bures-Wasserstein geometry, which we call the GBW geometry.
We provide a rigorous treatment to study various differential geometric notions on the proposed novel generalized geometry.
We also present experiments that illustrate the efficacy of the proposed GBW geometry over the BW geometry.
arXiv Detail & Related papers (2021-10-20T10:03:06Z)
- On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry [45.1944007785671]
We comparatively analyze the Bures-Wasserstein (BW) geometry with the popular Affine-Invariant (AI) geometry.
We build on an observation that the BW metric has a linear dependence on SPD matrices in contrast to the quadratic dependence of the AI metric.
We show that the BW geometry has a non-negative curvature, which further improves convergence rates of algorithms over the non-positively curved AI geometry.
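The Bures-Wasserstein distance compared above (against the Affine-Invariant geometry) has a well-known closed form on SPD matrices: d_BW(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}). A minimal sketch of that formula, assuming SciPy is available (the helper name `bures_wasserstein` is illustrative, not from either paper):

```python
import numpy as np
from scipy.linalg import sqrtm


def bures_wasserstein(A, B):
    """Bures-Wasserstein distance between SPD matrices A and B.

    d_BW(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2})
    """
    sA = sqrtm(A)
    cross = sqrtm(sA @ B @ sA)
    # sqrtm may return tiny imaginary parts for ill-conditioned input;
    # clamp at zero before the final square root for numerical safety.
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return float(np.sqrt(max(d2.real, 0.0)))
```

Unlike the Affine-Invariant distance, this requires no matrix logarithm, which is one source of the linear (rather than quadratic) dependence on the SPD matrices noted in the summary.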
arXiv Detail & Related papers (2021-06-01T07:39:19Z)
- Theoretical bounds on data requirements for the ray-based classification [0.0]
A new classification framework has been proposed in which the intersections of a set of one-dimensional representations, called rays, with the boundaries of the shape are used to identify the specific geometry.
Here, we establish a bound on the number of rays necessary for shape classification, defined by key angular metrics, for arbitrary convex shapes.
This result enables a different approach for estimating high-dimensional shapes using substantially fewer data elements than volumetric or surface-based approaches.
arXiv Detail & Related papers (2021-03-17T11:38:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.