Adaptive Riemannian Metrics on SPD Manifolds
- URL: http://arxiv.org/abs/2303.15477v3
- Date: Thu, 18 May 2023 20:09:34 GMT
- Title: Adaptive Riemannian Metrics on SPD Manifolds
- Authors: Ziheng Chen, Yue Song, Tianyang Xu, Zhiwu Huang, Xiao-Jun Wu, Nicu Sebe
- Abstract summary: Symmetric Positive Definite (SPD) matrices have received wide attention in machine learning due to their intrinsic capacity to encode the underlying structural correlation in data.
Existing fixed metric tensors might lead to sub-optimal performance for SPD matrix learning, especially for SPD neural networks.
- Score: 67.48576298756996
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Symmetric Positive Definite (SPD) matrices have received wide attention in
machine learning due to their intrinsic capacity to encode the underlying
structural correlation in data. To reflect the non-Euclidean geometry of SPD
manifolds, many successful Riemannian metrics have been proposed. However,
existing fixed metric tensors might lead to sub-optimal performance for SPD
matrix learning, especially for SPD neural networks. To remedy this
limitation, we leverage the idea of pullback and propose adaptive Riemannian
metrics for SPD manifolds. Moreover, we present comprehensive theories for our
metrics. Experiments on three datasets demonstrate that equipped with the
proposed metrics, SPD networks can exhibit superior performance.
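The pullback idea admits a compact illustration: choose a map from the SPD manifold into a flat space, make the map learnable, and measure distances after the map. A minimal sketch in Python, assuming a learnable congruence transform W composed with the matrix logarithm (a hypothetical choice for illustration, not the paper's exact construction):
```python
# Minimal sketch of a pullback metric on SPD matrices (an assumption,
# not the paper's exact construction): pull the Euclidean metric back
# through phi(X) = W @ logm(X) @ W.T, where W is a learnable invertible
# matrix acting as the adaptive part of the metric.
import numpy as np
from scipy.linalg import logm

def pullback_dist(A, B, W):
    """Distance induced by pulling back the Frobenius metric
    through phi(X) = W @ logm(X) @ W.T."""
    phi_A = W @ logm(A).real @ W.T
    phi_B = W @ logm(B).real @ W.T
    return np.linalg.norm(phi_A - phi_B)

# With W = I this reduces to the classical Log-Euclidean distance;
# making W trainable is what would make the metric adaptive.
```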
Related papers
- Learning to Normalize on the SPD Manifold under Bures-Wasserstein Geometry [11.846361701184254]
Covariance matrices have proven highly effective across many scientific fields.
The primary challenge in representation learning is to respect this underlying geometric structure.
We propose a novel Riemannian batch normalization (RBN) algorithm based on the Bures-Wasserstein metric, incorporating a learnable metric parameter.
arXiv Detail & Related papers (2025-04-01T11:12:58Z)
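For reference, the Bures-Wasserstein distance between SPD matrices A and B has the closed form d_BW(A, B)^2 = tr(A) + tr(B) - 2 tr[(A^{1/2} B A^{1/2})^{1/2}]; a minimal sketch (the learnable metric parameter of the proposed RBN is not modeled here):
```python
import numpy as np
from scipy.linalg import sqrtm

def bures_wasserstein_dist(A, B):
    """d_BW(A, B)^2 = tr(A) + tr(B) - 2 tr[(A^{1/2} B A^{1/2})^{1/2}]."""
    rA = sqrtm(A).real                  # principal square root of A
    cross = sqrtm(rA @ B @ rA).real     # (A^{1/2} B A^{1/2})^{1/2}
    d2 = np.trace(A) + np.trace(B) - 2.0 * np.trace(cross)
    return np.sqrt(max(d2, 0.0))        # clip tiny negative round-off
```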
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices, we propose a Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
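One concrete member of such a family is an SPD multinomial logistic regression under the plain Log-Euclidean metric, where class scores are Frobenius inner products in the log domain; a sketch with hypothetical learnable parameters P_k and A_k:
```python
import numpy as np
from scipy.linalg import logm

def spd_mlr_logits(X, Ps, As):
    """Log-Euclidean SPD MLR sketch: logit_k = <logm(X) - logm(P_k), A_k>.

    X  : SPD input matrix.
    Ps : list of SPD 'hyperplane points' P_k (learnable in practice).
    As : list of symmetric tangent matrices A_k (learnable in practice).
    """
    LX = logm(X).real
    return np.array([np.tensordot(LX - logm(P).real, A)  # Frobenius inner product
                     for P, A in zip(Ps, As)])

# A softmax over these logits then yields class probabilities.
```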
- The Role of Fibration Symmetries in Geometric Deep Learning [0.0]
Geometric Deep Learning (GDL) unifies a broad class of machine learning techniques from the perspective of symmetry.
We propose to relax GDL to allow for local symmetries, specifically fibration symmetries in graphs, to leverage the regularities of realistic instances.
We show how GNNs apply the inductive bias of fibration symmetries and derive a tighter upper bound on their expressive power.
arXiv Detail & Related papers (2024-08-28T16:04:40Z)
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
arXiv Detail & Related papers (2024-07-15T07:11:44Z)
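Matrix function normalization in GCP applies a spectral function, such as a matrix power or the matrix logarithm, to the pooled covariance; a minimal sketch:
```python
import numpy as np

def covariance_pool(features, power=0.5, eps=1e-5):
    """Global covariance pooling with matrix power normalization.

    features : (n, d) array of n feature vectors.
    Returns C^power of the (regularized) covariance C, computed through
    the eigendecomposition of the symmetric matrix C.
    """
    centered = features - features.mean(axis=0)
    C = centered.T @ centered / features.shape[0] + eps * np.eye(features.shape[1])
    w, V = np.linalg.eigh(C)            # spectral decomposition
    return (V * w**power) @ V.T         # apply f to the eigenvalues

# power=0.5 gives the common matrix square-root normalization;
# replacing w**power with np.log(w) gives the matrix logarithm instead.
```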
- Riemannian Self-Attention Mechanism for SPD Networks [34.794770395408335]
An SPD-manifold self-attention mechanism (SMSA) is proposed in this paper.
An SMSA-based geometric learning module (SMSA-GL) is designed to improve the discrimination of structured representations.
arXiv Detail & Related papers (2023-11-28T12:34:46Z)
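The paper's SMSA design is not reproduced here; one plausible geometric variant of attention, given purely as an assumption, replaces dot-product similarity with a softmax over negative Log-Euclidean distances between SPD queries and keys:
```python
import numpy as np
from scipy.linalg import logm

def spd_attention(queries, keys, values, tau=1.0):
    """Hypothetical SPD self-attention sketch (not the paper's SMSA):
    attention weights come from distances instead of dot products.

    queries, keys, values : lists of SPD matrices of equal size.
    Returns attended outputs as weighted sums of logm(values), which
    stay symmetric (mapping back with expm is omitted for brevity).
    """
    log_vals = [logm(V).real for V in values]
    out = []
    for Q in queries:
        LQ = logm(Q).real
        d = np.array([np.linalg.norm(LQ - logm(K).real) for K in keys])
        w = np.exp(-d / tau)            # similarity = softmax(-distance)
        w /= w.sum()
        out.append(sum(wi * LV for wi, LV in zip(w, log_vals)))
    return out
```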
- Riemannian Multinomial Logistics Regression for SPD Neural Networks [60.11063972538648]
We propose a new type of deep neural network for Symmetric Positive Definite (SPD) matrices.
Our framework offers a novel intrinsic explanation for the most popular LogEig classifier in existing SPD networks.
The effectiveness of our method is demonstrated in three applications: radar recognition, human action recognition, and electroencephalography (EEG) classification.
arXiv Detail & Related papers (2023-05-18T20:12:22Z)
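The LogEig classifier referenced above maps an SPD feature through the matrix logarithm and then applies an ordinary Euclidean linear classifier; a minimal sketch:
```python
import numpy as np

def logeig_features(X):
    """LogEig layer: matrix logarithm via eigendecomposition, flattened
    so a standard Euclidean linear classifier can be applied on top."""
    w, V = np.linalg.eigh(X)                    # X is SPD, so eigh applies
    log_X = (V * np.log(w)) @ V.T
    return log_X[np.triu_indices_from(log_X)]   # upper triangle as a vector

# logits = weight_matrix @ logeig_features(X) + bias, as in a plain
# softmax classifier; the paper reinterprets this construction
# intrinsically on the manifold.
```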
- The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal stochastic approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
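A Riemannian Robbins-Monro iteration replaces the Euclidean update x - gamma_n * v with a step along the manifold's exponential map; a sketch on the SPD manifold with the Affine-Invariant exponential (the step-size schedule and gradient oracle are illustrative placeholders):
```python
import numpy as np
from scipy.linalg import expm, sqrtm

def ai_exp(P, V):
    """Affine-Invariant exponential map on SPD:
    Exp_P(V) = P^{1/2} expm(P^{-1/2} V P^{-1/2}) P^{1/2}."""
    r = sqrtm(P).real
    r_inv = np.linalg.inv(r)
    return r @ expm(r_inv @ V @ r_inv) @ r

def riemannian_robbins_monro(x0, grad_oracle, n_steps=100):
    """x_{n+1} = Exp_{x_n}(-gamma_n * v_n) with gamma_n = 1/(n+1),
    where grad_oracle returns a (noisy) symmetric tangent vector."""
    x = x0
    for n in range(n_steps):
        v = grad_oracle(x)
        x = ai_exp(x, -v / (n + 1.0))   # Robbins-Monro step sizes
    return x
```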
- On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry [45.1944007785671]
We comparatively analyze the Bures-Wasserstein (BW) geometry against the popular Affine-Invariant (AI) geometry.
We build on the observation that the BW metric has a linear dependence on SPD matrices, in contrast to the quadratic dependence of the AI metric.
We show that the BW geometry has non-negative curvature, which further improves the convergence rates of algorithms over the non-positively curved AI geometry.
arXiv Detail & Related papers (2021-06-01T07:39:19Z)
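Both geometries admit short closed forms; for contrast with the Bures-Wasserstein distance sketched earlier, the Affine-Invariant distance is:
```python
import numpy as np
from scipy.linalg import logm, sqrtm

def affine_invariant_dist(A, B):
    """d_AI(A, B) = ||logm(A^{-1/2} B A^{-1/2})||_F."""
    r_inv = np.linalg.inv(sqrtm(A).real)        # A^{-1/2}
    return np.linalg.norm(logm(r_inv @ B @ r_inv).real)
```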
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model the nonlinear geometric structure inherent in data.
However, the associated geometric operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
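Plain Bayesian quadrature fits a Gaussian process to function evaluations and integrates the posterior mean in closed form; a one-dimensional Euclidean sketch with an RBF kernel against a standard normal law (the paper's manifold-specific machinery is omitted):
```python
import numpy as np

def bq_estimate(xs, ys, ell=1.0, noise=1e-9):
    """Bayesian quadrature of f against N(0, 1) from samples ys = f(xs).

    Uses an RBF kernel k(x, x') = exp(-(x - x')^2 / (2 ell^2)); the
    kernel mean z_i = E_{x~N(0,1)}[k(x, x_i)] has a closed form.
    Estimate: z^T K^{-1} y (the integral of the GP posterior mean).
    """
    xs = np.asarray(xs, float)
    ys = np.asarray(ys, float)
    K = np.exp(-0.5 * (xs[:, None] - xs[None, :])**2 / ell**2)
    K += noise * np.eye(len(xs))                # jitter for stability
    z = np.sqrt(ell**2 / (ell**2 + 1.0)) * np.exp(-0.5 * xs**2 / (ell**2 + 1.0))
    return z @ np.linalg.solve(K, ys)
```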
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.