Adaptive Log-Euclidean Metrics for SPD Matrix Learning
- URL: http://arxiv.org/abs/2303.15477v5
- Date: Thu, 29 Aug 2024 17:20:14 GMT
- Title: Adaptive Log-Euclidean Metrics for SPD Matrix Learning
- Authors: Ziheng Chen, Yue Song, Tianyang Xu, Zhiwu Huang, Xiao-Jun Wu, Nicu Sebe
- Abstract summary: We propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM).
The experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks.
- Score: 73.12655932115881
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Symmetric Positive Definite (SPD) matrices have received wide attention in machine learning due to their intrinsic capacity to encode underlying structural correlation in data. Many successful Riemannian metrics have been proposed to reflect the non-Euclidean geometry of SPD manifolds. However, most existing metric tensors are fixed, which might lead to sub-optimal performance for SPD matrix learning, especially for deep SPD neural networks. To remedy this limitation, we leverage the commonly encountered pullback techniques and propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM). Compared with the previous Riemannian metrics, our metrics contain learnable parameters, which can better adapt to the complex dynamics of Riemannian neural networks with minor extra computations. We also present a complete theoretical analysis to support our ALEMs, including algebraic and Riemannian properties. The experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks. The efficacy of our metrics is further showcased on a set of recently developed Riemannian building blocks, including Riemannian batch normalization, Riemannian Residual blocks, and Riemannian classifiers.
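As a concrete reference point, below is a minimal numpy sketch of the fixed LEM distance, together with a hypothetical adaptive variant in which a learnable weight array rescales the log domain; the function names and the entrywise weighting scheme are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def spd_logm(X):
    # Matrix logarithm of an SPD matrix via eigendecomposition.
    lam, V = np.linalg.eigh(X)
    return (V * np.log(lam)) @ V.T

def lem_distance(A, B):
    # Fixed LEM: Frobenius distance after pulling SPD matrices back
    # to the flat log domain.
    return np.linalg.norm(spd_logm(A) - spd_logm(B))

def adaptive_lem_distance(A, B, w):
    # Hypothetical adaptive variant: `w` is a learnable positive weight
    # array rescaling the log-domain difference entrywise (illustrative
    # only; the paper's ALEM parameterization may differ).
    D = spd_logm(A) - spd_logm(B)
    return np.sqrt(np.sum(w * D * D))
```

A quick way to try it: build SPD inputs as `M @ M.T + 1e-3 * np.eye(n)` for random `M`, and set `w = np.ones((n, n))` to recover the fixed LEM distance exactly.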
Related papers
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework requires only minimal geometric properties and thus exhibits broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices we propose Lie MLR based on the popular bi-invariant metric.
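To illustrate the power-deformation idea, here is a small numpy sketch; `spd_power` implements the standard deformation phi_theta(X) = X^theta / theta, while `power_mlr_logits` is a hypothetical flat stand-in for an MLR head, not the paper's intrinsic RMLR construction.

```python
import numpy as np

def spd_power(X, theta):
    # Power deformation phi_theta(X) = X**theta / theta; as theta -> 0
    # this tends to the matrix logarithm, recovering LEM-style geometry.
    lam, V = np.linalg.eigh(X)
    return (V * lam**theta) @ V.T / theta

def power_mlr_logits(X, A_list, b, theta=0.5):
    # Hypothetical MLR head: plain Euclidean inner products <A_k, phi(X)>
    # in the deformed space, plus biases. A flat stand-in for a classifier,
    # not the paper's intrinsic SPD MLR.
    phi = spd_power(X, theta)
    return np.array([np.sum(Ak * phi) for Ak in A_list]) + b
```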
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
- The Role of Fibration Symmetries in Geometric Deep Learning [0.0]
Geometric Deep Learning (GDL) unifies a broad class of machine learning techniques from the perspective of symmetry.
We propose to relax GDL to allow for local symmetries, specifically fibration symmetries in graphs, to leverage regularities of realistic instances.
By applying the inductive bias of fibration symmetries to GNNs, we derive a tighter upper bound on their expressive power.
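As a rough illustration of collapsing fibration-symmetric nodes, the sketch below runs in-neighborhood color refinement; nodes sharing a final color are grouped into one fiber. This is an assumed simplification, not the paper's algorithm.

```python
from collections import Counter

def fibers(nodes, in_nbrs):
    # Minimal in-neighborhood color refinement: nodes that end with the
    # same color receive identical information flows and can be collapsed
    # into one fiber (a sketch of the idea only).
    color = {v: 0 for v in nodes}
    for _ in range(len(nodes)):
        sig = {v: (color[v],
                   tuple(sorted(Counter(color[u] for u in in_nbrs.get(v, ())).items())))
               for v in nodes}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        new = {v: palette[sig[v]] for v in nodes}
        if new == color:
            break
        color = new
    return color

# e.g. fibers([1, 2, 3], {2: [1], 3: [1]}) assigns nodes 2 and 3 the
# same color, so they fall into the same fiber.
```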
arXiv Detail & Related papers (2024-08-28T16:04:40Z)
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
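For orientation, a minimal numpy sketch of GCP with matrix square-root normalization is given below; it uses an eigendecomposition for clarity, whereas practical layers typically use iterative solvers such as Newton-Schulz.

```python
import numpy as np

def gcp_sqrt(F, eps=1e-5):
    # Global covariance pooling with matrix square-root normalization.
    # F is (n_positions, channels); returns the normalized covariance,
    # an SPD matrix summarizing second-order statistics of the features.
    Fc = F - F.mean(axis=0, keepdims=True)
    C = Fc.T @ Fc / (len(F) - 1) + eps * np.eye(F.shape[1])
    lam, V = np.linalg.eigh(C)
    return (V * np.sqrt(lam)) @ V.T
```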
arXiv Detail & Related papers (2024-07-15T07:11:44Z)
- Riemannian Self-Attention Mechanism for SPD Networks [34.794770395408335]
This paper proposes an SPD manifold self-attention mechanism (SMSA).
An SMSA-based geometric learning module (SMSA-GL) is designed to improve the discrimination of structured representations.
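The sketch below is a purely hypothetical reading of SPD self-attention: scores from negative pairwise Log-Euclidean distances and aggregation as a weighted log-Euclidean mean. SMSA's actual construction may differ.

```python
import numpy as np

def sym_logm(X):
    lam, V = np.linalg.eigh(X)
    return (V * np.log(lam)) @ V.T

def sym_expm(S):
    lam, V = np.linalg.eigh(S)
    return (V * np.exp(lam)) @ V.T

def spd_attention(spds, tau=1.0):
    # Hypothetical SPD self-attention: attention weights from negative
    # pairwise Log-Euclidean distances, aggregation as a weighted
    # log-Euclidean mean, mapped back to the SPD manifold.
    logs = [sym_logm(X) for X in spds]
    out = []
    for Li in logs:
        d2 = np.array([np.sum((Li - Lj) ** 2) for Lj in logs])
        a = np.exp(-d2 / tau)
        a /= a.sum()
        out.append(sym_expm(sum(ai * Lj for ai, Lj in zip(a, logs))))
    return out
```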
arXiv Detail & Related papers (2023-11-28T12:34:46Z)
- Riemannian Multinomial Logistics Regression for SPD Neural Networks [60.11063972538648]
We propose a new type of deep neural network for Symmetric Positive Definite (SPD) matrices.
Our framework offers a novel intrinsic explanation for the most popular LogEig classifier in existing SPD networks.
The effectiveness of our method is demonstrated in three applications: radar recognition, human action recognition, and electroencephalography (EEG) classification.
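For context, the LogEig classifier maps an SPD matrix to the flat log domain and feeds a vectorization to a linear softmax layer; a minimal numpy sketch of that feature map follows (the sqrt(2) off-diagonal scaling, which preserves the Frobenius inner product, is standard; the function name is ours).

```python
import numpy as np

def logeig_features(X):
    # LogEig feature map: matrix log of an SPD input, then vectorize the
    # diagonal plus the upper triangle (off-diagonals scaled by sqrt(2)
    # so Euclidean inner products of vectors match Frobenius inner
    # products of the log matrices). A linear softmax layer on top gives
    # the classifier this paper reinterprets intrinsically.
    lam, V = np.linalg.eigh(X)
    L = (V * np.log(lam)) @ V.T
    iu = np.triu_indices_from(L, k=1)
    return np.concatenate([np.diag(L), np.sqrt(2.0) * L[iu]])
```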
arXiv Detail & Related papers (2023-05-18T20:12:22Z)
- The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
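A minimal sketch of one such scheme on the SPD manifold is shown below, assuming the affine-invariant exponential map and a classical Robbins-Monro step-size schedule; it illustrates the template rather than the paper's general algorithms.

```python
import numpy as np

def sym_funcm(X, f):
    # Apply a scalar function to a symmetric matrix via its spectrum.
    lam, V = np.linalg.eigh(X)
    return (V * f(lam)) @ V.T

def rsgd_step(X, G, alpha):
    # One Riemannian stochastic-approximation step on the SPD manifold,
    # using the affine-invariant exponential map
    #   Exp_X(S) = X^{1/2} expm(X^{-1/2} S X^{-1/2}) X^{1/2},
    # where G is a symmetric stochastic Riemannian gradient estimate.
    # A Robbins-Monro schedule such as alpha_k = a / (k + 1) satisfies
    # the classical step-size conditions.
    Xh = sym_funcm(X, np.sqrt)
    Xmh = sym_funcm(X, lambda lam: 1.0 / np.sqrt(lam))
    return Xh @ sym_funcm(Xmh @ (-alpha * G) @ Xmh, np.exp) @ Xh
```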
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
- On Riemannian Optimization over Positive Definite Matrices with the Bures-Wasserstein Geometry [45.1944007785671]
We comparatively analyze the Bures-Wasserstein (BW) geometry with the popular Affine-Invariant (AI) geometry.
We build on an observation that the BW metric has a linear dependence on SPD matrices in contrast to the quadratic dependence of the AI metric.
We show that the BW geometry has a non-negative curvature, which further improves convergence rates of algorithms over the non-positively curved AI geometry.
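For reference, the BW distance has the closed form sketched below; note that A and B enter the traces linearly, in contrast to the inverses and logarithms the AI metric requires.

```python
import numpy as np

def sym_sqrtm(X):
    lam, V = np.linalg.eigh(X)
    return (V * np.sqrt(np.maximum(lam, 0.0))) @ V.T

def bw_distance(A, B):
    # Bures-Wasserstein distance between SPD matrices:
    #   d(A, B)^2 = tr(A) + tr(B) - 2 tr((A^{1/2} B A^{1/2})^{1/2}),
    # which equals the 2-Wasserstein distance between the centered
    # Gaussians N(0, A) and N(0, B).
    Ah = sym_sqrtm(A)
    M = sym_sqrtm(Ah @ B @ Ah)
    return np.sqrt(max(np.trace(A) + np.trace(B) - 2.0 * np.trace(M), 0.0))
```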
arXiv Detail & Related papers (2021-06-01T07:39:19Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model the nonlinear geometric structure inherent in data.
However, the geometric operations involved are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
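As a toy illustration of vanilla BQ, the sketch below estimates an expectation under a 1-D normal law with an RBF kernel, whose kernel mean against a Gaussian is available in closed form; the paper's setting replaces this Euclidean toy with integrals over data manifolds.

```python
import numpy as np

def bq_gaussian(f, xs, mu=0.0, s=1.0, ell=1.0, jitter=1e-8):
    # Bayesian quadrature for E_{N(mu, s^2)}[f] with an RBF kernel of
    # lengthscale ell: the posterior-mean estimate is z @ K^{-1} f(xs),
    # where z_i = int k(x, x_i) N(x; mu, s^2) dx has the closed form
    #   z_i = ell / sqrt(ell^2 + s^2) * exp(-(x_i - mu)^2 / (2 (ell^2 + s^2))).
    xs = np.asarray(xs, dtype=float)
    K = np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / ell**2)
    z = (ell / np.sqrt(ell**2 + s**2)
         * np.exp(-0.5 * (xs - mu) ** 2 / (ell**2 + s**2)))
    w = np.linalg.solve(K + jitter * np.eye(len(xs)), z)
    return w @ f(xs)

# e.g. bq_gaussian(np.cos, np.linspace(-3.0, 3.0, 15)) is close to the
# exact value E[cos(X)] = exp(-0.5) ≈ 0.6065 for X ~ N(0, 1).
```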
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.