Riemannian Multinomial Logistics Regression for SPD Neural Networks
- URL: http://arxiv.org/abs/2305.11288v2
- Date: Wed, 20 Mar 2024 15:10:09 GMT
- Title: Riemannian Multinomial Logistics Regression for SPD Neural Networks
- Authors: Ziheng Chen, Yue Song, Gaowen Liu, Ramana Rao Kompella, Xiaojun Wu, Nicu Sebe
- Abstract summary: We propose Riemannian Multinomial Logistics Regression (RMLR), intrinsic classification layers for deep neural networks on Symmetric Positive Definite (SPD) matrices.
Our framework offers a novel intrinsic explanation for the most popular LogEig classifier in existing SPD networks.
The effectiveness of our method is demonstrated in three applications: radar recognition, human action recognition, and electroencephalography (EEG) classification.
- Score: 60.11063972538648
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Deep neural networks for learning Symmetric Positive Definite (SPD) matrices are gaining increasing attention in machine learning. Despite the significant progress, most existing SPD networks use traditional Euclidean classifiers on an approximated space rather than intrinsic classifiers that accurately capture the geometry of SPD manifolds. Inspired by Hyperbolic Neural Networks (HNNs), we propose Riemannian Multinomial Logistics Regression (RMLR) for the classification layers in SPD networks. We introduce a unified framework for building Riemannian classifiers under the metrics pulled back from the Euclidean space, and showcase our framework under the parameterized Log-Euclidean Metric (LEM) and Log-Cholesky Metric (LCM). Moreover, our framework offers a novel intrinsic explanation for the most popular LogEig classifier in existing SPD networks. The effectiveness of our method is demonstrated in three applications: radar recognition, human action recognition, and electroencephalography (EEG) classification. The code is available at https://github.com/GitZH-Chen/SPDMLR.git.
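For intuition, under the Log-Euclidean Metric the matrix logarithm maps the SPD manifold isometrically onto the Euclidean space of symmetric matrices, so an HNN-style multinomial logistic regression can be carried out there, with the class-k logit given by the hyperplane margin <log X - log P_k, A_k>_F. The PyTorch sketch below is a minimal illustration of this reduction only; the class name, parameterization, and initialization are hypothetical rather than the authors' implementation (see the linked repository for the official code).

```python
import torch

def sym_logm(X):
    # Matrix logarithm of a batch of SPD matrices via eigendecomposition.
    w, V = torch.linalg.eigh(X)
    return V @ torch.diag_embed(torch.log(w)) @ V.transpose(-1, -2)

class SPDMLRLogEuclidean(torch.nn.Module):
    """Hypothetical sketch: multinomial logistic regression on SPD matrices
    under the Log-Euclidean Metric. log(.) maps SPD(n) isometrically onto
    the space of symmetric matrices, where the class-k logit is the signed
    hyperplane margin <log(X) - log(P_k), A_k>_F."""

    def __init__(self, n, num_classes):
        super().__init__()
        # P_k: SPD "bias" points, parameterized by their symmetric logarithms.
        self.log_P = torch.nn.Parameter(torch.zeros(num_classes, n, n))
        # A_k: tangent-space normal directions (symmetrized in forward()).
        self.A = torch.nn.Parameter(0.1 * torch.randn(num_classes, n, n))

    def forward(self, X):
        # X: (batch, n, n) SPD inputs; returns (batch, num_classes) logits.
        L = sym_logm(X)
        log_P = 0.5 * (self.log_P + self.log_P.transpose(-1, -2))
        A = 0.5 * (self.A + self.A.transpose(-1, -2))
        diff = L.unsqueeze(1) - log_P.unsqueeze(0)        # (batch, K, n, n)
        return (diff * A.unsqueeze(0)).sum(dim=(-2, -1))  # Frobenius products
```

Roughly speaking, fixing each P_k so that -<log P_k, A_k>_F plays the role of a scalar bias reduces this to a plain linear classifier on log X, i.e. the LogEig classifier that the abstract's intrinsic explanation refers to; the logits can be fed to torch.nn.functional.cross_entropy as usual.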
Related papers
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices, we propose Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
- Matrix Manifold Neural Networks++ [18.385670036798707]
We design fully-connected layers for SPD neural networks.
We propose a method for performing backpropagation with the Grassmann logarithmic map in the projector perspective.
arXiv Detail & Related papers (2024-05-29T15:47:35Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work paves the way towards practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- A Lie Group Approach to Riemannian Batch Normalization [59.48083303101632]
This paper establishes a unified framework for normalization techniques on Lie groups.
We focus on Symmetric Positive Definite (SPD) manifolds, which possess three distinct types of Lie group structures.
Specific normalization layers induced by these Lie groups are then proposed for SPD neural networks.
arXiv Detail & Related papers (2024-03-17T16:24:07Z)
- Riemannian Self-Attention Mechanism for SPD Networks [34.794770395408335]
An SPD manifold self-attention mechanism (SMSA) is proposed in this paper.
An SMSA-based geometric learning module (SMSA-GL) is designed to improve the discriminability of the structured representations.
arXiv Detail & Related papers (2023-11-28T12:34:46Z)
- Adaptive Log-Euclidean Metrics for SPD Matrix Learning [73.12655932115881]
We propose Adaptive Log-Euclidean Metrics (ALEMs), which extend the widely used Log-Euclidean Metric (LEM).
The experimental and theoretical results demonstrate the merit of the proposed metrics in improving the performance of SPD neural networks.
arXiv Detail & Related papers (2023-03-26T18:31:52Z)
- DreamNet: A Deep Riemannian Network based on SPD Manifold Learning for Visual Classification [36.848148506610364]
We propose a new architecture for SPD matrix learning.
To enrich the deep representations, we adopt SPDNet as the backbone and append a stacked Riemannian autoencoder (SRAE) to it.
We then insert several residual-like blocks with shortcut connections to augment the representational capacity of the SRAE.
arXiv Detail & Related papers (2022-06-16T07:15:20Z)
- Riemannian Local Mechanism for SPD Neural Networks [43.789561494266316]
We argue that preserving local geometric information is essential in SPD networks.
We first analyse the convolution operator commonly used for capturing local information in Euclidean deep networks.
Based on this analysis, we define the local information in the SPD manifold and design a multi-scale submanifold block for mining local geometry.
arXiv Detail & Related papers (2022-01-25T07:39:25Z)
- Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem for Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z)