Structure-Preserving Transformers for Sequences of SPD Matrices
- URL: http://arxiv.org/abs/2309.07579v7
- Date: Tue, 28 May 2024 13:49:51 GMT
- Title: Structure-Preserving Transformers for Sequences of SPD Matrices
- Authors: Mathieu Seraphim, Alexis Lechervy, Florian Yger, Luc Brun, Olivier Etard
- Abstract summary: Transformer-based auto-attention mechanisms have been successfully applied to the analysis of a variety of context-reliant data types.
In this paper, we present such a mechanism, designed to classify sequences of Symmetric Positive Definite matrices.
We apply our method to automatic sleep staging on timeseries of EEG-derived covariance matrices from a standard dataset, obtaining high levels of stage-wise performance.
- Score: 6.404789669795639
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent years, Transformer-based auto-attention mechanisms have been successfully applied to the analysis of a variety of context-reliant data types, from texts to images and beyond, including data from non-Euclidean geometries. In this paper, we present such a mechanism, designed to classify sequences of Symmetric Positive Definite matrices while preserving their Riemannian geometry throughout the analysis. We apply our method to automatic sleep staging on timeseries of EEG-derived covariance matrices from a standard dataset, obtaining high levels of stage-wise performance.
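The abstract describes an attention mechanism that classifies sequences of SPD matrices while respecting their Riemannian geometry. As a rough illustration of that setting (not the authors' actual structure-preserving block), the sketch below maps each SPD matrix to the tangent space at the identity via the matrix logarithm and runs ordinary dot-product self-attention on the resulting vectors; the log-Euclidean mapping, the toy channel counts, and the single attention head are all assumptions made for brevity.

```python
# Minimal sketch, assuming a log-Euclidean tokenization of SPD matrices;
# an illustration of the general idea, not the paper's attention block.
import numpy as np
from scipy.linalg import logm

def spd_to_token(spd):
    """Map an SPD matrix to a Euclidean vector: upper triangle of its matrix log."""
    log_spd = logm(spd).real                         # tangent space at the identity
    rows, cols = np.triu_indices(spd.shape[0])
    return log_spd[rows, cols]

def self_attention(tokens):
    """Plain single-head dot-product self-attention over a (seq_len, dim) array."""
    dim = tokens.shape[1]
    scores = tokens @ tokens.T / np.sqrt(dim)
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ tokens

# Toy sequence of EEG-like covariance (SPD) matrices.
rng = np.random.default_rng(0)
sequence = []
for _ in range(5):
    x = rng.standard_normal((8, 200))                # 8 channels, 200 samples
    sequence.append(np.cov(x) + 1e-3 * np.eye(8))    # regularized covariance, SPD
tokens = np.stack([spd_to_token(c) for c in sequence])
context = self_attention(tokens)                     # (5, 36): one enriched token per epoch
print(context.shape)
```

In the paper itself the attention is designed so that intermediate representations remain SPD; in this sketch the geometry enters only through the tangent-space tokenization.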
Related papers
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations (see the covariance-pooling sketch after this list).
arXiv Detail & Related papers (2024-07-15T07:11:44Z)
- Weakly supervised covariance matrices alignment through Stiefel matrices estimation for MEG applications [64.20396555814513]
This paper introduces a novel domain adaptation technique for time series data, called Mixing model Stiefel Adaptation (MSA).
We exploit abundant unlabeled data in the target domain to ensure effective prediction by establishing pairwise correspondence with equivalent signal variances between domains.
MSA outperforms recent methods in brain-age regression with task variations using magnetoencephalography (MEG) signals from the Cam-CAN dataset.
arXiv Detail & Related papers (2024-01-24T19:04:49Z)
- Identifying Systems with Symmetries using Equivariant Autoregressive Reservoir Computers [0.0]
The investigation focuses on identifying systems with symmetries using equivariant autoregressive reservoir computers.
General results in structured matrix approximation theory are presented, exploring a two-fold approach.
arXiv Detail & Related papers (2023-11-16T02:32:26Z)
- Accelerated Discovery of Machine-Learned Symmetries: Deriving the Exceptional Lie Groups G2, F4 and E6 [55.41644538483948]
This letter introduces two improved algorithms that significantly speed up the discovery of symmetry transformations.
Given the significant complexity of the exceptional Lie groups, our results demonstrate that this machine-learning method for discovering symmetries is completely general and can be applied to a wide variety of labeled datasets.
arXiv Detail & Related papers (2023-07-10T20:25:44Z)
- Sliced-Wasserstein on Symmetric Positive Definite Matrices for M/EEG Signals [24.798859309715667]
We propose a new method to deal with distributions of covariance matrices.
We show that it is an efficient surrogate to the Wasserstein distance in domain adaptation for Brain-Computer Interface applications (see the sliced-Wasserstein sketch after this list).
arXiv Detail & Related papers (2023-03-10T09:08:46Z)
- Classification of BCI-EEG based on augmented covariance matrix [0.0]
We propose a new framework based on the augmented covariance extracted from an autoregressive model to improve motor imagery classification.
We test our approach on several datasets and several subjects using the MOABB framework (see the augmented-covariance sketch after this list).
arXiv Detail & Related papers (2023-02-09T09:04:25Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Adversarially-Trained Nonnegative Matrix Factorization [77.34726150561087]
We consider an adversarially-trained version of nonnegative matrix factorization.
In our formulation, an attacker adds an arbitrary matrix of bounded norm to the given data matrix.
We design efficient algorithms inspired by adversarial training to optimize for dictionary and coefficient matrices.
arXiv Detail & Related papers (2021-04-10T13:13:17Z)
- $k$-means on Positive Definite Matrices, and an Application to Clustering in Radar Image Sequences [0.0]
We state theoretical properties for $k$-means clustering of Symmetric Positive Definite (SPD) matrices in a non-Euclidean space.
We then provide a novel application of this method to time-series clustering of pixels in a sequence of Synthetic Aperture Radar images (see the SPD $k$-means sketch after this list).
arXiv Detail & Related papers (2020-08-08T06:21:43Z)
- Asymptotic Analysis of an Ensemble of Randomly Projected Linear Discriminants [94.46276668068327]
In [1], an ensemble of randomly projected linear discriminants is used to classify datasets.
We develop a consistent estimator of the misclassification probability as an alternative to the computationally costly cross-validation estimator.
We also demonstrate the use of our estimator for tuning the projection dimension on both real and synthetic data.
arXiv Detail & Related papers (2020-04-17T12:47:04Z)
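The following sketches illustrate some of the techniques named in the list above; each is a minimal, hedged example, not a reproduction of the cited paper's method. First, for the covariance-pooling entry: global covariance pooling with a matrix-logarithm normalization, one of the matrix-function normalizations that line of work analyzes. The feature-map shape and the epsilon regularizer are arbitrary choices for the example.

```python
# Minimal sketch of Global Covariance Pooling (GCP) with matrix-log
# normalization; shapes and the regularizer are illustrative assumptions.
import numpy as np

def global_covariance_pooling(features, eps=1e-4):
    """features: (channels, height*width) activations from one image.
    Returns the matrix-log-normalized covariance descriptor."""
    centered = features - features.mean(axis=1, keepdims=True)
    cov = centered @ centered.T / features.shape[1]        # (C, C) second-order stats
    cov += eps * np.eye(cov.shape[0])                      # keep it positive definite
    eigval, eigvec = np.linalg.eigh(cov)                   # symmetric eigendecomposition
    log_cov = eigvec @ np.diag(np.log(eigval)) @ eigvec.T  # matrix-function normalization
    return log_cov[np.triu_indices(cov.shape[0])]          # compact descriptor

rng = np.random.default_rng(1)
feature_map = rng.standard_normal((16, 7 * 7))             # 16 channels, 7x7 spatial grid
descriptor = global_covariance_pooling(feature_map)
print(descriptor.shape)                                    # (136,) = 16*17/2
```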
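For the sliced-Wasserstein entry: a generic sliced-Wasserstein distance between two sets of covariance matrices, computed here after a simple log-vectorization. The number of projections and the log-Euclidean flattening are assumptions; the cited paper defines the distance directly on the SPD manifold.

```python
# Minimal sketch: sliced-Wasserstein distance between two equally sized sets
# of SPD (covariance) matrices, flattened via the matrix logarithm. Only a
# Euclidean stand-in meant to show the slicing idea.
import numpy as np
from scipy.linalg import logm

def log_vectorize(spd_set):
    """Flatten each SPD matrix into the upper triangle of its matrix log."""
    idx = np.triu_indices(spd_set[0].shape[0])
    return np.stack([logm(s).real[idx] for s in spd_set])

def sliced_wasserstein(a, b, n_projections=100, seed=0):
    """Average 1-D Wasserstein-2 distance over random projection directions.
    Assumes a and b contain the same number of samples."""
    rng = np.random.default_rng(seed)
    dirs = rng.standard_normal((n_projections, a.shape[1]))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    total = 0.0
    for d in dirs:
        pa, pb = np.sort(a @ d), np.sort(b @ d)            # 1-D projections
        total += np.mean((pa - pb) ** 2)                   # closed-form 1-D W2^2
    return np.sqrt(total / n_projections)

rng = np.random.default_rng(2)
def random_spd_set(n, dim):
    mats = []
    for _ in range(n):
        x = rng.standard_normal((dim, dim))
        mats.append(x @ x.T + dim * np.eye(dim))           # guaranteed SPD
    return mats

source = log_vectorize(random_spd_set(50, 4))
target = log_vectorize(random_spd_set(50, 4))
print(sliced_wasserstein(source, target))
```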
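For the augmented-covariance entry: one common way to build an augmented covariance matrix is to stack time-lagged copies of the signal, in the spirit of an autoregressive (delay) embedding, and take the covariance of the stacked signal. The lag order and channel count below are arbitrary, and the construction may differ in detail from the cited paper.

```python
# Minimal sketch of an "augmented" covariance: covariance of a signal stacked
# with time-lagged copies of itself (delay embedding). Lag order is arbitrary.
import numpy as np

def augmented_covariance(signal, order=3):
    """signal: (channels, samples) EEG epoch.
    Returns the (channels*order, channels*order) covariance of the lag-embedded signal."""
    channels, samples = signal.shape
    usable = samples - order + 1
    lagged = np.vstack([signal[:, lag:lag + usable] for lag in range(order)])
    return np.cov(lagged)

rng = np.random.default_rng(3)
epoch = rng.standard_normal((8, 512))         # 8 channels, 512 samples
acm = augmented_covariance(epoch, order=3)
print(acm.shape)                              # (24, 24), symmetric positive (semi-)definite
```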
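For the $k$-means-on-SPD entry: $k$-means under the log-Euclidean metric, where matrices are mapped through the matrix logarithm, standard $k$-means runs in that vector space, and centroids are mapped back with the matrix exponential. The metric choice is one standard option and may not match the cited paper's non-Euclidean setting.

```python
# Minimal sketch: k-means on SPD matrices using the log-Euclidean metric
# (matrix log -> ordinary k-means -> matrix exp for the centroids).
# The metric choice and the toy data are assumptions for illustration.
import numpy as np
from scipy.linalg import expm, logm

def log_euclidean_kmeans(spd_matrices, k=2, n_iter=20, seed=0):
    # Work with full flattened logs so cluster means stay valid symmetric matrices.
    logs = np.stack([logm(s).real.ravel() for s in spd_matrices])
    rng = np.random.default_rng(seed)
    centers = logs[rng.choice(len(logs), size=k, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(logs[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = logs[labels == j].mean(axis=0)
    dim = spd_matrices[0].shape[0]
    spd_centers = [expm(c.reshape(dim, dim)) for c in centers]  # back to the SPD manifold
    return labels, spd_centers

rng = np.random.default_rng(4)
data = []
for shift in (1.0, 5.0):                       # two clusters of 3x3 SPD matrices
    for _ in range(10):
        a = rng.standard_normal((3, 3))
        data.append(a @ a.T + shift * np.eye(3))
labels, centers = log_euclidean_kmeans(data, k=2)
print(labels)
```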