Spatio-Temporal Analysis of Transformer based Architecture for Attention
Estimation from EEG
- URL: http://arxiv.org/abs/2204.07162v1
- Date: Mon, 4 Apr 2022 08:05:33 GMT
- Title: Spatio-Temporal Analysis of Transformer based Architecture for Attention
Estimation from EEG
- Authors: Victor Delvigne, Hazem Wannous, Jean-Philippe Vandeborre, Laurence
Ris, Thierry Dutoit
- Abstract summary: We present a novel framework allowing us to retrieve the attention state, i.e., the degree of attention given to a specific task, from EEG signals.
While previous methods often consider only the spatial relationship between EEG electrodes, we propose here to exploit both the spatial and the temporal information with a transformer-based network.
The proposed network has been trained and validated on two public datasets and achieves better results than state-of-the-art models.
- Score: 2.7076510056452654
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: For many years, understanding brain mechanisms has been a major
research subject across many different fields. Brain signal processing, and
especially the electroencephalogram (EEG), has recently attracted growing
interest in both academia and industry. One of the main examples is the
increasing number of Brain-Computer Interfaces (BCI) aiming to link brains and
computers. In this paper, we present a novel framework allowing us to retrieve
the attention state, i.e., the degree of attention given to a specific task,
from EEG signals. While previous methods often consider the spatial
relationship in EEG through electrodes and process it with recurrent or
convolutional architectures, we propose here to also exploit the spatial and
temporal information with a transformer-based network, an architecture that
has already shown its strength in many machine-learning (ML) studies, e.g.
machine translation. In addition to this novel architecture, an extensive
study of feature extraction methods, frequency bands and temporal window
lengths has also been carried out. The proposed network has been trained and
validated on two public datasets and achieves better results than
state-of-the-art models. Beyond the improved results, the framework could be
used in real applications, e.g. monitoring Attention Deficit Hyperactivity
Disorder (ADHD) symptoms or vigilance during a driving assessment.
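As a rough illustration of the kind of pipeline the abstract describes (band-power features per electrode and time window fed to a transformer encoder that mixes spatial and temporal information), here is a minimal, hypothetical PyTorch sketch. The class name `EEGAttentionTransformer`, the layer sizes, and the feature layout are illustrative assumptions and do not reproduce the authors' architecture.

```python
# Hypothetical sketch (not the authors' code): band-power tokens + transformer encoder.
import torch
import torch.nn as nn

class EEGAttentionTransformer(nn.Module):
    """Toy spatio-temporal transformer for attention-state estimation from EEG.

    Each (electrode, time-window) pair becomes a token of band-power features;
    a transformer encoder then mixes information across space and time.
    """
    def __init__(self, n_channels=32, n_windows=10, n_bands=5,
                 d_model=64, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        self.embed = nn.Linear(n_bands, d_model)               # band powers -> token
        self.pos = nn.Parameter(torch.zeros(n_channels * n_windows, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)              # attention-state logits

    def forward(self, x):
        # x: (batch, n_channels, n_windows, n_bands) band-power features
        b, c, w, f = x.shape
        tokens = self.embed(x.reshape(b, c * w, f)) + self.pos  # (b, c*w, d_model)
        encoded = self.encoder(tokens)                          # spatio-temporal mixing
        return self.head(encoded.mean(dim=1))                   # pooled prediction

# Example: a batch of 8 trials, 32 electrodes, 10 windows, 5 frequency bands.
model = EEGAttentionTransformer()
logits = model(torch.randn(8, 32, 10, 5))
print(logits.shape)  # torch.Size([8, 2])
```

In practice, the band-power features could be computed, for example, with a Welch periodogram per electrode and window; the choice of frequency bands and window length would follow the kind of study reported in the abstract.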
Related papers
- vEEGNet: learning latent representations to reconstruct EEG raw data via variational autoencoders [3.031375888004876]
We propose vEEGNet, a DL architecture with two modules, i.e., an unsupervised module based on variational autoencoders to extract a latent representation of the data, and a supervised module based on a feed-forward neural network to classify different movements.
We show state-of-the-art classification performance, and the ability to reconstruct both low-frequency and middle-range components of the raw EEG.
arXiv Detail & Related papers (2023-11-16T19:24:40Z)
- A Comprehensive Survey on Applications of Transformers for Deep Learning Tasks [60.38369406877899]
The Transformer is a deep neural network that employs a self-attention mechanism to comprehend the contextual relationships within sequential data.
Transformer models excel in handling long-range dependencies between input sequence elements and enable parallel processing.
Our survey encompasses the identification of the top five application domains for transformer-based models.
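For readers unfamiliar with the mechanism this survey covers, below is a minimal, illustrative scaled dot-product self-attention sketch in NumPy (single head, no learned query/key/value projections, which real transformer implementations add). It shows why every output position can depend on every input position, however far apart, and why the computation reduces to parallelizable matrix products.

```python
# Illustrative single-head self-attention without learned projections.
import numpy as np

def self_attention(x):
    """x: (seq_len, d) token embeddings; returns attention-mixed tokens."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                    # all pairwise similarities at once
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the whole sequence
    return weights @ x                               # each output attends to every input

tokens = np.random.randn(6, 8)                       # 6 tokens, 8-dim embeddings
print(self_attention(tokens).shape)                  # (6, 8)
```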
arXiv Detail & Related papers (2023-06-11T23:13:51Z)
- A Convolutional Spiking Network for Gesture Recognition in Brain-Computer Interfaces [0.8122270502556371]
We propose a simple yet efficient machine learning-based approach for the exemplary problem of hand gesture classification based on brain signals.
We demonstrate that this approach generalizes to different subjects with both EEG and ECoG data and achieves superior accuracy in the range of 92.74-97.07%.
arXiv Detail & Related papers (2023-04-21T16:23:40Z)
- fMRI from EEG is only Deep Learning away: the use of interpretable DL to unravel EEG-fMRI relationships [68.8204255655161]
We present an interpretable domain grounded solution to recover the activity of several subcortical regions from multichannel EEG data.
We recover individual spatial and time-frequency patterns of scalp EEG predictive of the hemodynamic signal in the subcortical nuclei.
arXiv Detail & Related papers (2022-10-23T15:11:37Z)
- An intertwined neural network model for EEG classification in brain-computer interfaces [0.6696153817334769]
The brain-computer interface (BCI) is a non-stimulatory, direct, and occasionally bidirectional communication link between the brain and a computer or an external device.
We present a deep neural network architecture specifically engineered to provide state-of-the-art performance in multiclass motor imagery classification.
arXiv Detail & Related papers (2022-08-04T09:00:34Z)
- A Deep Learning Approach for the Segmentation of Electroencephalography Data in Eye Tracking Applications [56.458448869572294]
We introduce DETRtime, a novel framework for time-series segmentation of EEG data.
Our end-to-end deep learning-based framework brings advances in Computer Vision to the forefront.
Our model generalizes well in the task of EEG sleep stage segmentation.
arXiv Detail & Related papers (2022-06-17T10:17:24Z)
- EEG-ITNet: An Explainable Inception Temporal Convolutional Network for Motor Imagery Classification [0.5616884466478884]
We propose an end-to-end deep learning architecture called EEG-ITNet.
Our model can extract rich spectral, spatial, and temporal information from multi-channel EEG signals.
EEG-ITNet shows up to a 5.9% improvement in classification accuracy across different scenarios.
arXiv Detail & Related papers (2022-04-14T13:18:43Z)
- Analyzing EEG Data with Machine and Deep Learning: A Benchmark [23.893444154059324]
This paper focuses on EEG signal analysis and presents, for the first time in the literature, a benchmark of machine and deep learning methods for EEG signal classification.
For our experiments we used the four most widespread models, i.e., multilayer perceptron, convolutional neural network, long short-term memory, and gated recurrent unit.
arXiv Detail & Related papers (2022-03-18T15:18:55Z)
- Uncovering the structure of clinical EEG signals with self-supervised learning [64.4754948595556]
Supervised learning paradigms are often limited by the amount of labeled data that is available.
This phenomenon is particularly problematic in clinically relevant data, such as electroencephalography (EEG).
By extracting information from unlabeled data, it might be possible to reach competitive performance with deep neural networks.
arXiv Detail & Related papers (2020-07-31T14:34:47Z)
- EEG-based Brain-Computer Interfaces (BCIs): A Survey of Recent Studies on Signal Sensing Technologies and Computational Intelligence Approaches and their Applications [65.32004302942218]
Brain-Computer Interface (BCI) is a powerful communication tool between users and systems.
Recent technological advances have increased interest in electroencephalographic (EEG) based BCI for translational and healthcare applications.
arXiv Detail & Related papers (2020-01-28T10:36:26Z)
- Opportunities and Challenges of Deep Learning Methods for Electrocardiogram Data: A Systematic Review [62.490310870300746]
The electrocardiogram (ECG) is one of the most commonly used diagnostic tools in medicine and healthcare.
Deep learning methods have achieved promising results on predictive healthcare tasks using ECG signals.
This paper presents a systematic review of deep learning methods for ECG data from both modeling and application perspectives.
arXiv Detail & Related papers (2019-12-28T02:44:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.