4D Attention-based Neural Network for EEG Emotion Recognition
- URL: http://arxiv.org/abs/2101.05484v1
- Date: Thu, 14 Jan 2021 07:41:48 GMT
- Title: 4D Attention-based Neural Network for EEG Emotion Recognition
- Authors: Guowen Xiao, Mengwen Ye, Bowen Xu, Zhendi Chen, Quansheng Ren
- Abstract summary: We present a novel method, called four-dimensional attention-based neural network (4D-aNN) for EEG emotion recognition.
The proposed 4D-aNN adopts spectral and spatial attention mechanisms to adaptively assign the weights of different brain regions and frequency bands.
Our model achieves state-of-the-art performance on the SEED dataset under intra-subject splitting.
- Score: 1.5749416770494706
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Electroencephalograph (EEG) emotion recognition is a significant task in the
brain-computer interface field. Although many deep learning methods have been
proposed recently, it is still challenging to make full use of the information
contained in different domains of EEG signals. In this paper, we present a
novel method, called four-dimensional attention-based neural network (4D-aNN)
for EEG emotion recognition. First, raw EEG signals are transformed into 4D
spatial-spectral-temporal representations. Then, the proposed 4D-aNN adopts
spectral and spatial attention mechanisms to adaptively assign the weights of
different brain regions and frequency bands, and a convolutional neural network
(CNN) is utilized to deal with the spectral and spatial information of the 4D
representations. Moreover, a temporal attention mechanism is integrated into a
bidirectional Long Short-Term Memory (LSTM) to explore temporal dependencies of
the 4D representations. Our model achieves state-of-the-art performance on the
SEED dataset under intra-subject splitting. The experimental results
demonstrate the effectiveness of the attention mechanisms in different domains
for EEG emotion recognition.
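The pipeline described above (a 4D spatial-spectral-temporal representation, re-weighted by band-level and region-level attention before entering the CNN and BiLSTM stack) can be sketched roughly as follows. This is a minimal illustration with assumed shapes (a 9x9 electrode grid, five frequency bands, six temporal segments) and simple mean-pooled softmax attention scores; it is not the authors' implementation.

```python
import numpy as np

# Assumed dimensions for the sketch (not taken from the paper's code):
H, W = 9, 9   # electrode positions mapped onto a 2D scalp grid
BANDS = 5     # e.g. delta, theta, alpha, beta, gamma
T = 6         # temporal segments

rng = np.random.default_rng(0)
# 4D spatial-spectral-temporal representation: time x band x height x width
x = rng.standard_normal((T, BANDS, H, W))

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

# Spectral attention: one adaptive weight per frequency band,
# scored here by mean-pooling over time and space.
band_scores = x.mean(axis=(0, 2, 3))                      # (BANDS,)
band_w = softmax(band_scores)

# Spatial attention: one adaptive weight per grid location,
# scored by mean-pooling over time and bands.
spatial_scores = x.mean(axis=(0, 1))                      # (H, W)
spatial_w = softmax(spatial_scores.ravel()).reshape(H, W)

# Re-weight the 4D tensor; in the full model this would feed a CNN
# over the spectral/spatial axes and a temporal-attention BiLSTM.
x_att = x * band_w[None, :, None, None] * spatial_w[None, None, :, :]

print(x_att.shape)  # (6, 5, 9, 9)
```

Both attention maps are convex weightings (they sum to 1), so the re-weighted tensor keeps the same shape and simply emphasizes informative bands and regions.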
Related papers
- Multi-modal Mood Reader: Pre-trained Model Empowers Cross-Subject Emotion Recognition [23.505616142198487]
We develop a Pre-trained model based Multimodal Mood Reader for cross-subject emotion recognition.
The model learns universal latent representations of EEG signals through pre-training on large scale dataset.
Extensive experiments on public datasets demonstrate Mood Reader's superior performance in cross-subject emotion recognition tasks.
arXiv Detail & Related papers (2024-05-28T14:31:11Z)
- Dynamic GNNs for Precise Seizure Detection and Classification from EEG Data
This paper introduces NeuroGNN, a dynamic Graph Neural Network (GNN) framework that captures the interplay between the EEG locations and the semantics of their corresponding brain regions.
Our experiments with real-world data demonstrate that NeuroGNN significantly outperforms existing state-of-the-art models.
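As an illustration of the kind of electrode-graph computation such a GNN performs, here is a single message-passing step over an electrode adjacency graph. The adjacency matrix, feature sizes, and update rule are assumptions made for this sketch, not NeuroGNN's actual design.

```python
import numpy as np

rng = np.random.default_rng(3)
N, F = 4, 3                                  # electrodes, features per electrode
feats = rng.standard_normal((N, F))

# Assumed adjacency between EEG electrode locations (symmetric, no self-loops).
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)

deg = adj.sum(axis=1, keepdims=True)         # node degrees, for mean aggregation
msg = (adj @ feats) / deg                    # average neighbor features
updated = np.tanh(feats + msg)               # simple residual update

print(updated.shape)  # (4, 3)
```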
arXiv Detail & Related papers (2024-05-08T21:36:49Z)
- A Knowledge-Driven Cross-view Contrastive Learning for EEG Representation [48.85731427874065]
This paper proposes a knowledge-driven cross-view contrastive learning framework (KDC2) to extract effective representations from EEG with limited labels.
The KDC2 method creates scalp and neural views of EEG signals, simulating the internal and external representation of brain activity.
By modeling prior neural knowledge based on neural information consistency theory, the proposed method extracts invariant and complementary neural knowledge to generate combined representations.
arXiv Detail & Related papers (2023-09-21T08:53:51Z)
- Improving EEG-based Emotion Recognition by Fusing Time-frequency And Spatial Representations [29.962519978925236]
We propose a classification network based on the cross-domain feature fusion method.
We also propose a two-step fusion method and apply these methods to the EEG emotion recognition network.
Experimental results show that our proposed network, which combines multiple representations in the time-frequency domain and spatial domain, outperforms previous methods on public datasets.
arXiv Detail & Related papers (2023-03-14T07:26:51Z)
- fMRI from EEG is only Deep Learning away: the use of interpretable DL to unravel EEG-fMRI relationships [68.8204255655161]
We present an interpretable domain grounded solution to recover the activity of several subcortical regions from multichannel EEG data.
We recover individual spatial and time-frequency patterns of scalp EEG predictive of the hemodynamic signal in the subcortical nuclei.
arXiv Detail & Related papers (2022-10-23T15:11:37Z)
- Positional-Spectral-Temporal Attention in 3D Convolutional Neural Networks for EEG Emotion Recognition [5.995842375590237]
We propose a novel structure to explore the informative EEG features for emotion recognition.
The PST-Attention module consists of Positional, Spectral and Temporal Attention modules.
Our method is adaptive as well as efficient which can be fit into 3D Convolutional Neural Networks (3D-CNN) as a plug-in module.
arXiv Detail & Related papers (2021-10-13T12:03:36Z)
- SFE-Net: EEG-based Emotion Recognition with Symmetrical Spatial Feature Extraction [1.8047694351309205]
We present a spatial folding ensemble network (SFENet) for EEG feature extraction and emotion recognition.
Motivated by the spatial symmetry mechanism of human brain, we fold the input EEG channel data with five different symmetrical strategies.
With this network, the spatial features of the different symmetric folding signals can be extracted simultaneously, which greatly improves the robustness and accuracy of feature recognition.
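One way to picture a symmetric folding strategy, assuming channels are mapped onto a 2D scalp grid, is a left-right mirror fold that pairs each hemisphere with its reflection. The grid size is an assumption for this sketch, and the five strategies used by SFENet are not specified here.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = rng.standard_normal((9, 9))           # assumed 9x9 electrode grid

mirrored = grid[:, ::-1]                     # flip across the left-right midline
folded = np.stack([grid, mirrored], axis=0)  # (2, 9, 9): original + mirrored view

print(folded.shape)  # (2, 9, 9)
```

Stacking the original and mirrored grids lets a downstream network compare symmetric electrode pairs in a single forward pass.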
arXiv Detail & Related papers (2021-04-09T12:59:38Z)
- Emotional EEG Classification using Connectivity Features and Convolutional Neural Networks [81.74442855155843]
We introduce a new classification system that utilizes brain connectivity with a CNN and validate its effectiveness via the emotional video classification.
The level of concentration of the brain connectivity related to the emotional property of the target video is correlated with classification performance.
arXiv Detail & Related papers (2021-01-18T13:28:08Z)
- Continuous Emotion Recognition with Spatiotemporal Convolutional Neural Networks [82.54695985117783]
We investigate the suitability of state-of-the-art deep learning architectures for continuous emotion recognition using long video sequences captured in-the-wild.
We have developed and evaluated convolutional recurrent neural networks combining 2D-CNNs and long short-term memory (LSTM) units, and inflated 3D-CNN models, which are built by inflating the weights of a pre-trained 2D-CNN model during fine-tuning.
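The 2D-to-3D weight inflation mentioned above can be sketched as follows, assuming I3D-style inflation: repeat a pretrained 2D kernel along a new temporal axis and rescale so that activations keep the same magnitude. The kernel shapes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Assumed pretrained 2D conv weights: (out_ch, in_ch, kH, kW)
w2d = rng.standard_normal((64, 3, 7, 7))
T = 5  # temporal kernel depth for the inflated 3D conv

# Repeat along a new time axis and divide by T so that summing the 3D
# kernel over time recovers the original 2D kernel.
w3d = np.repeat(w2d[:, :, None, :, :], T, axis=2) / T  # (out_ch, in_ch, T, kH, kW)

print(w3d.shape)  # (64, 3, 5, 7, 7)
```

The rescaling means a constant-in-time input produces the same response from the 3D model as from the original 2D model, which is what makes pre-trained weights a sensible initialization.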
arXiv Detail & Related papers (2020-11-18T13:42:05Z)
- A Novel Transferability Attention Neural Network Model for EEG Emotion Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns the emotional discriminative information by highlighting the transferable EEG brain regions data and samples adaptively.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z)
- 4D Spatio-Temporal Deep Learning with 4D fMRI Data for Autism Spectrum Disorder Classification [69.62333053044712]
We propose a 4D convolutional deep learning approach for ASD classification where we jointly learn from spatial and temporal data.
We employ 4D neural networks and convolutional-recurrent models which outperform a previous approach with an F1-score of 0.71 compared to an F1-score of 0.65.
arXiv Detail & Related papers (2020-04-21T17:19:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.