EEG-based Emotion Style Transfer Network for Cross-dataset Emotion
Recognition
- URL: http://arxiv.org/abs/2308.05767v1
- Date: Wed, 9 Aug 2023 16:54:40 GMT
- Title: EEG-based Emotion Style Transfer Network for Cross-dataset Emotion
Recognition
- Authors: Yijin Zhou, Fu Li, Yang Li, Youshuo Ji, Lijian Zhang, Yuanfang Chen,
Wenming Zheng, Guangming Shi
- Abstract summary: We propose an EEG-based Emotion Style Transfer Network (E2STN) to obtain EEG representations that contain the content information of source domain and the style information of target domain.
E2STN achieves state-of-the-art performance on cross-dataset EEG emotion recognition tasks.
- Score: 45.26847258736848
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As the key to realizing affective brain-computer interfaces (aBCIs), EEG
emotion recognition has been widely studied. Previous methods perform well for
intra-subject EEG emotion recognition. However, the style mismatch between
source-domain (training) and target-domain (test) EEG samples, caused by large
inter-domain differences, remains a critical problem for EEG emotion
recognition. To address cross-dataset EEG emotion recognition, this paper
proposes an EEG-based Emotion Style Transfer Network (E2STN) that produces EEG
representations containing the content information of the source domain and the
style information of the target domain, which we call stylized emotional EEG
representations. These representations support cross-dataset discriminative
prediction. Concretely, E2STN consists of three modules: a transfer module, a
transfer evaluation module, and a discriminative prediction module. The
transfer module encodes the domain-specific information of the source and
target domains and then reconstructs the source domain's emotional pattern and
the target domain's statistical characteristics into new stylized EEG
representations. In this process, the transfer evaluation module constrains the
generated representations so that they fuse the two kinds of complementary
information from the source and target domains more precisely and avoid
distortion. Finally, the generated stylized EEG representations are fed into
the discriminative prediction module for final classification. Extensive
experiments show that E2STN achieves state-of-the-art performance on
cross-dataset EEG emotion recognition tasks.
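The three-module layout described in the abstract can be pictured with a minimal PyTorch sketch. All layer sizes, feature dimensions (310 would correspond to, e.g., 62-channel differential-entropy features over 5 bands), and module internals below are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch of the three-module E2STN layout; sizes and layers are assumed.
import torch
import torch.nn as nn

class TransferModule(nn.Module):
    """Encodes source/target domain information and decodes stylized representations."""
    def __init__(self, in_dim=310, hid=128):
        super().__init__()
        self.src_enc = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU())
        self.tgt_enc = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU())
        self.decoder = nn.Linear(2 * hid, in_dim)

    def forward(self, x_src, x_tgt):
        # Fuse source emotional content with target statistical style.
        z = torch.cat([self.src_enc(x_src), self.tgt_enc(x_tgt)], dim=-1)
        return self.decoder(z)

class TransferEvaluationModule(nn.Module):
    """Discriminator-style constraint scoring how faithful a stylized sample is."""
    def __init__(self, in_dim=310):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x)

class DiscriminativePredictionModule(nn.Module):
    """Classifies emotions from the stylized representations."""
    def __init__(self, in_dim=310, n_classes=3):
        super().__init__()
        self.clf = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, n_classes))

    def forward(self, x):
        return self.clf(x)

x_src, x_tgt = torch.randn(8, 310), torch.randn(8, 310)
stylized = TransferModule()(x_src, x_tgt)            # transfer module
realism = TransferEvaluationModule()(stylized)       # evaluation constraint
logits = DiscriminativePredictionModule()(stylized)  # final prediction
```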
Related papers
- DuA: Dual Attentive Transformer in Long-Term Continuous EEG Emotion Analysis [15.858955204180907]
We propose a Dual Attentive (DuA) transformer framework for long-term continuous EEG emotion analysis.
Unlike segment-based approaches, the DuA transformer processes an entire EEG trial as a whole, identifying emotions at the trial level.
This framework is designed to adapt to varying signal lengths, providing a substantial advantage over traditional methods.
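The trial-level idea can be pictured with a standard transformer encoder over a whole variable-length trial; this minimal sketch (layer counts and sizes assumed) is not the DuA architecture itself:

```python
# Sketch: one prediction per trial, regardless of how many segments it contains.
import torch
import torch.nn as nn

class TrialLevelTransformer(nn.Module):
    def __init__(self, feat_dim=310, d_model=128, n_classes=3):
        super().__init__()
        self.proj = nn.Linear(feat_dim, d_model)
        self.cls = nn.Parameter(torch.zeros(1, 1, d_model))  # trial-level token
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, trial):  # trial: (batch, n_segments, feat_dim), any length
        x = self.proj(trial)
        x = torch.cat([self.cls.expand(x.size(0), -1, -1), x], dim=1)
        return self.head(self.encoder(x)[:, 0])  # classify from the trial token

logits = TrialLevelTransformer()(torch.randn(2, 57, 310))  # a 57-segment trial
```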
arXiv Detail & Related papers (2024-07-30T03:31:03Z)
- Joint Contrastive Learning with Feature Alignment for Cross-Corpus EEG-based Emotion Recognition [2.1645626994550664]
We propose a novel Joint Contrastive learning framework with Feature Alignment (JCFA) to address cross-corpus EEG-based emotion recognition.
In the pre-training stage, a joint domain contrastive learning strategy is introduced to characterize generalizable time-frequency representations of EEG signals.
In the fine-tuning stage, JCFA is refined in conjunction with downstream tasks, where the structural connections among brain electrodes are considered.
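The joint contrastive pre-training stage can be illustrated with a standard NT-Xent objective over two views of the same EEG segments (e.g. time- and frequency-domain encodings); the exact JCFA losses and augmentations are not reproduced here:

```python
# Standard NT-Xent contrastive loss: matched (z1[i], z2[i]) pairs are positives.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature      # (N, N) similarity matrix
    targets = torch.arange(z1.size(0))      # positives sit on the diagonal
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Two views of the same batch of EEG segments, already embedded.
loss = nt_xent(torch.randn(32, 64), torch.randn(32, 64))
```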
arXiv Detail & Related papers (2024-04-15T08:21:17Z)
- Enhancing EEG-to-Text Decoding through Transferable Representations from Pre-trained Contrastive EEG-Text Masked Autoencoder [69.7813498468116]
We propose Contrastive EEG-Text Masked Autoencoder (CET-MAE), a novel model that orchestrates compound self-supervised learning across and within EEG and text.
We also develop a framework called E2T-PTR (EEG-to-Text decoding using Pretrained Transferable Representations) to decode text from EEG sequences.
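The two self-supervised signals the summary names, masked reconstruction within EEG and contrastive alignment across EEG and text, can be sketched as follows; encoders, masking ratio, and loss weighting are assumptions, not CET-MAE's design:

```python
# Conceptual sketch: MAE-style reconstruction + CLIP-style EEG-text alignment.
import torch
import torch.nn as nn
import torch.nn.functional as F

eeg = torch.randn(4, 20, 128)    # (batch, EEG tokens, dim)
text = torch.randn(4, 128)       # pooled text embedding (e.g. from a BERT encoder)

mask = torch.rand(4, 20) < 0.5                      # mask half of the EEG tokens
corrupted = eeg.masked_fill(mask.unsqueeze(-1), 0.0)

decoder = nn.Sequential(nn.Linear(128, 128), nn.GELU(), nn.Linear(128, 128))
recon = decoder(corrupted)
mae_loss = F.mse_loss(recon[mask], eeg[mask])       # reconstruct only masked tokens

eeg_pooled = F.normalize(eeg.mean(dim=1), dim=1)
text_n = F.normalize(text, dim=1)
logits = eeg_pooled @ text_n.t() / 0.07
clip_loss = F.cross_entropy(logits, torch.arange(4))  # pair EEG with its own text
loss = mae_loss + clip_loss
```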
arXiv Detail & Related papers (2024-02-27T11:45:21Z)
- Semi-Supervised Dual-Stream Self-Attentive Adversarial Graph Contrastive Learning for Cross-Subject EEG-based Emotion Recognition [19.578050094283313]
The DS-AGC framework is proposed to tackle the challenge of limited labeled data in cross-subject EEG-based emotion recognition.
The proposed model outperforms existing methods under different incomplete label conditions.
arXiv Detail & Related papers (2023-08-13T23:54:40Z)
- EEGMatch: Learning with Incomplete Labels for Semi-Supervised EEG-based Cross-Subject Emotion Recognition [7.1695247553867345]
We propose a novel semi-supervised learning framework (EEGMatch) to leverage both labeled and unlabeled EEG data.
Extensive experiments are conducted on two benchmark databases (SEED and SEED-IV).
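A generic way to use labeled and unlabeled EEG jointly is confidence-thresholded pseudo-labeling, sketched below; EEGMatch's actual objectives are not reproduced here, and the threshold and model are placeholders:

```python
# Generic semi-supervised sketch: supervised loss + pseudo-labels on unlabeled EEG.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Linear(310, 64), nn.ReLU(), nn.Linear(64, 3))
x_lab, y_lab = torch.randn(16, 310), torch.randint(0, 3, (16,))
x_unlab = torch.randn(64, 310)

sup_loss = F.cross_entropy(model(x_lab), y_lab)

with torch.no_grad():                              # guess labels for unlabeled data
    probs = F.softmax(model(x_unlab), dim=1)
conf, pseudo = probs.max(dim=1)
keep = (conf > 0.95).float()                       # trust only confident guesses
unsup_loss = (F.cross_entropy(model(x_unlab), pseudo, reduction="none") * keep).mean()
loss = sup_loss + unsup_loss
```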
arXiv Detail & Related papers (2023-03-27T12:02:33Z)
- EEG2Vec: Learning Affective EEG Representations via Variational Autoencoders [27.3162026528455]
We explore whether representing neural data recorded in response to emotional stimuli in a latent vector space can serve both to predict emotional states and to generate synthetic EEG data.
We propose a conditional variational autoencoder based framework, EEG2Vec, to learn generative-discriminative representations from EEG data.
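A conditional VAE of this generative-discriminative flavor can be sketched in a few lines: the latent code is conditioned on the emotion label, so the decoder supports label-conditioned generation. Dimensions and architecture here are placeholders, not EEG2Vec's:

```python
# Minimal conditional-VAE sketch; forward returns an ELBO-style training loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalVAE(nn.Module):
    def __init__(self, x_dim=310, n_classes=3, z_dim=32):
        super().__init__()
        self.enc = nn.Linear(x_dim + n_classes, 2 * z_dim)  # outputs (mu, logvar)
        self.dec = nn.Linear(z_dim + n_classes, x_dim)
        self.n_classes = n_classes

    def forward(self, x, y):
        y1h = F.one_hot(y, self.n_classes).float()
        mu, logvar = self.enc(torch.cat([x, y1h], 1)).chunk(2, dim=1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        x_hat = self.dec(torch.cat([z, y1h], 1))              # label-conditioned decode
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(1).mean()
        return F.mse_loss(x_hat, x) + kl

loss = ConditionalVAE()(torch.randn(8, 310), torch.randint(0, 3, (8,)))
```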
arXiv Detail & Related papers (2022-07-16T19:25:29Z)
- Multimodal Emotion Recognition using Transfer Learning from Speaker Recognition and BERT-based models [53.31917090073727]
We propose a neural network-based emotion recognition framework that uses a late fusion of transfer-learned and fine-tuned models from speech and text modalities.
We evaluate the effectiveness of our proposed multimodal approach on the interactive emotional dyadic motion capture dataset.
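Late fusion, as opposed to feature fusion, combines the per-modality decisions; a minimal sketch follows, where averaging the two softmax outputs is an assumed combination rule, not necessarily the paper's:

```python
# Late fusion: combine decisions from separately trained speech and text models.
import torch

speech_logits = torch.randn(8, 4)  # e.g. from a speaker-recognition backbone
text_logits = torch.randn(8, 4)    # e.g. from a fine-tuned BERT head
fused = 0.5 * (speech_logits.softmax(-1) + text_logits.softmax(-1))
pred = fused.argmax(dim=-1)        # final emotion decision per utterance
```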
arXiv Detail & Related papers (2022-02-16T00:23:42Z)
- EEG-Inception: An Accurate and Robust End-to-End Neural Network for EEG-based Motor Imagery Classification [123.93460670568554]
This paper proposes a novel convolutional neural network (CNN) architecture for accurate and robust EEG-based motor imagery (MI) classification.
The proposed CNN model, namely EEG-Inception, is built on the backbone of the Inception-Time network.
The proposed network is end-to-end, as it takes raw EEG signals as input and does not require complex signal preprocessing.
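The Inception-Time backbone it builds on runs parallel temporal convolutions with different kernel sizes over the raw signal; this illustrative block (channel counts and kernel sizes assumed) is not the published EEG-Inception:

```python
# Inception-style 1D block: parallel temporal convolutions, concatenated.
import torch
import torch.nn as nn

class InceptionBlock1D(nn.Module):
    def __init__(self, in_ch, branch_ch=8, kernels=(9, 19, 39)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv1d(in_ch, branch_ch, k, padding=k // 2) for k in kernels]
        )

    def forward(self, x):  # x: (batch, EEG channels, time), raw signal
        return torch.cat([b(x) for b in self.branches], dim=1)

out = InceptionBlock1D(in_ch=22)(torch.randn(4, 22, 1000))  # 22 ch, 4 s @ 250 Hz
print(out.shape)  # torch.Size([4, 24, 1000])
```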
arXiv Detail & Related papers (2021-01-24T19:03:10Z)
- Emotional Semantics-Preserved and Feature-Aligned CycleGAN for Visual Emotion Adaptation [85.20533077846606]
Unsupervised domain adaptation (UDA) studies the problem of transferring models trained on one labeled source domain to another unlabeled target domain.
In this paper, we focus on UDA in visual emotion analysis for both emotion distribution learning and dominant emotion classification.
We propose a novel end-to-end cycle-consistent adversarial model, termed CycleEmotionGAN++.
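The cycle-consistency core of any CycleGAN-style adapter is that translating source to target and back should recover the input. The generators below are linear stand-ins, not CycleEmotionGAN++'s networks, which additionally preserve emotional semantics:

```python
# Cycle-consistency loss between two placeholder generators.
import torch
import torch.nn as nn
import torch.nn.functional as F

G = nn.Linear(64, 64)    # source-to-target "generator" (placeholder)
F_ = nn.Linear(64, 64)   # target-to-source "generator" (placeholder)

x_src, x_tgt = torch.randn(8, 64), torch.randn(8, 64)
cycle_loss = (F.l1_loss(F_(G(x_src)), x_src) +   # src -> tgt -> src
              F.l1_loss(G(F_(x_tgt)), x_tgt))    # tgt -> src -> tgt
```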
arXiv Detail & Related papers (2020-11-25T01:31:01Z)
- A Novel Transferability Attention Neural Network Model for EEG Emotion Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns emotional discriminative information by adaptively highlighting transferable EEG brain-region data and samples.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
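One way to picture the stated mechanism: domain-discriminator outputs are turned into attention weights, so regions and samples the discriminator cannot separate (and which are therefore more transferable) count more. The discriminators and weighting rule below are simplified placeholders, not TANN's exact formulation:

```python
# Sketch: discriminator confidence -> transferability-based attention weights.
import torch
import torch.nn as nn

n_regions, feat = 16, 32
region_feats = torch.randn(8, n_regions, feat)    # per-brain-region features

region_disc = nn.Linear(feat, 1)                  # region-level domain discriminator
sample_disc = nn.Linear(n_regions * feat, 1)      # sample-level domain discriminator

# Outputs near 0.5 (hard to classify by domain) are treated as more transferable.
r_conf = torch.sigmoid(region_disc(region_feats)).squeeze(-1)  # (8, n_regions)
r_attn = torch.softmax(1.0 - (r_conf - 0.5).abs(), dim=1)
weighted = (region_feats * r_attn.unsqueeze(-1)).sum(dim=1)    # attended features

s_conf = torch.sigmoid(sample_disc(region_feats.flatten(1))).squeeze(-1)
s_weight = 1.0 - (s_conf - 0.5).abs()                          # per-sample weight
```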
arXiv Detail & Related papers (2020-09-21T02:42:30Z)