EEG Emotion Recognition Through Deep Learning
- URL: http://arxiv.org/abs/2511.15902v1
- Date: Wed, 19 Nov 2025 22:14:05 GMT
- Title: EEG Emotion Recognition Through Deep Learning
- Authors: Roman Dolgopolyi, Antonis Chatzipanagiotou,
- Abstract summary: The model achieved a testing accuracy of 91%, outperforming traditional models such as SVM, DNN, and Logistic Regression. The model reduces the hardware requirements of the EEG apparatus by using only 5 of the 62 electrodes. This advancement sets the groundwork for future exploration into mood changes induced by media content consumption.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: An advanced emotion classification model was developed using a CNN-Transformer architecture for emotion recognition from EEG brain wave signals, effectively distinguishing among three emotional states: positive, neutral, and negative. The model achieved a testing accuracy of 91%, outperforming traditional models such as SVM, DNN, and Logistic Regression. Training was conducted on a custom dataset created by merging data from the SEED, SEED-FRA, and SEED-GER repositories, comprising 1,455 samples with EEG recordings labeled according to emotional states. The combined dataset represents one of the largest and most culturally diverse collections available. Additionally, the model reduces the hardware requirements of the EEG apparatus by using only 5 of the 62 electrodes. This reduction demonstrates the feasibility of deploying a more affordable consumer-grade EEG headset, thereby enabling accessible, at-home use, while also requiring less computational power. This advancement sets the groundwork for future exploration into mood changes induced by media content consumption, an area that remains under-researched. Integration into medical, wellness, and home-health platforms could enable continuous, passive emotional monitoring, particularly beneficial in clinical or caregiving settings where traditional behavioral cues, such as facial expressions or vocal tone, are diminished, restricted, or difficult to interpret, thus potentially transforming mental health diagnostics and interventions.
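As a rough illustration of the architecture described in the abstract, below is a minimal sketch of a CNN-Transformer classifier operating on a 5-channel EEG window with three output classes. The layer sizes, window length, and training details here are illustrative assumptions, not the authors' published configuration.

```python
# Minimal sketch of a CNN-Transformer EEG emotion classifier (illustrative only).
# Assumptions: 5 EEG channels, 200 time samples per window, 3 classes
# (positive / neutral / negative). Hyperparameters are NOT taken from the paper.
import torch
import torch.nn as nn

class CNNTransformerEEG(nn.Module):
    def __init__(self, n_channels=5, n_classes=3, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # 1D convolutions extract local temporal features from the raw window
        self.cnn = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=7, padding=3),
            nn.BatchNorm1d(d_model),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(d_model, d_model, kernel_size=5, padding=2),
            nn.BatchNorm1d(d_model),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Transformer encoder models longer-range dependencies across time steps
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, n_channels, time)
        feats = self.cnn(x)                  # (batch, d_model, time / 4)
        feats = feats.transpose(1, 2)        # (batch, seq_len, d_model)
        feats = self.transformer(feats)      # contextualised sequence features
        return self.head(feats.mean(dim=1))  # pool over time, classify

# Example: a batch of 8 windows, 5 electrodes, 200 samples each
logits = CNNTransformerEEG()(torch.randn(8, 5, 200))
print(logits.shape)  # torch.Size([8, 3])
```

The convolutional front end plays the role of a learned feature extractor over short time spans, while the transformer attends across the whole window; reducing the input to 5 channels, as in the abstract, keeps both the hardware and the compute budget small.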
Related papers
- E^2-LLM: Bridging Neural Signals and Interpretable Affective Analysis [54.763420895859035]
We present E^2-LLM (EEG-to-Emotion Large Language Model), the first MLLM framework for interpretable emotion analysis from EEG. E^2-LLM integrates a pretrained EEG encoder with Q-based LLMs through learnable projection layers, employing a multi-stage training pipeline. Experiments on the dataset across seven emotion categories demonstrate that E^2-LLM achieves excellent performance on emotion classification.
arXiv Detail & Related papers (2026-01-11T13:21:20Z) - Leveraging Vision Transformers for Enhanced Classification of Emotions using ECG Signals [1.6018045082682821]
Biomedical signals offer insights into various conditions affecting the human body. ECG data can reveal changes in heart rate variability linked to emotional arousal, stress levels, and autonomic nervous system activity. Recent advancements in the field diverge from conventional approaches by leveraging the power of advanced transformer architectures.
arXiv Detail & Related papers (2025-10-07T11:49:57Z) - DEAP DIVE: Dataset Investigation with Vision transformers for EEG evaluation [11.8905212108669]
Accurately predicting emotions from brain signals has the potential to achieve goals such as improving mental health, human-computer interaction, and affective computing. This work examines how subsets of EEG channels can be used for sufficiently accurate emotion prediction with low-cost EEG devices.
arXiv Detail & Related papers (2025-10-01T10:07:07Z) - A Real-Time BCI for Stroke Hand Rehabilitation Using Latent EEG Features from Healthy Subjects [0.0]
This study presents a real-time, portable brain-computer interface (BCI) system designed to support hand rehabilitation for stroke patients. The system combines a low-cost 3D-printed robotic exoskeleton with an embedded controller that converts brain signals into physical hand movements.
arXiv Detail & Related papers (2025-09-07T22:19:03Z) - CAST-Phys: Contactless Affective States Through Physiological signals Database [74.28082880875368]
The lack of affective multi-modal datasets remains a major bottleneck in developing accurate emotion recognition systems. We present the Contactless Affective States Through Physiological Signals Database (CAST-Phys), a novel high-quality dataset enabling remote physiological emotion recognition. Our analysis highlights the crucial role of physiological signals in realistic scenarios where facial expressions alone may not provide sufficient emotional information.
arXiv Detail & Related papers (2025-07-08T15:20:24Z) - BrainOmni: A Brain Foundation Model for Unified EEG and MEG Signals [46.121056431476156]
This paper proposes BrainOmni, the first brain foundation model that generalises across heterogeneous EEG and MEG recordings. Existing approaches typically rely on separate, modality- and dataset-specific models, which limits performance and cross-domain scalability. A total of 1,997 hours of EEG and 656 hours of MEG data are curated and standardised from publicly available sources for pretraining.
arXiv Detail & Related papers (2025-05-18T14:07:14Z) - CEReBrO: Compact Encoder for Representations of Brain Oscillations Using Efficient Alternating Attention [46.47343031985037]
We introduce CEReBrO, a Compact Encoder for Representations of Brain Oscillations using alternating attention. Our tokenization scheme represents EEG signals as per-channel patches. We propose an alternating attention mechanism that jointly models intra-channel temporal dynamics and inter-channel spatial correlations, achieving a 2x speed improvement with 6x less memory required compared to standard self-attention (a toy sketch of this alternating pattern appears after the list below).
arXiv Detail & Related papers (2025-01-18T21:44:38Z) - EEG Emotion Copilot: Optimizing Lightweight LLMs for Emotional EEG Interpretation with Assisted Medical Record Generation [12.707059419820848]
This paper presents the EEG Emotion Copilot, which first recognizes emotional states directly from EEG signals. It then generates personalized diagnostic and treatment suggestions, and finally supports the automation of assisted electronic medical records. The proposed copilot is expected to advance the application of affective computing in the medical domain.
arXiv Detail & Related papers (2024-09-30T19:15:05Z) - DGSD: Dynamical Graph Self-Distillation for EEG-Based Auditory Spatial Attention Detection [49.196182908826565]
Auditory Attention Detection (AAD) aims to detect the target speaker from brain signals in a multi-speaker environment.
Current approaches primarily rely on traditional convolutional neural networks designed for processing Euclidean data such as images.
This paper proposes a dynamical graph self-distillation (DGSD) approach for AAD, which does not require speech stimuli as input.
arXiv Detail & Related papers (2023-09-07T13:43:46Z) - fMRI from EEG is only Deep Learning away: the use of interpretable DL to unravel EEG-fMRI relationships [68.8204255655161]
We present an interpretable domain grounded solution to recover the activity of several subcortical regions from multichannel EEG data.
We recover individual spatial and time-frequency patterns of scalp EEG predictive of the hemodynamic signal in the subcortical nuclei.
arXiv Detail & Related papers (2022-10-23T15:11:37Z) - EEG2Vec: Learning Affective EEG Representations via Variational Autoencoders [27.3162026528455]
We explore whether representing neural data, recorded in response to emotional stimuli, in a latent vector space can serve both to predict emotional states and to generate synthetic EEG data.
We propose a conditional variational autoencoder based framework, EEG2Vec, to learn generative-discriminative representations from EEG data.
arXiv Detail & Related papers (2022-07-16T19:25:29Z) - A Novel Transferability Attention Neural Network Model for EEG Emotion Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns emotion-discriminative information by adaptively highlighting transferable EEG brain-region data and samples.
This is implemented by measuring the outputs of multiple brain-region-level discriminators and a single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z)
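Referring back to the CEReBrO entry above, here is a minimal, hypothetical sketch of the alternating-attention idea: attention applied along the temporal axis within each channel, then along the channel axis at each patch position. The tensor shapes and layer sizes are illustrative assumptions, not the paper's implementation.

```python
# Toy sketch of alternating intra-channel (temporal) and inter-channel (spatial)
# attention, in the spirit of the CEReBrO summary above. Shapes and sizes are
# illustrative assumptions only, not the paper's actual design.
import torch
import torch.nn as nn

class AlternatingAttentionBlock(nn.Module):
    def __init__(self, d_model=32, n_heads=4):
        super().__init__()
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.spatial_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # x: (batch, channels, patches, d_model) -- one embedding per per-channel patch
        b, c, p, d = x.shape

        # Intra-channel attention: attend across patches (time) within each channel
        t = x.reshape(b * c, p, d)
        t = self.norm1(t + self.temporal_attn(t, t, t, need_weights=False)[0])
        x = t.reshape(b, c, p, d)

        # Inter-channel attention: attend across channels at each patch position
        s = x.transpose(1, 2).reshape(b * p, c, d)
        s = self.norm2(s + self.spatial_attn(s, s, s, need_weights=False)[0])
        return s.reshape(b, p, c, d).transpose(1, 2)

# Example: batch of 2 recordings, 5 channels, 10 patches, 32-dim embeddings
out = AlternatingAttentionBlock()(torch.randn(2, 5, 10, 32))
print(out.shape)  # torch.Size([2, 5, 10, 32])
```

Splitting attention this way keeps each attention call over a short sequence (patches or channels) rather than over all channel-patch tokens at once, which is where the speed and memory savings described in the summary come from.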