Hybrid Quantum Deep Learning Model for Emotion Detection using raw EEG Signal Analysis
- URL: http://arxiv.org/abs/2411.17715v1
- Date: Tue, 19 Nov 2024 17:44:04 GMT
- Title: Hybrid Quantum Deep Learning Model for Emotion Detection using raw EEG Signal Analysis
- Authors: Ali Asgar Chandanwala, Srutakirti Bhowmik, Parna Chaudhury, Sheena Christabel Pravin,
- Abstract summary: This work presents a hybrid quantum deep learning technique for emotion recognition.
Conventional EEG-based emotion recognition techniques are limited by noise and high-dimensional data complexity.
The model will be extended to real-time applications and multi-class categorization in future work.
- Score: 0.0
- License:
- Abstract: Applications in behavioural research, human-computer interaction, and mental health depend on the ability to recognize emotions. To improve the accuracy of emotion recognition from electroencephalography (EEG) data, this work presents a hybrid quantum deep learning technique. Conventional EEG-based emotion recognition techniques are limited by noise and high-dimensional data complexity, which make feature extraction difficult. To tackle these issues, our method combines traditional deep learning classification with quantum-enhanced feature extraction. Bandpass filtering and Welch's method are applied as preprocessing steps on the EEG data to identify important brain-wave patterns. Frequency band power attributes (delta, theta, alpha, and beta) are mapped to quantum representations, capturing the intricate inter-band interactions that are essential for determining emotional states. Entanglement and rotation gates are used in a hybrid quantum circuit to maximize the model's sensitivity to EEG patterns associated with different emotions. Promising results on a test dataset indicate the model's potential for accurate emotion recognition. The model will be extended to real-time applications and multi-class categorization in future work, which could improve EEG-based mental health screening instruments. By showcasing the possibilities of fusing traditional deep learning with quantum processing, this method offers a promising tool for reliable, scalable emotion recognition in adaptive human-computer systems and mental health monitoring.
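As a rough illustration of the pipeline the abstract outlines (bandpass preprocessing, Welch band-power extraction, a rotation-and-entanglement quantum feature map, and a classical readout), a minimal sketch follows; the sampling rate, filter settings, circuit layout, and weight handling are illustrative assumptions, not the authors' exact design.

```python
# Hedged sketch of the described pipeline: bandpass filter -> Welch band powers
# (delta/theta/alpha/beta) -> angle-encoded quantum circuit with rotation and
# entanglement gates -> features for a downstream classical classifier.
# Sampling rate, filter order, circuit depth, and scaling are assumptions.
import numpy as np
import pennylane as qml
from scipy.signal import butter, filtfilt, welch

FS = 128  # assumed EEG sampling rate (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def bandpass(x, low=0.5, high=45.0, fs=FS, order=4):
    """Zero-phase Butterworth bandpass to suppress drift and high-frequency noise."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def band_powers(x, fs=FS):
    """Welch PSD summed over the four canonical EEG bands."""
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    df = freqs[1] - freqs[0]
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() * df
                     for lo, hi in BANDS.values()])

n_qubits = 4  # one qubit per frequency band
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def quantum_features(band_feats, weights):
    """Angle-encode band powers, entangle neighbouring qubits, read out Pauli-Z."""
    for i in range(n_qubits):
        qml.RY(band_feats[i], wires=i)      # rotation encoding of band power
    for i in range(n_qubits - 1):
        qml.CNOT(wires=[i, i + 1])          # entanglement captures inter-band interactions
    for i in range(n_qubits):
        qml.RY(weights[i], wires=i)         # trainable rotation layer
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

# Example on one synthetic EEG channel; the expectation values would feed a
# conventional deep-learning classifier head.
eeg = np.random.randn(FS * 10)                      # 10 s of placeholder data
feats = band_powers(bandpass(eeg))
angles = np.pi * feats / (feats.max() + 1e-8)       # scale powers to rotation angles
print(quantum_features(angles, np.random.uniform(0, np.pi, n_qubits)))
```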
Related papers
- EEG Emotion Copilot: Optimizing Lightweight LLMs for Emotional EEG Interpretation with Assisted Medical Record Generation [12.707059419820848]
This paper presents the EEG Emotion Copilot, which first recognizes emotional states directly from EEG signals.
It then generates personalized diagnostic and treatment suggestions, and finally supports the automation of assisted electronic medical records.
The proposed copilot is expected to advance the application of affective computing in the medical domain.
arXiv Detail & Related papers (2024-09-30T19:15:05Z)
- A Knowledge-Driven Cross-view Contrastive Learning for EEG Representation [48.85731427874065]
This paper proposes a knowledge-driven cross-view contrastive learning framework (KDC2) to extract effective representations from EEG with limited labels.
The KDC2 method creates scalp and neural views of EEG signals, simulating the internal and external representation of brain activity.
By modeling prior neural knowledge based on neural information consistency theory, the proposed method extracts invariant and complementary neural knowledge to generate combined representations.
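As a hedged illustration of the cross-view idea, a generic InfoNCE-style objective that pulls together the scalp-view and neural-view embeddings of the same EEG segment is sketched below; the actual KDC2 loss and its knowledge-consistency terms are not specified in this summary.

```python
# Generic cross-view contrastive (InfoNCE) loss between scalp-view and neural-view
# embeddings of the same EEG segments; a sketch only, not the KDC2 objective.
import torch
import torch.nn.functional as F

def cross_view_info_nce(z_scalp, z_neural, temperature=0.1):
    """Matching rows of z_scalp/z_neural are positives; all other pairs are negatives."""
    z1 = F.normalize(z_scalp, dim=1)
    z2 = F.normalize(z_neural, dim=1)
    logits = z1 @ z2.T / temperature                    # (N, N) cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.T, targets))   # symmetric over both views

# Usage with embeddings from two view-specific encoders over a batch of segments
loss = cross_view_info_nce(torch.randn(32, 128), torch.randn(32, 128))
```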
arXiv Detail & Related papers (2023-09-21T08:53:51Z)
- Emotion recognition based on multi-modal electrophysiology multi-head attention Contrastive Learning [3.2536246345549538]
We propose ME-MHACL, a self-supervised contrastive learning-based multimodal emotion recognition method.
We apply the trained feature extractor to labeled electrophysiological signals and use multi-head attention mechanisms for feature fusion.
Our method outperformed existing benchmark methods in emotion recognition tasks and had good cross-individual generalization ability.
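A rough sketch of multi-head attention feature fusion over per-modality embeddings is shown below; the embedding size, head count, and pooling are illustrative assumptions rather than the ME-MHACL configuration.

```python
# Sketch of multi-head attention fusion over modality-level embeddings
# (e.g. EEG plus other electrophysiological signals); dimensions are assumptions.
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    def __init__(self, dim=128, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim=dim, num_heads=heads, batch_first=True)

    def forward(self, modality_feats):
        # modality_feats: (batch, n_modalities, dim), one embedding per modality
        fused, _ = self.attn(modality_feats, modality_feats, modality_feats)
        return fused.mean(dim=1)            # pool the attended modalities

fused = AttentionFusion()(torch.randn(8, 3, 128))   # 8 samples, 3 modalities
print(fused.shape)                                   # torch.Size([8, 128])
```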
arXiv Detail & Related papers (2023-07-12T05:55:40Z)
- A Convolutional Spiking Network for Gesture Recognition in Brain-Computer Interfaces [0.8122270502556371]
We propose a simple yet efficient machine learning-based approach for the exemplary problem of hand gesture classification based on brain signals.
We demonstrate that this approach generalizes to different subjects with both EEG and ECoG data and achieves superior accuracy in the range of 92.74-97.07%.
arXiv Detail & Related papers (2023-04-21T16:23:40Z)
- Optimized EEG based mood detection with signal processing and deep neural networks for brain-computer interface [0.0]
The aim of this study is to establish a smart decision-making model that identifies the relation between EEG signals and the subject's mood.
EEG signals from 28 healthy human subjects were recorded with consent, and attempts were made to study and recognise their moods.
Using these techniques, up to 96.01% detection accuracy has been obtained.
arXiv Detail & Related papers (2023-03-30T15:23:24Z)
- fMRI from EEG is only Deep Learning away: the use of interpretable DL to unravel EEG-fMRI relationships [68.8204255655161]
We present an interpretable domain grounded solution to recover the activity of several subcortical regions from multichannel EEG data.
We recover individual spatial and time-frequency patterns of scalp EEG predictive of the hemodynamic signal in the subcortical nuclei.
arXiv Detail & Related papers (2022-10-23T15:11:37Z)
- Multimodal Emotion Recognition using Transfer Learning from Speaker Recognition and BERT-based models [53.31917090073727]
We propose a neural network-based emotion recognition framework that uses a late fusion of transfer-learned and fine-tuned models from speech and text modalities.
We evaluate the effectiveness of our proposed multimodal approach on the interactive emotional dyadic motion capture dataset.
arXiv Detail & Related papers (2022-02-16T00:23:42Z)
- Improved Speech Emotion Recognition using Transfer Learning and Spectrogram Augmentation [56.264157127549446]
Speech emotion recognition (SER) is a challenging task that plays a crucial role in natural human-computer interaction.
One of the main challenges in SER is data scarcity.
We propose a transfer learning strategy combined with spectrogram augmentation.
arXiv Detail & Related papers (2021-08-05T10:39:39Z)
- Cross-individual Recognition of Emotions by a Dynamic Entropy based on Pattern Learning with EEG features [2.863100352151122]
We propose a deep-learning framework denoted as a dynamic entropy-based pattern learning (DEPL) to abstract informative indicators pertaining to the neurophysiological features among multiple individuals.
DEPL enhances the representations generated by a deep convolutional neural network by modelling the interdependencies between the cortical locations of dynamic entropy-based features.
arXiv Detail & Related papers (2020-09-26T07:22:07Z)
- A Novel Transferability Attention Neural Network Model for EEG Emotion Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns the emotional discriminative information by highlighting the transferable EEG brain regions data and samples adaptively.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z)
- Continuous Emotion Recognition via Deep Convolutional Autoencoder and Support Vector Regressor [70.2226417364135]
It is crucial that the machine be able to recognize the emotional state of the user with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)