Human Emotion Classification based on EEG Signals Using Recurrent Neural
Network And KNN
- URL: http://arxiv.org/abs/2205.08419v1
- Date: Tue, 10 May 2022 16:20:14 GMT
- Title: Human Emotion Classification based on EEG Signals Using Recurrent Neural
Network And KNN
- Authors: Shashank Joshi and Falak Joshi
- Abstract summary: Emotion categorization from EEG data has recently attracted considerable attention.
EEG signals are a critical resource for brain-computer interfaces.
EEG signals associated with positive, neutral, and negative emotions were identified using channel-selection preprocessing.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Emotion plays a crucial role in human interaction. Attributes such
as words, voice intonation, facial expressions, and kinesics can all be used to
convey one's feelings. However, brain-computer interface (BCI) devices have not
yet reached the level required for emotion interpretation. With the rapid
development of machine learning algorithms, dry-electrode techniques, and
real-world applications of the brain-computer interface for normal individuals,
emotion categorization from EEG data has recently attracted considerable
attention. Electroencephalogram (EEG) signals are a critical resource for these
systems. The primary benefit of employing EEG signals is that they reflect
genuine emotional states and can be readily processed by computer systems. In
this work, EEG signals associated with positive, neutral, and negative emotions
were identified using channel-selection preprocessing. Until now, however,
researchers have had only a limited understanding of how EEG signals relate to
specific emotional states. To classify the EEG signals, we used the discrete
wavelet transform together with machine learning techniques, namely the
recurrent neural network (RNN) and the k-nearest neighbor (kNN) algorithm. The
classifier methods were first used for channel selection; final feature vectors
were then created by integrating the features of the EEG segments from these
channels. Using the RNN and kNN algorithms, the final feature vectors labeled
with positive, neutral, and negative emotions were classified independently.
The classification performance of both techniques was computed and compared:
the average overall accuracies were 94.844% for RNN and 93.438% for kNN.
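The pipeline the abstract describes (per-channel discrete-wavelet-transform features, then classification) can be sketched roughly as follows. This is an illustrative approximation, not the authors' implementation: the Haar wavelet, the energy-of-coefficients features, and the from-scratch `knn_predict` helper are all assumptions made to keep the example self-contained.

```python
import numpy as np

def dwt_haar_level(signal):
    """One level of a Haar discrete wavelet transform:
    returns (approximation, detail) coefficients."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:                      # pad to even length
        s = np.append(s, s[-1])
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    return approx, detail

def band_features(segment, levels=4):
    """Energy of the detail coefficients at each DWT level,
    plus the residual approximation energy -- one simple
    feature vector per EEG channel segment."""
    feats, a = [], segment
    for _ in range(levels):
        a, d = dwt_haar_level(a)
        feats.append(np.sum(d ** 2))
    feats.append(np.sum(a ** 2))
    return np.array(feats)

def knn_predict(train_X, train_y, x, k=3):
    """Majority vote over the k nearest training vectors
    (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]
```

In a real setup, feature vectors from the selected channels would be concatenated per trial, and the same vectors would feed both the kNN classifier and an RNN.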
Related papers
- Implementation of AI Deep Learning Algorithm For Multi-Modal Sentiment
Analysis [0.9065034043031668]
A multi-modal emotion recognition method was established by combining a two-channel convolutional neural network with a recurrent network.
Words were vectorized with GloVe, and the word vectors were fed into the convolutional neural network.
arXiv Detail & Related papers (2023-11-19T05:49:39Z) - Unveiling Emotions from EEG: A GRU-Based Approach [2.580765958706854]
The Gated Recurrent Unit (GRU) algorithm is tested to see whether it can predict emotional states from EEG signals.
Our publicly accessible dataset consists of resting neutral data as well as EEG recordings from people who were exposed to stimuli evoking happy, neutral, and negative emotions.
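The GRU approach above hinges on gated recurrent units; a minimal, framework-free sketch of a single GRU cell (numpy only, randomly initialized and untrained, so purely illustrative rather than the paper's model) looks like this:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell in numpy -- illustrative only."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        k = 1.0 / np.sqrt(hidden_size)
        def W(rows, cols):
            return rng.uniform(-k, k, (rows, cols))
        # Input-to-hidden (W*) and hidden-to-hidden (U*) weights
        self.Wz, self.Uz = W(hidden_size, input_size), W(hidden_size, hidden_size)
        self.Wr, self.Ur = W(hidden_size, input_size), W(hidden_size, hidden_size)
        self.Wh, self.Uh = W(hidden_size, input_size), W(hidden_size, hidden_size)

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)          # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)          # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1 - z) * h + z * h_tilde                # blend old and new state
```

Run over an EEG feature sequence, the final hidden state `h` would typically be passed to a softmax layer for the three-way emotion decision.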
arXiv Detail & Related papers (2023-07-20T11:04:46Z) - Emotion Analysis on EEG Signal Using Machine Learning and Neural Network [0.0]
The main purpose of this study is to improve emotion recognition performance using brain signals.
Research into human-machine interaction technologies has been ongoing for a long time, and in recent years researchers have had great success in automatically recognizing emotion from brain signals.
arXiv Detail & Related papers (2023-07-09T09:50:34Z) - A Convolutional Spiking Network for Gesture Recognition in
Brain-Computer Interfaces [0.8122270502556371]
We propose a simple yet efficient machine learning-based approach for the exemplary problem of hand gesture classification based on brain signals.
We demonstrate that this approach generalizes to different subjects with both EEG and ECoG data and achieves superior accuracy in the range of 92.74-97.07%.
arXiv Detail & Related papers (2023-04-21T16:23:40Z) - Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot
Sentiment Classification [78.120927891455]
State-of-the-art brain-to-text systems have achieved great success in decoding language directly from brain signals using neural networks.
In this paper, we extend the problem to open-vocabulary Electroencephalography (EEG)-to-Text sequence-to-sequence decoding and zero-shot sentence sentiment classification on natural reading tasks.
Our model achieves a 40.1% BLEU-1 score on EEG-To-Text decoding and a 55.6% F1 score on zero-shot EEG-based ternary sentiment classification, which significantly outperforms supervised baselines.
arXiv Detail & Related papers (2021-12-05T21:57:22Z) - Emotional EEG Classification using Connectivity Features and
Convolutional Neural Networks [81.74442855155843]
We introduce a new classification system that utilizes brain connectivity with a CNN and validate its effectiveness via emotional video classification.
The level of concentration of the brain connectivity related to the emotional property of the target video is correlated with classification performance.
arXiv Detail & Related papers (2021-01-18T13:28:08Z) - Attention Driven Fusion for Multi-Modal Emotion Recognition [39.295892047505816]
We present a deep learning-based approach to exploit and fuse text and acoustic data for emotion classification.
We use a SincNet layer, based on parameterized sinc functions with band-pass filters, to extract acoustic features from raw audio followed by a DCNN.
For text processing, we use two parallel branches (a DCNN, and a bi-directional RNN followed by a DCNN), with cross-attention introduced to infer N-gram-level correlations.
arXiv Detail & Related papers (2020-09-23T08:07:58Z) - A Novel Transferability Attention Neural Network Model for EEG Emotion
Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns emotionally discriminative information by adaptively highlighting transferable EEG brain-region data and samples.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z) - An End-to-End Visual-Audio Attention Network for Emotion Recognition in
User-Generated Videos [64.91614454412257]
We propose to recognize video emotions in an end-to-end manner based on convolutional neural networks (CNNs).
Specifically, we develop a deep Visual-Audio Attention Network (VAANet), a novel architecture that integrates spatial, channel-wise, and temporal attentions into a visual 3D CNN and temporal attentions into an audio 2D CNN.
arXiv Detail & Related papers (2020-02-12T15:33:59Z) - EEG-based Brain-Computer Interfaces (BCIs): A Survey of Recent Studies
on Signal Sensing Technologies and Computational Intelligence Approaches and
their Applications [65.32004302942218]
Brain-Computer Interface (BCI) is a powerful communication tool between users and systems.
Recent technological advances have increased interest in electroencephalographic (EEG) based BCI for translational and healthcare applications.
arXiv Detail & Related papers (2020-01-28T10:36:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.