Towards emotion recognition for virtual environments: an evaluation of
EEG features on benchmark dataset
- URL: http://arxiv.org/abs/2210.13876v1
- Date: Tue, 25 Oct 2022 10:02:55 GMT
- Title: Towards emotion recognition for virtual environments: an evaluation of
EEG features on benchmark dataset
- Authors: M. L. Menezes, A. Samara, L. Galway, A. Sant'anna, A. Verikas, F.
Alonso-Fernandez, H. Wang, R. Bond
- Abstract summary: This paper investigates features extracted from electroencephalogram signals for the purpose of affective state modelling.
It aims to provide the foundation for future work in modelling user affect to enhance interaction experience in virtual environments.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One of the challenges in virtual environments is the difficulty users have in
interacting with these increasingly complex systems. Ultimately, endowing
machines with the ability to perceive users' emotions will enable a more
intuitive and reliable interaction. Consequently, using the
electroencephalogram as a bio-signal sensor, the affective state of a user can
be modelled and subsequently utilised in order to achieve a system that can
recognise and react to the user's emotions. This paper investigates features
extracted from electroencephalogram signals for the purpose of affective state
modelling based on Russell's Circumplex Model. Investigations are presented
that aim to provide the foundation for future work in modelling user affect to
enhance interaction experience in virtual environments. The DEAP dataset was
used within this work, along with a Support Vector Machine and Random Forest,
which yielded reasonable classification accuracies for Valence and Arousal
using feature vectors based on statistical measurements and band power from the
α, β, γ, and θ waves and High Order Crossing of the EEG signal.
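To make the pipeline described above concrete, the sketch below shows one way per-band power, simple statistical descriptors, and High Order Crossing (HOC) counts could be extracted from EEG epochs and passed to a Support Vector Machine and a Random Forest. It is a minimal illustration only: the band edges, window lengths, feature definitions, and classifier settings are assumptions rather than the configuration used in the paper, and the random arrays stand in for preprocessed DEAP epochs and labels.

```python
# Minimal sketch of a DEAP-style feature-extraction / classification pipeline.
# All names, band edges and hyper-parameters are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 128  # sampling rate of the preprocessed DEAP recordings (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power(epoch, fs=FS):
    """Mean power per frequency band and channel, via Welch's method."""
    freqs, psd = welch(epoch, fs=fs, nperseg=2 * fs, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(feats)

def statistical_features(epoch):
    """Per-channel statistical descriptors of the raw signal."""
    return np.concatenate([epoch.mean(axis=-1),
                           epoch.std(axis=-1),
                           np.abs(np.diff(epoch, axis=-1)).mean(axis=-1)])

def higher_order_crossings(epoch, order=8):
    """HOC features: zero-crossing counts of successively differenced signals."""
    feats = []
    for channel in epoch:
        x = channel - channel.mean()
        for _ in range(order):
            signs = np.signbit(x).astype(np.int8)
            feats.append(np.count_nonzero(np.diff(signs)))
            x = np.diff(x)
    return np.asarray(feats, dtype=float)

def extract_features(epoch):
    return np.concatenate([band_power(epoch),
                           statistical_features(epoch),
                           higher_order_crossings(epoch)])

# Random arrays stand in for one participant's DEAP epochs
# (trials x channels x samples) and binary high/low valence labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 32, 60 * FS))
valence = rng.integers(0, 2, size=40)

X = np.array([extract_features(e) for e in epochs])
for clf in (SVC(kernel="rbf"), RandomForestClassifier(n_estimators=200)):
    print(type(clf).__name__, cross_val_score(clf, X, valence, cv=5).mean())
```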
Related papers
- Emotion Recognition from the perspective of Activity Recognition [0.0]
Appraising human emotional states, behaviors, and reactions displayed in real-world settings can be accomplished using latent continuous dimensions.
For emotion recognition systems to be deployed and integrated into real-world mobile and computing devices, we need to consider data collected in the real world.
We propose a novel three-stream end-to-end deep learning regression pipeline with an attention mechanism.
arXiv Detail & Related papers (2024-03-24T18:53:57Z) - Inter Subject Emotion Recognition Using Spatio-Temporal Features From
EEG Signal [4.316570025748204]
This work presents an easy-to-implement emotion recognition model that classifies emotions from EEG signals in a subject-independent manner.
The model combines regular, depthwise, and separable CNN convolution layers to classify the emotions; a sketch of this layer combination is given after the list of related papers below.
The model achieved an accuracy of 73.04%.
arXiv Detail & Related papers (2023-05-27T07:43:19Z) - EEG2Vec: Learning Affective EEG Representations via Variational
Autoencoders [27.3162026528455]
We explore whether representing neural data, in response to emotional stimuli, in a latent vector space can serve to predict emotional states.
We propose a conditional variational autoencoder based framework, EEG2Vec, to learn generative-discriminative representations from EEG data.
arXiv Detail & Related papers (2022-07-16T19:25:29Z) - Transformer-Based Self-Supervised Learning for Emotion Recognition [0.0]
We propose to use a Transformer-based model to process electrocardiograms (ECG) for emotion recognition.
To overcome the relatively small size of datasets with emotional labels, we employ self-supervised learning.
We show that our approach reaches state-of-the-art performance for emotion recognition using ECG signals on AMIGOS.
arXiv Detail & Related papers (2022-04-08T07:14:55Z) - Multimodal Emotion Recognition using Transfer Learning from Speaker
Recognition and BERT-based models [53.31917090073727]
We propose a neural network-based emotion recognition framework that uses a late fusion of transfer-learned and fine-tuned models from speech and text modalities.
We evaluate the effectiveness of our proposed multimodal approach on the interactive emotional dyadic motion capture dataset.
arXiv Detail & Related papers (2022-02-16T00:23:42Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step towards dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Towards Unbiased Visual Emotion Recognition via Causal Intervention [63.74095927462]
We propose a novel Emotion Recognition Network (IERN) to alleviate the negative effects brought by the dataset bias.
A series of designed tests validate the effectiveness of IERN, and experiments on three emotion benchmarks demonstrate that IERN outperforms other state-of-the-art approaches.
arXiv Detail & Related papers (2021-07-26T10:40:59Z) - Cross-individual Recognition of Emotions by a Dynamic Entropy based on
Pattern Learning with EEG features [2.863100352151122]
We propose a deep-learning framework denoted as a dynamic entropy-based pattern learning (DEPL) to abstract informative indicators pertaining to the neurophysiological features among multiple individuals.
DEPL enhanced the capability of representations generated by a deep convolutional neural network by modelling the interdependencies between the cortical locations of dynamical entropy based features.
arXiv Detail & Related papers (2020-09-26T07:22:07Z) - A Novel Transferability Attention Neural Network Model for EEG Emotion
Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns the emotional discriminative information by adaptively highlighting the transferable EEG brain-region data and samples.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z) - Continuous Emotion Recognition via Deep Convolutional Autoencoder and
Support Vector Regressor [70.2226417364135]
It is crucial that the machine be able to recognize the emotional state of the user with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)
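As referenced in the inter-subject EEG entry above, the sketch below illustrates how regular, depthwise, and separable convolution layers can be combined in a small CNN for EEG classification. It is written in PyTorch with arbitrary layer sizes and an assumed input shape; it is not the architecture or code from that paper.

```python
# Illustrative PyTorch sketch of a CNN mixing regular, depthwise and separable
# convolutions for EEG classification. Layer sizes and input shape are assumptions.
import torch
import torch.nn as nn

class SeparableConv2d(nn.Module):
    """Depthwise convolution followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch, kernel):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel, padding="same", groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

class EEGEmotionCNN(nn.Module):
    """Toy combination of regular, depthwise and separable convolutions."""
    def __init__(self, n_channels=32, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, (1, 64), padding="same"),      # regular temporal convolution
            nn.BatchNorm2d(8),
            nn.Conv2d(8, 16, (n_channels, 1), groups=8),   # depthwise spatial convolution
            nn.BatchNorm2d(16), nn.ELU(), nn.AvgPool2d((1, 4)),
            SeparableConv2d(16, 16, (1, 16)),              # separable temporal convolution
            nn.BatchNorm2d(16), nn.ELU(), nn.AvgPool2d((1, 8)),
            nn.Flatten(),
        )
        self.classifier = nn.LazyLinear(n_classes)

    def forward(self, x):                                  # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x))

model = EEGEmotionCNN()
logits = model(torch.randn(4, 1, 32, 128))                 # dummy batch of 1-second windows
print(logits.shape)                                        # -> torch.Size([4, 2])
```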
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.