Emotion Recognition With Temporarily Localized 'Emotional Events' in
Naturalistic Context
- URL: http://arxiv.org/abs/2211.02637v1
- Date: Tue, 25 Oct 2022 10:01:40 GMT
- Title: Emotion Recognition With Temporarily Localized 'Emotional Events' in
Naturalistic Context
- Authors: Mohammad Asif and Sudhakar Mishra and Majithia Tejas Vinodbhai and Uma
Shanker Tiwary
- Abstract summary: We use EEG signals to classify emotional events on different combinations of Valence(V) and Arousal(A) dimensions.
Having precise information about emotional feelings improves the classification accuracy compared to long-duration EEG signals.
- Score: 1.7205106391379026
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Emotion recognition using EEG signals is an emerging area of research due to
its broad applicability in BCI. Emotional feelings are hard to stimulate in the
lab. Emotions do not last long, yet they need enough context to be perceived
and felt. However, most EEG-related emotion databases either suffer from
emotionally irrelevant details (due to prolonged stimulus duration) or provide
so little context that it is doubtful any emotion is actually felt from the
stimulus. We tried to reduce the impact of this trade-off by designing an
experiment in which participants are free to report their emotional feelings
while watching the emotional stimulus. We called these reported emotional feelings
"Emotional Events" in our Dataset on Emotion with Naturalistic Stimuli (DENS).
We used EEG signals to classify emotional events on different combinations of
Valence(V) and Arousal(A) dimensions and compared the results with benchmark
datasets of DEAP and SEED. STFT is used for feature extraction, and the
extracted features are fed to a classification model consisting of CNN-LSTM
hybrid layers. We achieved significantly higher accuracy with our data than
with the DEAP and SEED data. We
conclude that having precise information about emotional feelings improves the
classification accuracy compared to long-duration EEG signals which might be
contaminated by mind-wandering.
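The pipeline described in the abstract (STFT spectrograms from short "emotional event" EEG segments, fed to a CNN-LSTM classifier) can be sketched at the feature-extraction stage. This is a minimal sketch, not the paper's implementation: the sampling rate, window length, and channel count below are illustrative assumptions.

```python
# Sketch: STFT magnitude spectrograms from a short EEG segment,
# suitable as 2-D input to a downstream CNN-LSTM. All parameters
# (fs=128 Hz, nperseg=64, 4 channels) are illustrative assumptions.
import numpy as np
from scipy.signal import stft

def eeg_to_spectrograms(eeg, fs=128, nperseg=64):
    """eeg: (channels, samples) -> (channels, freq_bins, time_frames)."""
    specs = []
    for ch in eeg:
        _, _, Z = stft(ch, fs=fs, nperseg=nperseg)
        specs.append(np.abs(Z))  # magnitude spectrogram per channel
    return np.stack(specs)

# One synthetic 2-second "emotional event" over 4 channels.
rng = np.random.default_rng(0)
event = rng.standard_normal((4, 2 * 128))
feats = eeg_to_spectrograms(event)
```

Each channel's spectrogram is a frequency-by-time image, so the stack can be treated as a multi-channel image by convolutional layers, with the time axis then unrolled for the LSTM stage.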
Related papers
- Smile upon the Face but Sadness in the Eyes: Emotion Recognition based on Facial Expressions and Eye Behaviors [63.194053817609024]
We introduce eye behaviors as important emotional cues for the creation of a new Eye-behavior-aided Multimodal Emotion Recognition dataset.
For the first time, we provide annotations for both Emotion Recognition (ER) and Facial Expression Recognition (FER) in the EMER dataset.
We specifically design a new EMERT architecture to concurrently enhance performance in both ER and FER.
arXiv Detail & Related papers (2024-11-08T04:53:55Z) - EMO-KNOW: A Large Scale Dataset on Emotion and Emotion-cause [8.616061735005314]
We introduce a large-scale dataset of emotion causes, derived from 9.8 million cleaned tweets over 15 years.
The novelty of our dataset stems from its broad spectrum of emotion classes and the abstractive emotion cause.
Our dataset will enable the design of emotion-aware systems that account for the diverse emotional responses of different people.
arXiv Detail & Related papers (2024-06-18T08:26:33Z) - Think out Loud: Emotion Deducing Explanation in Dialogues [57.90554323226896]
We propose a new task "Emotion Deducing Explanation in Dialogues" (EDEN)
EDEN recognizes emotion and causes in an explicitly thinking way.
It can help Large Language Models (LLMs) achieve better recognition of emotions and causes.
arXiv Detail & Related papers (2024-06-07T08:58:29Z) - Language Models (Mostly) Do Not Consider Emotion Triggers When Predicting Emotion [87.18073195745914]
We investigate how well human-annotated emotion triggers correlate with features deemed salient in their prediction of emotions.
Using EmoTrigger, we evaluate the ability of large language models to identify emotion triggers.
Our analysis reveals that emotion triggers are largely not considered salient features by emotion prediction models; instead, there is an intricate interplay between various features and the task of emotion detection.
arXiv Detail & Related papers (2023-11-16T06:20:13Z) - Emotion Analysis on EEG Signal Using Machine Learning and Neural Network [0.0]
The main purpose of this study is to improve emotion recognition performance using brain signals.
Various approaches to human-machine interaction technologies have been ongoing for a long time, and in recent years, researchers have had great success in automatically understanding emotion using brain signals.
arXiv Detail & Related papers (2023-07-09T09:50:34Z) - Inter Subject Emotion Recognition Using Spatio-Temporal Features From
EEG Signal [4.316570025748204]
This work is about an easy-to-implement emotion recognition model that classifies emotions from EEG signals subject independently.
The model is a combination of regular, depthwise and separable convolution layers of CNN to classify the emotions.
The model achieved an accuracy of 73.04%.
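The summary above names regular, depthwise, and separable convolutions as the model's building blocks. As a hedged illustration of the depthwise-separable idea (not the paper's actual architecture), the core operation can be written in plain NumPy: one spatial filter per input channel, followed by a 1x1 pointwise mixing across channels.

```python
# Minimal sketch of a depthwise-separable convolution (no padding,
# stride 1). Shapes and values are illustrative assumptions only.
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_weights):
    """x: (C, H, W); dw_kernels: (C, k, k); pw_weights: (C_out, C)."""
    C, H, W = x.shape
    k = dw_kernels.shape[1]
    oh, ow = H - k + 1, W - k + 1
    # Depthwise stage: each channel is filtered independently.
    dw = np.zeros((C, oh, ow))
    for c in range(C):
        for i in range(oh):
            for j in range(ow):
                dw[c, i, j] = np.sum(x[c, i:i + k, j:j + k] * dw_kernels[c])
    # Pointwise stage: a 1x1 convolution mixes the channels.
    return np.tensordot(pw_weights, dw, axes=([1], [0]))

x = np.ones((3, 8, 8))
out = depthwise_separable_conv(x, np.ones((3, 3, 3)), np.ones((5, 3)))
```

The appeal for EEG models is parameter economy: a depthwise-separable layer uses roughly C*k*k + C_out*C weights instead of C_out*C*k*k for a regular convolution.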
arXiv Detail & Related papers (2023-05-27T07:43:19Z) - Emotion Intensity and its Control for Emotional Voice Conversion [77.05097999561298]
Emotional voice conversion (EVC) seeks to convert the emotional state of an utterance while preserving the linguistic content and speaker identity.
In this paper, we aim to explicitly characterize and control the intensity of emotion.
We propose to disentangle the speaker style from linguistic content and encode the speaker style into a style embedding in a continuous space that forms the prototype of emotion embedding.
arXiv Detail & Related papers (2022-01-10T02:11:25Z) - Progressive Graph Convolution Network for EEG Emotion Recognition [35.08010382523394]
Studies in the area of neuroscience have revealed the relationship between emotional patterns and brain functional regions.
In EEG emotion recognition, we can observe that clearer boundaries exist between coarse-grained emotions than those between fine-grained emotions.
We propose a progressive graph convolution network (PGCN) for capturing this inherent characteristic in EEG emotional signals.
arXiv Detail & Related papers (2021-12-14T03:30:13Z) - SOLVER: Scene-Object Interrelated Visual Emotion Reasoning Network [83.27291945217424]
We propose a novel Scene-Object interreLated Visual Emotion Reasoning network (SOLVER) to predict emotions from images.
To mine the emotional relationships between distinct objects, we first build up an Emotion Graph based on semantic concepts and visual features.
We also design a Scene-Object Fusion Module to integrate scenes and objects, which exploits scene features to guide the fusion process of object features with the proposed scene-based attention mechanism.
arXiv Detail & Related papers (2021-10-24T02:41:41Z) - A Circular-Structured Representation for Visual Emotion Distribution
Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
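The Emotion Circle summary above maps each emotional state to a vector with three attributes. One plausible reading, sketched below under the standard valence-arousal circumplex assumption (the paper's exact attribute definitions may differ), is a direction on the circle, an intensity, and a polarity.

```python
# Hypothetical sketch: place a (valence, arousal) state on an emotion
# circle. The three returned attributes (angle, intensity, polarity)
# are an illustrative assumption, not the paper's definition.
import math

def emotion_vector(valence, arousal):
    """Map a state in [-1, 1]^2 to (angle, intensity, polarity)."""
    angle = math.atan2(arousal, valence)            # direction on the circle
    intensity = min(1.0, math.hypot(valence, arousal))  # clipped radius
    polarity = 1 if valence >= 0 else -1            # positive vs negative
    return angle, intensity, polarity
```

Under this reading, a distribution over emotions becomes a set of such vectors, and distances on the circle give a natural similarity between emotional states.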
arXiv Detail & Related papers (2021-06-23T14:53:27Z) - Detecting Emotion Primitives from Speech and their use in discerning
Categorical Emotions [16.886826928295203]
Emotion plays an essential role in human-to-human communication, enabling us to convey feelings such as happiness, frustration, and sincerity.
This work investigated how emotion primitives can be used to detect categorical emotions such as happiness, disgust, contempt, anger, and surprise from neutral speech.
Results indicated that arousal, followed by dominance, was the better detector of such emotions.
arXiv Detail & Related papers (2020-01-31T03:11:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.