The BIRAFFE2 Experiment. Study in Bio-Reactions and Faces for
Emotion-based Personalization for AI Systems
- URL: http://arxiv.org/abs/2007.15048v2
- Date: Mon, 9 Nov 2020 20:11:03 GMT
- Title: The BIRAFFE2 Experiment. Study in Bio-Reactions and Faces for
Emotion-based Personalization for AI Systems
- Authors: Krzysztof Kutt (1), Dominika Drążyk (1), Maciej Szelążek
(2), Szymon Bobek (1), Grzegorz J. Nalepa (1) ((1) Jagiellonian University,
Poland, (2) AGH University of Science and Technology, Poland)
- Abstract summary: We present a unified paradigm for capturing the emotional responses of different persons.
We provide a framework that can be easily used and extended for machine learning purposes.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The paper describes the BIRAFFE2 data set, the result of an affective
computing experiment conducted in 2019-2020 that aimed to develop computer
models for the classification and recognition of emotion. Such work is
important for developing new methods of natural human-AI interaction. As we
believe that models of emotion should be personalized by design, we present a
unified paradigm for capturing the emotional responses of different persons
that takes individual personality differences into account. We combine
classical psychological paradigms of emotional response collection with a
newer approach based on observing a computer game player. By capturing one's
psycho-physiological reactions (ECG and EDA signal recording), facial
expressions (facial emotion recognition), subjective valence-arousal ratings
(widget ratings) and gameplay progression (accelerometer and screencast
recording), we provide a framework that can be easily used and extended for
machine learning purposes.
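As an illustration of how such synchronized multimodal streams might be
consumed, here is a minimal Python sketch that aligns physiological samples
with widget ratings on a shared timeline; the file names and column layouts
are hypothetical, not the data set's actual schema.

    # Minimal sketch: aligning BIRAFFE2-style multimodal streams.
    # File names and column layouts are hypothetical, not the data
    # set's actual schema.
    import pandas as pd

    signals = pd.read_csv("subject01_ecg_eda.csv")  # timestamp, ecg, eda
    ratings = pd.read_csv("subject01_ratings.csv")  # timestamp, valence, arousal

    for df in (signals, ratings):
        # assume numeric timestamps in seconds since experiment start
        df["timestamp"] = pd.to_datetime(df["timestamp"], unit="s")
        df.sort_values("timestamp", inplace=True)

    # Attach each rating to the nearest preceding physiological sample
    # (within 1 s), giving one merged table for feature extraction.
    merged = pd.merge_asof(ratings, signals, on="timestamp",
                           direction="backward",
                           tolerance=pd.Timedelta("1s"))
    print(merged.head())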
Related papers
- Personality-affected Emotion Generation in Dialog Systems [67.40609683389947]
We propose a new task, Personality-affected Emotion Generation, to generate emotion based on the personality given to the dialog system.
We analyze the challenges in this task, i.e., (1) heterogeneously integrating personality and emotional factors and (2) extracting multi-granularity emotional information in the dialog context.
Results suggest that our method improves emotion generation performance by 13% in macro-F1 and 5% in weighted-F1 over the BERT-base model.
arXiv Detail & Related papers (2024-04-03T08:48:50Z)
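Since the entry above reports gains in two different F1 averages, here is a
short scikit-learn sketch of how macro-F1 and weighted-F1 differ; the labels
are made up for illustration.

    # Macro-F1 averages per-class F1 scores equally; weighted-F1 weights
    # each class by its support, so frequent emotions dominate.
    from sklearn.metrics import f1_score

    y_true = ["joy", "joy", "anger", "sadness", "joy", "anger"]
    y_pred = ["joy", "anger", "anger", "sadness", "joy", "joy"]

    print(f1_score(y_true, y_pred, average="macro"))
    print(f1_score(y_true, y_pred, average="weighted"))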
- Real-time EEG-based Emotion Recognition Model using Principal Component Analysis and Tree-based Models for Neurohumanities [0.0]
This project proposes a solution that incorporates emotional monitoring into the process of learning context inside an immersive space.
A real-time EEG-based emotion detection system was developed to interpret and classify specific emotions.
This system aims to integrate emotional data into the Neurohumanities Lab interactive platform, creating a comprehensive and immersive learning environment.
arXiv Detail & Related papers (2024-01-28T20:02:13Z)
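A minimal scikit-learn sketch of the PCA-plus-tree-ensemble pattern the entry
above describes; synthetic vectors stand in for real EEG features (e.g.
per-channel band powers), and all sizes are illustrative.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 64))    # 200 windows x 64 EEG features
    y = rng.integers(0, 4, size=200)  # 4 emotion classes (placeholder)

    # Reduce dimensionality with PCA, then classify with a tree ensemble.
    clf = make_pipeline(PCA(n_components=16),
                        RandomForestClassifier(n_estimators=100,
                                               random_state=0))
    clf.fit(X[:150], y[:150])
    print(clf.score(X[150:], y[150:]))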
- Computer Vision Estimation of Emotion Reaction Intensity in the Wild [1.5481864635049696]
We describe our submission to the newly introduced Emotional Reaction Intensity (ERI) Estimation challenge.
We developed four deep neural networks trained in the visual domain and a multimodal model trained with both visual and audio features to predict emotion reaction intensity.
arXiv Detail & Related papers (2023-03-19T19:09:41Z)
- Facial Expression Recognition using Squeeze and Excitation-powered Swin Transformers [0.0]
We propose a framework that employs Swin Vision Transformers (SwinT) and a squeeze-and-excitation (SE) block to address vision tasks.
Our focus was to create an efficient FER model based on SwinT architecture that can recognize facial emotions using minimal data.
We trained our model on a hybrid dataset and evaluated its performance on the AffectNet dataset, achieving an F1-score of 0.5420.
arXiv Detail & Related papers (2023-01-26T02:29:17Z)
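A self-contained PyTorch sketch of a generic squeeze-and-excitation block of
the kind the entry above pairs with a Swin backbone; the channel count and
feature-map size are arbitrary, not the paper's exact configuration.

    import torch
    import torch.nn as nn

    class SEBlock(nn.Module):
        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            self.fc = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
                nn.Sigmoid(),
            )

        def forward(self, x):           # x: (batch, channels, H, W)
            scale = x.mean(dim=(2, 3))  # squeeze: global average pooling
            scale = self.fc(scale)      # excitation: per-channel weights
            return x * scale[:, :, None, None]

    feats = torch.randn(2, 96, 7, 7)    # e.g. a Swin stage's feature map
    print(SEBlock(96)(feats).shape)     # torch.Size([2, 96, 7, 7])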
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- Multimodal Emotion Recognition using Transfer Learning from Speaker Recognition and BERT-based models [53.31917090073727]
We propose a neural network-based emotion recognition framework that uses a late fusion of transfer-learned and fine-tuned models from speech and text modalities.
We evaluate the effectiveness of our proposed multimodal approach on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) dataset.
arXiv Detail & Related papers (2022-02-16T00:23:42Z)
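A minimal sketch of the late-fusion step the entry above describes: each
modality model produces its own emotion logits, which are combined only at
the end. The logits below are random placeholders for the speech- and
text-model outputs.

    import torch

    speech_logits = torch.randn(8, 6)  # batch of 8, 6 emotion classes
    text_logits = torch.randn(8, 6)

    # Average the per-modality probabilities; a learned fusion layer is
    # a common alternative.
    probs = (speech_logits.softmax(-1) + text_logits.softmax(-1)) / 2
    print(probs.argmax(dim=-1))        # fused emotion predictions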
- Multi-Task Learning of Generation and Classification for Emotion-Aware Dialogue Response Generation [9.398596037077152]
We propose a neural response generation model with multi-task learning of generation and classification, focusing on emotion.
Our model based on BART, a pre-trained transformer encoder-decoder model, is trained to generate responses and recognize emotions simultaneously.
arXiv Detail & Related papers (2021-05-25T06:41:20Z)
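A generic PyTorch sketch of the multi-task objective in the entry above: one
shared model trained on a generation loss plus an emotion-classification
loss. A toy embedding encoder stands in for BART, and all shapes and labels
are illustrative.

    import torch
    import torch.nn as nn

    encoder = nn.Embedding(1000, 64)          # toy stand-in for BART
    gen_head = nn.Linear(64, 1000)            # next-token prediction head
    cls_head = nn.Linear(64, 6)               # 6 emotion classes

    tokens = torch.randint(0, 1000, (4, 10))  # batch of 4 dialog contexts
    h = encoder(tokens)                       # (4, 10, 64)

    gen_loss = nn.functional.cross_entropy(
        gen_head(h[:, :-1]).reshape(-1, 1000),  # predict token t+1
        tokens[:, 1:].reshape(-1))
    cls_loss = nn.functional.cross_entropy(
        cls_head(h.mean(dim=1)),                # pooled dialog state
        torch.randint(0, 6, (4,)))              # placeholder emotion labels

    loss = gen_loss + cls_loss                  # joint multi-task loss
    loss.backward()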
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and achieves state-of-the-art results in classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
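A minimal sketch of the probing idea from the entry above: a small classifier
trained on top of frozen embeddings. Random vectors stand in for real encoder
outputs; only the 32-class setup mirrors the entry.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    emb = rng.normal(size=(640, 768))       # frozen utterance embeddings
    labels = rng.integers(0, 32, size=640)  # 32 fine-grained emotions

    probe = LogisticRegression(max_iter=1000).fit(emb[:500], labels[:500])
    print(probe.score(emb[500:], labels[500:]))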
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of face muscle movements.
We determine whether there are time-related differences in expressions among emotional groups by using a functional F-test.
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
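As a simplified stand-in for the functional F-test in the entry above, the
sketch below runs a pointwise one-way ANOVA at each time step across emotion
groups; the paper's functional approach tests whole curves jointly, and all
curves here are synthetic.

    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(0)
    T = 50                                      # time points per video
    # 3 emotion groups, 20 muscle-movement curves each (synthetic)
    groups = [rng.normal(loc=m, size=(20, T)) for m in (0.0, 0.2, 0.5)]

    f_stats = [f_oneway(*(g[:, t] for g in groups)).statistic
               for t in range(T)]
    print(f"max pointwise F: {max(f_stats):.2f}")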
- Language Models as Emotional Classifiers for Textual Conversations [3.04585143845864]
We present a novel methodology for classifying emotion in a conversation.
At the backbone of our proposed methodology is a pre-trained Language Model (LM).
We apply our proposed methodology on the IEMOCAP and Friends data sets.
arXiv Detail & Related papers (2020-08-27T20:04:30Z)
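A minimal Hugging Face Transformers sketch of the backbone idea in the entry
above: a pretrained LM with a sequence-classification head over emotion
labels. The checkpoint and label count are placeholders, and the
IEMOCAP/Friends preprocessing is omitted.

    import torch
    from transformers import (AutoModelForSequenceClassification,
                              AutoTokenizer)

    tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=6)  # e.g. 6 emotion classes

    inputs = tok("I can't believe we finally did it!", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits     # head is untrained here
    print(logits.argmax(dim=-1))            # predicted emotion index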
- Continuous Emotion Recognition via Deep Convolutional Autoencoder and Support Vector Regressor [70.2226417364135]
It is crucial that the machine be able to recognize the user's emotional state with high accuracy.
Deep neural networks have been used with great success in recognizing emotions.
We present a new model for continuous emotion recognition based on facial expression recognition.
arXiv Detail & Related papers (2020-01-31T17:47:16Z)
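A sketch of the regression stage only from the entry above: a support vector
regressor mapping autoencoder-style face features to a continuous affect
value such as valence. The features are random stand-ins for real encoder
outputs.

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    feats = rng.normal(size=(300, 128))     # stand-in encoder outputs
    valence = rng.uniform(-1, 1, size=300)  # continuous target in [-1, 1]

    svr = SVR(kernel="rbf").fit(feats[:250], valence[:250])
    print(svr.predict(feats[250:255]))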
This list is automatically generated from the titles and abstracts of the papers on this site.