Emotion-Oriented Behavior Model Using Deep Learning
- URL: http://arxiv.org/abs/2311.14674v1
- Date: Sat, 28 Oct 2023 17:27:59 GMT
- Title: Emotion-Oriented Behavior Model Using Deep Learning
- Authors: Muhammad Arslan Raza, Muhammad Shoaib Farooq, Adel Khelifi, Atif Alvi
- Abstract summary: The accuracy of emotion-based behavior predictions is statistically validated using a two-tailed Pearson correlation.
This study is a stepping stone toward multi-faceted artificial agent interaction based on emotion-oriented behaviors.
- Score: 0.9176056742068812
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Emotions, as a fundamental ingredient of any social interaction, lead to
behaviors that represent the effectiveness of the interaction through facial
expressions and gestures in humans. Hence an agent must possess the social and
cognitive abilities to understand human social parameters and behave
accordingly. However, no such emotion-oriented behavior model has yet been
presented in the existing research. Emotion prediction can drive appropriate
agent behaviors for effective interaction through the conversation modality.
Considering the importance of emotions and behaviors for an agent's social
interaction, an Emotion-based Behavior model for socio-cognitive artificial
agents is presented in this paper. The proposed model is implemented on tweet
data using multiple models, namely Long Short-Term Memory (LSTM), Convolutional
Neural Network (CNN), and Bidirectional Encoder Representations from
Transformers (BERT), for emotion prediction, with average accuracies of 92%
and 55%, respectively. Further, using the emotion predictions from CNN-LSTM,
the behavior module responds with facial expressions and gestures expressed in
Behavioral Markup Language (BML). The accuracy of the emotion-based behavior
predictions is statistically validated using a two-tailed Pearson correlation
on data collected from human users through questionnaires. The analysis shows
that all emotion-based behaviors accurately depict human-like gestures and
facial expressions, based on significant correlations at the 0.01 and 0.05
levels. This study is a stepping stone toward multi-faceted artificial agent
interaction based on emotion-oriented behaviors, given the significance of
cognition in social interaction among humans.
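To make the pipeline in the abstract concrete, the sketch below strings together its three pieces in Python: a CNN-LSTM text classifier for emotion prediction, a mapping from the predicted emotion to a BML block that drives face and gesture, and a two-tailed Pearson correlation check against questionnaire ratings. It is a minimal illustration, not the authors' implementation; the layer sizes, emotion label set, BML lexeme names, and the `emotion_to_bml` helper are all assumptions made for the example.

```python
# Minimal sketch of an emotion-to-behavior pipeline (hypothetical; not the paper's code).
import numpy as np
from scipy.stats import pearsonr
from tensorflow.keras import layers, models

EMOTIONS = ["joy", "sadness", "anger", "fear"]  # assumed label set

def build_cnn_lstm(vocab_size=20000):
    """CNN-LSTM classifier: Conv1D captures local n-gram features, LSTM models sequence context."""
    model = models.Sequential([
        layers.Embedding(vocab_size, 128),
        layers.Conv1D(64, 5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.LSTM(64),
        layers.Dense(len(EMOTIONS), activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def emotion_to_bml(emotion: str) -> str:
    """Map a predicted emotion to a BML block (lexeme names are illustrative only)."""
    face = {"joy": "smile", "sadness": "frown", "anger": "brow_lower", "fear": "eyes_widen"}
    return (
        '<bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="bml1">'
        f'<faceLexeme id="f1" lexeme="{face[emotion]}" amount="0.8"/>'
        f'<gesture id="g1" lexeme="{emotion}_gesture"/>'
        '</bml>'
    )

# Validation idea from the abstract: correlate behavior ratings produced for the
# model-driven agent with questionnaire ratings from human users (toy numbers).
agent_ratings = np.array([4, 5, 3, 4, 5, 2])
human_ratings = np.array([4, 4, 3, 5, 5, 2])
r, p = pearsonr(agent_ratings, human_ratings)  # two-tailed p-value by default
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
print(emotion_to_bml("joy"))
```

A real pipeline would tokenize and pad the tweets before training `build_cnn_lstm`, and would send the generated BML string to a BML-compliant behavior realizer (for example, SmartBody) rather than printing it.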
Related papers
- CAPE: A Chinese Dataset for Appraisal-based Emotional Generation using Large Language Models [30.40159858361768]
We introduce a two-stage automatic data generation framework to create CAPE, a Chinese Cognitive Appraisal theory-based Emotional corpus.
This corpus facilitates the generation of dialogues with contextually appropriate emotional responses by accounting for diverse personal and situational factors.
Our study shows the potential for advancing emotional expression in conversational agents, paving the way for more nuanced and meaningful human-computer interactions.
arXiv Detail & Related papers (2024-10-18T03:33:18Z)
- Personality-affected Emotion Generation in Dialog Systems [67.40609683389947]
We propose a new task, Personality-affected Emotion Generation, to generate emotion based on the personality given to the dialog system.
We analyze the challenges in this task, i.e., (1) heterogeneously integrating personality and emotional factors and (2) extracting multi-granularity emotional information in the dialog context.
Results suggest that, by adopting our method, emotion generation performance improves by 13% in macro-F1 and 5% in weighted-F1 over the BERT-base model.
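As a side note on the metrics quoted here: macro-F1 averages per-class F1 scores equally, while weighted-F1 weights them by class support, so the two can move differently when classes are imbalanced. The toy snippet below (hypothetical labels, not the paper's data) shows the distinction with scikit-learn.

```python
# Toy comparison of macro-F1 vs. weighted-F1 (hypothetical labels, not the paper's data).
from sklearn.metrics import f1_score

y_true = ["joy", "joy", "joy", "joy", "anger", "sadness"]
y_pred = ["joy", "joy", "joy", "anger", "anger", "joy"]

# macro: unweighted mean of per-class F1 (rare classes count as much as frequent ones)
print("macro-F1:   ", f1_score(y_true, y_pred, average="macro"))
# weighted: per-class F1 weighted by class support (frequent classes dominate)
print("weighted-F1:", f1_score(y_true, y_pred, average="weighted"))
```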
arXiv Detail & Related papers (2024-04-03T08:48:50Z)
- Language Models (Mostly) Do Not Consider Emotion Triggers When Predicting Emotion [87.18073195745914]
We investigate how well human-annotated emotion triggers correlate with features deemed salient in their prediction of emotions.
Using EmoTrigger, we evaluate the ability of large language models to identify emotion triggers.
Our analysis reveals that emotion triggers are largely not considered salient features by emotion prediction models; instead, there is an intricate interplay between various features and the task of emotion detection.
arXiv Detail & Related papers (2023-11-16T06:20:13Z)
- Computer Vision Estimation of Emotion Reaction Intensity in the Wild [1.5481864635049696]
We describe our submission to the newly introduced Emotional Reaction Intensity (ERI) Estimation challenge.
We developed four deep neural networks trained in the visual domain and a multimodal model trained with both visual and audio features to predict emotion reaction intensity.
arXiv Detail & Related papers (2023-03-19T19:09:41Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- The world seems different in a social context: a neural network analysis of human experimental data [57.729312306803955]
We show that it is possible to replicate human behavioral data in both individual and social task settings by modifying the precision of prior and sensory signals.
An analysis of the neural activation traces of the trained networks provides evidence that information is coded in fundamentally different ways in the network in the individual and in the social conditions.
arXiv Detail & Related papers (2022-03-03T17:19:12Z)
- Enhancing Cognitive Models of Emotions with Representation Learning [58.2386408470585]
We present a novel deep learning-based framework to generate embedding representations of fine-grained emotions.
Our framework integrates a contextualized embedding encoder with a multi-head probing model.
Our model is evaluated on the Empathetic Dialogue dataset and achieves state-of-the-art results for classifying 32 emotions.
arXiv Detail & Related papers (2021-04-20T16:55:15Z)
- Emotion pattern detection on facial videos using functional statistics [62.997667081978825]
We propose a technique based on Functional ANOVA to extract significant patterns of facial muscle movements.
We determine whether there are time-related differences in expressions among emotional groups by using a functional F-test.
arXiv Detail & Related papers (2021-03-01T08:31:08Z)
- Prediction of Human Empathy based on EEG Cortical Asymmetry [0.0]
Lateralization of brain oscillations at specific frequency bands is an important predictor of self-reported empathy scores.
Results could be employed in the development of brain-computer interfaces that assist people with difficulties in expressing or recognizing emotions.
arXiv Detail & Related papers (2020-05-06T13:49:56Z)
- Generating Emotionally Aligned Responses in Dialogues using Affect Control Theory [15.848210524718219]
Affect Control Theory (ACT) is a socio-mathematical model of emotions for human-human interactions.
We investigate how ACT can be used to develop affect-aware neural conversational agents.
arXiv Detail & Related papers (2020-03-07T19:31:08Z)
- SensAI+Expanse Emotional Valence Prediction Studies with Cognition and Memory Integration [0.0]
This work contributes an artificially intelligent agent able to assist in cognitive science studies.
The developed artificial agent system (SensAI+Expanse) includes machine learning algorithms, empathetic algorithms, and memory.
Results of the present study show evidence of significant emotional behaviour differences between some age ranges and gender combinations.
arXiv Detail & Related papers (2020-01-03T18:17:57Z)