Modeling User Empathy Elicited by a Robot Storyteller
- URL: http://arxiv.org/abs/2107.14345v1
- Date: Thu, 29 Jul 2021 21:56:19 GMT
- Title: Modeling User Empathy Elicited by a Robot Storyteller
- Authors: Leena Mathur, Micol Spitale, Hao Xi, Jieyun Li, Maja J Matarić
- Abstract summary: We present the first approach to modeling user empathy elicited during interactions with a robotic agent.
We conducted experiments with 8 classical machine learning models and 2 deep learning models to detect empathy.
Our highest-performing approach, based on XGBoost, achieved an accuracy of 69% and AUC of 72% when detecting empathy in videos.
- Score: 2.309914459672557
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Virtual and robotic agents capable of perceiving human empathy have the
potential to participate in engaging and meaningful human-machine interactions
that support human well-being. Prior research in computational empathy has
focused on designing empathic agents that use verbal and nonverbal behaviors to
simulate empathy and attempt to elicit empathic responses from humans. The
challenge of developing agents with the ability to automatically perceive
elicited empathy in humans remains largely unexplored. Our paper presents the
first approach to modeling user empathy elicited during interactions with a
robotic agent. We collected a new dataset from the novel interaction context of
participants listening to a robot storyteller (46 participants, 6.9 hours of
video). After each storytelling interaction, participants answered a
questionnaire that assessed their level of elicited empathy during the
interaction with the robot. We conducted experiments with 8 classical machine
learning models and 2 deep learning models (long short-term memory networks and
temporal convolutional networks) to detect empathy by leveraging patterns in
participants' visual behaviors while they were listening to the robot
storyteller. Our highest-performing approach, based on XGBoost, achieved an
accuracy of 69% and AUC of 72% when detecting empathy in videos. We contribute
insights regarding modeling approaches and visual features for automated
empathy detection. Our research informs and motivates future development of
empathy perception models that can be leveraged by virtual and robotic agents
during human-machine interactions.
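As a minimal sketch of the detection setup above, the following Python example trains a binary XGBoost classifier on fixed-length vectors of per-video visual features, in the spirit of the paper's highest-performing approach; the synthetic features, label derivation, and hyperparameters are illustrative assumptions, not the authors' exact pipeline:

import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)

# Stand-in data: one row per participant video. In practice, each row would
# summarize frame-level visual behaviors (e.g., facial action units, gaze,
# head pose) over a storytelling interaction; the feature set is hypothetical.
n_videos, n_features = 46, 32
X = rng.normal(size=(n_videos, n_features))
# Binary label: 1 = empathy elicited, as derived from the post-interaction
# questionnaire (synthetic here).
y = rng.integers(0, 2, size=n_videos)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

clf = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
clf.fit(X_train, y_train)

print(f"accuracy = {accuracy_score(y_test, clf.predict(X_test)):.2f}")
print(f"AUC      = {roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]):.2f}")

The paper also evaluates sequence models (LSTMs and temporal convolutional networks) that consume frame-level features directly; the per-video aggregation assumed above is one simple way to frame the task for classical models.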
Related papers
- EmpathicStories++: A Multimodal Dataset for Empathy towards Personal Experiences [19.626851022750067]
EmpathicStories++ is the first longitudinal dataset on empathy, collected over a month-long deployment of social robots in participants' homes.
We introduce a novel task of predicting individuals' empathy toward others' stories based on their personal experiences, evaluated in two contexts.
arXiv Detail & Related papers (2024-05-24T16:57:18Z)
- Artificial Empathy Classification: A Survey of Deep Learning Techniques, Datasets, and Evaluation Scales [0.0]
This paper aims to investigate and evaluate existing works for measuring and evaluating empathy, as well as the datasets that have been collected and used so far.
Our goal is to highlight and facilitate the use of state-of-the-art methods in the area of AE by comparing their performance.
arXiv Detail & Related papers (2023-09-04T16:02:59Z)
- Affordances from Human Videos as a Versatile Representation for Robotics [31.248842798600606]
We train a visual affordance model that estimates where and how in the scene a human is likely to interact.
The structure of these behavioral affordances directly enables the robot to perform many complex tasks.
We show the efficacy of our approach, which we call VRB, across 4 real-world environments, over 10 different tasks, and 2 robotic platforms operating in the wild.
arXiv Detail & Related papers (2023-04-17T17:59:34Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- How Can AI Recognize Pain and Express Empathy [18.71528144336154]
Sensory and emotional experiences such as pain and empathy are relevant to mental and physical health.
The current drive for automated pain recognition is motivated by a growing number of healthcare requirements.
We review the current developments for computational pain recognition and artificial empathy implementation.
arXiv Detail & Related papers (2021-10-08T16:58:57Z)
- CheerBots: Chatbots toward Empathy and Emotion using Reinforcement Learning [60.348822346249854]
This study presents a framework in which empathetic chatbots understand users' implied feelings and reply empathetically over multiple dialogue turns.
We call these chatbots CheerBots. CheerBots can be retrieval-based or generative-based and were finetuned by deep reinforcement learning.
To respond empathetically, we develop a simulated agent, a Conceptual Human Model, that aids CheerBots during training by accounting for future changes in the user's emotional state in order to arouse sympathy.
arXiv Detail & Related papers (2021-10-08T07:44:47Z)
- Exemplars-guided Empathetic Response Generation Controlled by the Elements of Human Communication [88.52901763928045]
We propose an approach that relies on exemplars to cue the generative model on fine stylistic properties that signal empathy to the interlocutor.
We empirically show that these approaches yield significant improvements in empathetic response quality in terms of both automated and human-evaluated metrics.
arXiv Detail & Related papers (2021-06-22T14:02:33Z)
- Cognitive architecture aided by working-memory for self-supervised multi-modal humans recognition [54.749127627191655]
The ability to recognize human partners is an important social skill to build personalized and long-term human-robot interactions.
Deep learning networks have achieved state-of-the-art results and have proven to be suitable tools for this task.
One solution is to make robots learn from their first-hand sensory data with self-supervision.
arXiv Detail & Related papers (2021-03-16T13:50:24Z)
- Affect-Driven Modelling of Robot Personality for Collaborative Human-Robot Interactions [16.40684407420441]
Collaborative interactions require social robots to adapt to the dynamics of human affective behaviour.
We propose a novel framework for personality-driven behaviour generation in social robots.
arXiv Detail & Related papers (2020-10-14T16:34:14Z)
- Joint Mind Modeling for Explanation Generation in Complex Human-Robot Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the explanations generated by our approach significantly improve collaboration performance and users' perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)
- Learning Predictive Models From Observation and Interaction [137.77887825854768]
Learning predictive models from interaction with the world allows an agent, such as a robot, to learn about how the world works.
However, learning a model that captures the dynamics of complex skills represents a major challenge.
We propose a method to augment the training set with observational data of other agents, such as humans.
arXiv Detail & Related papers (2019-12-30T01:10:41Z)
This list is automatically generated from the titles and abstracts of the papers on this site.