Developing Social Robots with Empathetic Non-Verbal Cues Using Large
Language Models
- URL: http://arxiv.org/abs/2308.16529v1
- Date: Thu, 31 Aug 2023 08:20:04 GMT
- Title: Developing Social Robots with Empathetic Non-Verbal Cues Using Large
Language Models
- Authors: Yoon Kyung Lee, Yoonwon Jung, Gyuyi Kang, Sowon Hahn
- Abstract summary: We design and label four types of empathetic non-verbal cues, abbreviated as SAFE: Speech, Action (gesture), Facial expression, and Emotion, in a social robot.
Preliminary results show distinct patterns in the robot's responses, such as a preference for calm and positive social emotions like 'joy' and 'lively', and frequent nodding gestures.
Our work lays the groundwork for future studies on human-robot interactions, emphasizing the essential role of both verbal and non-verbal cues in creating social and empathetic robots.
- Score: 2.5489046505746704
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We propose augmenting the empathetic capacities of social robots by
integrating non-verbal cues. Our primary contribution is the design and
labeling of four types of empathetic non-verbal cues, abbreviated as SAFE:
Speech, Action (gesture), Facial expression, and Emotion, in a social robot.
These cues are generated using a Large Language Model (LLM). We developed an
LLM-based conversational system for the robot and assessed its alignment with
social cues as defined by human counselors. Preliminary results show distinct
patterns in the robot's responses, such as a preference for calm and positive
social emotions like 'joy' and 'lively', and frequent nodding gestures. Despite
these tendencies, our approach has led to the development of a social robot
capable of context-aware and more authentic interactions. Our work lays the
groundwork for future studies on human-robot interactions, emphasizing the
essential role of both verbal and non-verbal cues in creating social and
empathetic robots.
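The paper itself does not include an implementation, but the abstract describes the core pipeline: an LLM-based conversational system that produces responses annotated with the four SAFE cues (Speech, Action, Facial expression, Emotion). Below is a minimal sketch of how such a system could be prompted; the prompt wording, the `generate_safe_response` helper, the placeholder model name, and the use of the OpenAI chat-completions client are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch (not the authors' code): prompting an LLM to return
# a reply annotated with the four SAFE cues described in the abstract.
import json
from openai import OpenAI  # assumes the openai>=1.0 Python client is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SAFE_PROMPT = (
    "You are an empathetic social robot. Reply to the user and return JSON "
    "with four fields: speech (what to say), action (a gesture such as 'nod' "
    "or 'tilt head'), facial_expression (e.g. 'smile'), and emotion "
    "(a calm, positive social emotion such as 'joy' or 'lively')."
)

def generate_safe_response(user_utterance: str) -> dict:
    """Ask the LLM for a response labeled with Speech, Action, Facial expression, Emotion."""
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name, not the model used in the paper
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": SAFE_PROMPT},
            {"role": "user", "content": user_utterance},
        ],
    )
    return json.loads(completion.choices[0].message.content)

if __name__ == "__main__":
    cues = generate_safe_response("I had a really rough day at work.")
    # Each SAFE field could be routed to the robot's speech, motion, and face modules.
    print(cues["speech"], cues["action"], cues["facial_expression"], cues["emotion"])
```

Each returned field could then be routed to the robot's speech-synthesis, gesture, and facial-expression modules; the paper reports assessing the alignment of such generated cues with social cues defined by human counselors.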
Related papers
- Survey of Design Paradigms for Social Robots [10.618592615516901]
Social robots leverage multimodal communication, incorporating speech, facial expressions, and gestures to enhance user engagement and emotional support.
Understanding the design paradigms of social robots is hindered by the complexity of such systems and the need to tune them to specific tasks.
This article provides a structured review of social robot design paradigms, categorizing them into cognitive architectures, role design models, linguistic models, communication flow, activity system models, and integrated design models.
arXiv Detail & Related papers (2024-07-30T05:22:31Z)
- Real-time Addressee Estimation: Deployment of a Deep-Learning Model on the iCub Robot [52.277579221741746]
Addressee Estimation is a skill essential for social robots to interact smoothly with humans.
Inspired by human perceptual skills, a deep-learning model for Addressee Estimation is designed, trained, and deployed on an iCub robot.
The study presents the procedure of such implementation and the performance of the model deployed in real-time human-robot interaction.
arXiv Detail & Related papers (2023-11-09T13:01:21Z)
- To Whom are You Talking? A Deep Learning Model to Endow Social Robots with Addressee Estimation Skills [44.497086629717074]
We tackle the problem of Addressee Estimation, the ability to understand an utterance's addressee, by interpreting and exploiting non-verbal bodily cues from the speaker.
We do so by implementing a hybrid deep learning model composed of convolutional layers and LSTM cells, which takes as input images of the speaker's face and 2D vectors of the speaker's body posture.
We demonstrate that our model is able to solve the Addressee Estimation problem in terms of addressee localisation in space, from a robot ego-centric point of view. (A minimal, assumption-based sketch of such a hybrid CNN+LSTM model appears after this list.)
arXiv Detail & Related papers (2023-08-21T14:43:42Z)
- Social Assistive Robotics for Autistic Children [56.524774292536264]
The goal of the project is to test autistic children's interactions with the social robot NAO.
The innovative aspect of the project is that the child-robot interaction will take the children's emotions and specific features into account.
arXiv Detail & Related papers (2022-09-25T18:28:19Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from those of the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- Forecasting Nonverbal Social Signals during Dyadic Interactions with Generative Adversarial Neural Networks [0.0]
Successful social interaction is closely coupled with the interplay between nonverbal perception and action mechanisms.
Nonverbal gestures are expected to endow social robots with the capability of emphasizing their speech or showing their intentions.
Our research sheds light on modeling human behaviors in social interactions, specifically forecasting human nonverbal social signals during dyadic interactions.
arXiv Detail & Related papers (2021-10-18T15:01:32Z)
- A MultiModal Social Robot Toward Personalized Emotion Interaction [1.2183405753834562]
This study demonstrates a multimodal human-robot interaction (HRI) framework with reinforcement learning to enhance the robotic interaction policy.
The goal is to apply this framework in social scenarios so that robots can produce more natural and engaging interactions.
arXiv Detail & Related papers (2021-10-08T00:35:44Z)
- Disambiguating Affective Stimulus Associations for Robot Perception and Dialogue [67.89143112645556]
We provide a NICO robot with the ability to learn the associations between a perceived auditory stimulus and an emotional expression.
NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system.
The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
arXiv Detail & Related papers (2021-03-05T20:55:48Z)
- Affect-Driven Modelling of Robot Personality for Collaborative Human-Robot Interactions [16.40684407420441]
Collaborative interactions require social robots to adapt to the dynamics of human affective behaviour.
We propose a novel framework for personality-driven behaviour generation in social robots.
arXiv Detail & Related papers (2020-10-14T16:34:14Z)
- Joint Mind Modeling for Explanation Generation in Complex Human-Robot Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the explanations generated by our approach significantly improve collaboration performance and user perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)
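As referenced in the "To Whom are You Talking?" entry above, the following is a minimal PyTorch sketch of a hybrid model that combines convolutional layers over face images with an LSTM over per-frame features and 2D body-posture vectors. Layer sizes, the input resolution, the number of addressee classes, and the class semantics are illustrative assumptions and do not reproduce the published architecture.

```python
# Illustrative sketch (not the published model): a hybrid CNN + LSTM that fuses
# per-frame face images with 2D body-posture vectors to classify the addressee.
import torch
import torch.nn as nn

class AddresseeEstimator(nn.Module):
    def __init__(self, pose_dim: int = 20, hidden: int = 128, n_classes: int = 3):
        super().__init__()
        # Small CNN encoder for each face crop (assumed 3x64x64).
        self.face_cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),       # -> (batch*time, 32)
        )
        # LSTM over the per-frame concatenation of face features and pose vectors.
        self.lstm = nn.LSTM(32 + pose_dim, hidden, batch_first=True)
        # Hypothetical classes: addressed to the robot, to another person, or ambiguous.
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, faces: torch.Tensor, poses: torch.Tensor) -> torch.Tensor:
        # faces: (batch, time, 3, 64, 64); poses: (batch, time, pose_dim)
        b, t = faces.shape[:2]
        face_feat = self.face_cnn(faces.flatten(0, 1)).view(b, t, -1)
        seq = torch.cat([face_feat, poses], dim=-1)
        _, (h_n, _) = self.lstm(seq)
        return self.head(h_n[-1])                        # logits per addressee class

if __name__ == "__main__":
    model = AddresseeEstimator()
    logits = model(torch.randn(2, 10, 3, 64, 64), torch.randn(2, 10, 20))
    print(logits.shape)  # torch.Size([2, 3])
```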