Affect-Driven Modelling of Robot Personality for Collaborative
Human-Robot Interactions
- URL: http://arxiv.org/abs/2010.07221v2
- Date: Fri, 25 Feb 2022 11:09:10 GMT
- Title: Affect-Driven Modelling of Robot Personality for Collaborative
Human-Robot Interactions
- Authors: Nikhil Churamani and Pablo Barros and Hatice Gunes and Stefan Wermter
- Abstract summary: Collaborative interactions require social robots to adapt to the dynamics of human affective behaviour.
We propose a novel framework for personality-driven behaviour generation in social robots.
- Score: 16.40684407420441
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Collaborative interactions require social robots to adapt to the dynamics of
human affective behaviour. Yet, current approaches for affective behaviour
generation in robots focus on instantaneous perception to generate a one-to-one
mapping between observed human expressions and static robot actions. In this
paper, we propose a novel framework for personality-driven behaviour generation
in social robots. The framework consists of (i) a hybrid neural model for
evaluating facial expressions and speech, forming intrinsic affective
representations in the robot, (ii) an Affective Core, that employs
self-organising neural models to embed robot personality traits like patience
and emotional actuation, and (iii) a Reinforcement Learning model that uses the
robot's affective appraisal to learn interaction behaviour. For evaluation, we
conduct a user study (n = 31) where the NICO robot acts as a proposer in the
Ultimatum Game. Participants witness the effect of robot personality on the
robot's negotiation strategy: they rank a patient robot with high emotional
actuation higher on persistence, and an inert, impatient robot higher on
generosity and altruistic behaviour.
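As a hedged, minimal sketch of how such a pipeline could fit together (assuming a bandit-style Q-learning proposer with a hand-crafted affective reward; the responder model, trait parameters, and reward shaping below are illustrative assumptions, not the authors' implementation):

```python
"""Hedged, illustrative sketch only -- not the paper's implementation.
A bandit-style Q-learning proposer for the Ultimatum Game whose reward is
shaped by an assumed affective appraisal parameterised by patience and
emotional actuation."""
import random

POT = 10
OFFERS = [1, 2, 3, 4, 5]      # amount offered to the responder
ALPHA, EPSILON = 0.05, 0.2    # learning rate, exploration rate


def responder_accepts(offer: int) -> bool:
    # Hypothetical responder: more generous offers are accepted more often.
    return random.random() < 0.5 + 0.1 * offer


def affective_reward(accepted: bool, offer: int,
                     patience: float, emotional_actuation: float) -> float:
    extrinsic = (POT - offer) if accepted else 0.0
    # Assumed appraisal: acceptance is positively valenced (more so when the
    # robot keeps a larger share); rejection is negatively valenced.
    valence = (POT - offer) / POT if accepted else -1.0
    intrinsic = emotional_actuation * valence
    # Assumed patience term: a rejection prolongs the negotiation, which an
    # impatient robot experiences as costly.
    delay_cost = 0.0 if accepted else (1.0 - patience) * 3.0
    return extrinsic + intrinsic - delay_cost


def train(patience: float, emotional_actuation: float, rounds: int = 20000):
    q = {o: 0.0 for o in OFFERS}  # single-state Q-table over offers
    for _ in range(rounds):
        offer = (random.choice(OFFERS) if random.random() < EPSILON
                 else max(q, key=q.get))
        accepted = responder_accepts(offer)
        r = affective_reward(accepted, offer, patience, emotional_actuation)
        q[offer] += ALPHA * (r - q[offer])  # bandit-style update
    return q


if __name__ == "__main__":
    print("patient, emotional:", train(patience=0.9, emotional_actuation=0.8))
    print("impatient, inert  :", train(patience=0.1, emotional_actuation=0.1))
```

The printed Q-values vary from run to run; the intent is only to show one place where trait parameters such as patience and emotional actuation could enter an RL loop, namely by shaping an appraisal-based reward rather than a one-to-one mapping from observed expressions to static actions.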
Related papers
- Robot Interaction Behavior Generation based on Social Motion Forecasting for Human-Robot Interaction [9.806227900768926]
We propose to model social motion forecasting in a shared human-robot representation space.
ECHO operates in the aforementioned shared space to predict the future motions of the agents encountered in social scenarios.
We evaluate our model in multi-person and human-robot motion forecasting tasks, surpassing the state of the art by a large margin.
arXiv Detail & Related papers (2024-02-07T11:37:14Z)
- Real-time Addressee Estimation: Deployment of a Deep-Learning Model on the iCub Robot [52.277579221741746]
Addressee Estimation is a skill essential for social robots to interact smoothly with humans.
Inspired by human perceptual skills, a deep-learning model for Addressee Estimation is designed, trained, and deployed on an iCub robot.
The study presents the implementation procedure and the performance of the deployed model in real-time human-robot interaction.
arXiv Detail & Related papers (2023-11-09T13:01:21Z)
- Habitat 3.0: A Co-Habitat for Humans, Avatars and Robots [119.55240471433302]
Habitat 3.0 is a simulation platform for studying collaborative human-robot tasks in home environments.
It addresses challenges in modeling complex deformable bodies and diversity in appearance and motion.
Human-in-the-loop infrastructure enables real human interaction with simulated robots via mouse/keyboard or a VR interface.
arXiv Detail & Related papers (2023-10-19T17:29:17Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the generated expressions were not perceived differently from the hand-designed ones in anthropomorphism or animacy; a minimal sketch of one such learner follows this entry.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
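As a hedged, minimal sketch of one way a data-driven framework could learn from a few hand-designed bodily expressions (an autoencoder over joint-angle trajectories; the dimensions, architecture, and random stand-in data are assumptions, not the paper's model):

```python
# Illustrative sketch only: learn a latent space from a few hand-designed
# expressions (joint-angle trajectories), then sample/interpolate in it.
import torch
import torch.nn as nn

N_JOINTS, T_STEPS, LATENT = 10, 50, 8   # hypothetical trajectory shape


class ExpressionAE(nn.Module):
    def __init__(self):
        super().__init__()
        flat = N_JOINTS * T_STEPS
        self.enc = nn.Sequential(nn.Linear(flat, 64), nn.ReLU(),
                                 nn.Linear(64, LATENT))
        self.dec = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(),
                                 nn.Linear(64, flat))

    def forward(self, x):                      # x: (batch, flat)
        z = self.enc(x)
        return self.dec(z), z


model = ExpressionAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
demos = torch.randn(12, N_JOINTS * T_STEPS)    # stand-in for hand-designed clips

for _ in range(200):                           # fit the few demonstrations
    recon, _ = model(demos)
    loss = nn.functional.mse_loss(recon, demos)
    opt.zero_grad(); loss.backward(); opt.step()

# Interpolate between two demonstrations' latents to get a novel expression.
with torch.no_grad():
    z = model.enc(demos[:2])
    novel = model.dec(0.5 * (z[0] + z[1])).reshape(N_JOINTS, T_STEPS)
```

Interpolating or sampling in the learned latent space then yields novel expressions beyond the hand-designed set.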
- Synthesis and Execution of Communicative Robotic Movements with Generative Adversarial Networks [59.098560311521034]
We focus on how to transfer to two different robotic platforms the same kinematics modulation that humans adopt when manipulating delicate objects.
We choose to modulate the velocity profile adopted by the robots' end-effector, inspired by what humans do when transporting objects with different characteristics.
We exploit a novel Generative Adversarial Network architecture, trained with human kinematics examples, to generalize over them and generate new and meaningful velocity profiles; a minimal sketch of such a GAN follows this entry.
arXiv Detail & Related papers (2022-03-29T15:03:05Z)
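As a hedged, minimal sketch of the general technique (a GAN over 1-D end-effector velocity profiles; the architecture, profile parameterisation, and synthetic stand-in for human kinematics are assumptions, not the paper's network):

```python
# Illustrative sketch only: a minimal GAN that generates 1-D velocity profiles.
import torch
import torch.nn as nn

T, Z = 64, 16                                  # profile length, noise dim

G = nn.Sequential(nn.Linear(Z, 64), nn.ReLU(), nn.Linear(64, T))
D = nn.Sequential(nn.Linear(T, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()


def human_profiles(n):
    # Stand-in for recorded human kinematics: bell-shaped velocity curves.
    t = torch.linspace(0, 1, T)
    peaks = 0.8 + 0.4 * torch.rand(n, 1)
    return peaks * torch.exp(-((t - 0.5) ** 2) / 0.02)


for _ in range(1000):
    real = human_profiles(32)
    fake = G(torch.randn(32, Z))
    # Discriminator: push real profiles toward 1, generated ones toward 0.
    d_loss = (bce(D(real), torch.ones(32, 1)) +
              bce(D(fake.detach()), torch.zeros(32, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator: fool the discriminator.
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

new_profile = G(torch.randn(1, Z)).detach()    # a generated velocity profile
```

After training, sampling the generator yields new profiles that generalize over the (here synthetic) human examples.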
- A MultiModal Social Robot Toward Personalized Emotion Interaction [1.2183405753834562]
This study demonstrates a multimodal human-robot interaction (HRI) framework with reinforcement learning to enhance the robotic interaction policy.
The goal is to apply this framework in social scenarios so that robots can produce more natural and engaging interactions.
arXiv Detail & Related papers (2021-10-08T00:35:44Z)
- Show Me What You Can Do: Capability Calibration on Reachable Workspace for Human-Robot Collaboration [83.4081612443128]
We show that a short calibration using REMP can effectively bridge the gap between what a non-expert user thinks a robot can reach and the ground-truth.
We show that this calibration procedure not only results in better user perception, but also promotes more efficient human-robot collaborations.
arXiv Detail & Related papers (2021-03-06T09:14:30Z)
- Controlling the Sense of Agency in Dyadic Robot Interaction: An Active Inference Approach [6.421670116083633]
We examine dyadic imitative interactions of robots using a variational recurrent neural network model.
We examine how regulating the complexity term to minimize free energy during training determines the dynamic characteristics of the networks; a sketch of this objective follows the entry.
arXiv Detail & Related papers (2021-03-03T02:38:09Z)
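As a hedged sketch of the objective such variational recurrent models typically minimize (the standard variational free energy; the paper's exact formulation may differ), training balances reconstruction accuracy against a complexity term whose weight w is the regulated quantity:

```latex
\mathcal{F} \;=\; \underbrace{-\,\mathbb{E}_{q_\phi(z_t \mid x_{1:t})}\big[\log p_\theta(x_t \mid z_t)\big]}_{\text{accuracy}}
\;+\; w\,\underbrace{D_{\mathrm{KL}}\big(q_\phi(z_t \mid x_{1:t}) \,\big\Vert\, p_\theta(z_t \mid z_{t-1})\big)}_{\text{complexity}}
```

In models of this family, a larger w keeps the approximate posterior close to the learned prior and yields more deterministic dynamics, while a smaller w permits more stochastic, exploratory behaviour.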
- Joint Mind Modeling for Explanation Generation in Complex Human-Robot Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the explanations generated by our approach significantly improve collaboration performance and user perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)
- Human Perception of Intrinsically Motivated Autonomy in Human-Robot Interaction [2.485182034310304]
A challenge in using robots in human-inhabited environments is to design behavior that is engaging, yet robust to the perturbations induced by human interaction.
Our idea is to imbue the robot with intrinsic motivation (IM) so that it can handle new situations and appear as a genuine social other to humans.
This article presents a "robotologist" study design that allows comparing autonomously generated behaviors with each other.
arXiv Detail & Related papers (2020-02-14T09:49:36Z)