Warmth and Competence to Predict Human Preference of Robot Behavior in
Physical Human-Robot Interaction
- URL: http://arxiv.org/abs/2008.05799v1
- Date: Thu, 13 Aug 2020 10:19:47 GMT
- Title: Warmth and Competence to Predict Human Preference of Robot Behavior in
Physical Human-Robot Interaction
- Authors: Marcus M. Scheunemann and Raymond H. Cuijpers and Christoph Salge
- Abstract summary: Social cognition posits that Warmth and Competence are the central and universal dimensions along which people characterize other humans.
The Robotic Social Attribute Scale (RoSAS) proposes items for those dimensions suitable for HRI and validates them in a visual observation study.
We found that Warmth and Competence, among all RoSAS and Godspeed dimensions, are the most important predictors for human preferences between different robot behaviors.
- Score: 0.8594140167290099
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A solid methodology to understand human perception and preferences in
human-robot interaction (HRI) is crucial in designing real-world HRI. Social
cognition posits that Warmth and Competence are central and universal
dimensions for characterizing other humans. The Robotic Social Attribute
Scale (RoSAS) proposes items for those dimensions suitable for HRI and
validated them in a visual observation study. In this paper we complement that
validation by showing the usability of these dimensions in a behavior-based,
physical HRI study with a fully autonomous robot. We compare the findings with
the popular Godspeed dimensions Animacy, Anthropomorphism, Likeability,
Perceived Intelligence and Perceived Safety. We found that Warmth and
Competence, among all RoSAS and Godspeed dimensions, are the most important
predictors for human preferences between different robot behaviors. This
predictive power holds even when there is no clear consensus preference or
significant factor difference between conditions.
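As a rough illustration of the analysis the abstract describes, the sketch below fits a logistic regression that predicts a pairwise preference between two robot behaviors from per-participant Warmth and Competence score differences. All data, variable names, and coefficients are synthetic assumptions, not the authors' pipeline.

```python
# Sketch: predicting pairwise behavior preference from Warmth/Competence
# score differences, in the spirit of the paper's claim. All data here is
# synthetic; the feature/label setup is an assumption, not the authors' method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200  # hypothetical participants

# Per-participant score differences between behavior A and behavior B
# on the two RoSAS dimensions.
warmth_diff = rng.normal(0.0, 1.0, n)
competence_diff = rng.normal(0.0, 1.0, n)
X = np.column_stack([warmth_diff, competence_diff])

# Simulated ground truth: preference driven by both dimensions plus noise.
logits = 1.2 * warmth_diff + 0.8 * competence_diff + rng.normal(0, 0.5, n)
y = (logits > 0).astype(int)  # 1 = participant prefers behavior A

model = LogisticRegression().fit(X, y)
print("coefficients (warmth, competence):", model.coef_[0])
print("mean accuracy:", model.score(X, y))
```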
Related papers
- Robot Interaction Behavior Generation based on Social Motion Forecasting for Human-Robot Interaction [9.806227900768926]
We propose to model social motion forecasting in a shared human-robot representation space.
ECHO operates in the aforementioned shared space to predict the future motions of the agents encountered in social scenarios.
We evaluate our model on multi-person and human-robot motion forecasting tasks and surpass the state of the art by a large margin.
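A minimal sketch of the shared-representation idea, assuming invented pose dimensionalities and linear encoders; this is not the ECHO architecture.

```python
# Sketch: encode human and robot poses into one shared space and roll a
# simple forecast forward there. Dimensions and weights are invented.
import numpy as np

rng = np.random.default_rng(0)
D_HUMAN, D_ROBOT, D_SHARED = 51, 7, 16  # assumed pose/latent sizes

W_h = rng.normal(0, 0.1, (D_SHARED, D_HUMAN))     # human-pose encoder
W_r = rng.normal(0, 0.1, (D_SHARED, D_ROBOT))     # robot-pose encoder
W_fwd = rng.normal(0, 0.1, (D_SHARED, D_SHARED))  # one-step forecast map

def forecast(z, steps=3):
    """Predict a few future latent states in the shared space."""
    out = []
    for _ in range(steps):
        z = np.tanh(W_fwd @ z)
        out.append(z)
    return out

z_human = np.tanh(W_h @ rng.normal(size=D_HUMAN))
z_robot = np.tanh(W_r @ rng.normal(size=D_ROBOT))
futures = forecast(0.5 * (z_human + z_robot))
print("forecast latent norms:",
      [round(float(np.linalg.norm(f)), 3) for f in futures])
```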
arXiv Detail & Related papers (2024-02-07T11:37:14Z)
- Real-time Addressee Estimation: Deployment of a Deep-Learning Model on the iCub Robot [52.277579221741746]
Addressee Estimation is a skill essential for social robots to interact smoothly with humans.
Inspired by human perceptual skills, a deep-learning model for Addressee Estimation is designed, trained, and deployed on an iCub robot.
The study presents the implementation procedure and the performance of the model in real-time human-robot interaction.
arXiv Detail & Related papers (2023-11-09T13:01:21Z)
- Habitat 3.0: A Co-Habitat for Humans, Avatars and Robots [119.55240471433302]
Habitat 3.0 is a simulation platform for studying collaborative human-robot tasks in home environments.
It addresses challenges in modeling complex deformable bodies and diversity in appearance and motion.
Human-in-the-loop infrastructure enables real human interaction with simulated robots via mouse/keyboard or a VR interface.
arXiv Detail & Related papers (2023-10-19T17:29:17Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a data-driven deep-learning framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from those of the hand-designed ones.
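A toy stand-in for the data-driven generation idea: fit a low-dimensional basis to a handful of hand-designed expressions and sample new ones near it. The joint layout and data are invented; the paper uses a deep-learning framework, not this SVD sketch.

```python
# Sketch: generate new bodily expressions by sampling a low-dimensional
# basis fit to a few hand-designed ones. Joint layout and data are invented.
import numpy as np

rng = np.random.default_rng(3)
# Five hand-designed expressions, each flattened to 12 joint values.
hand_designed = rng.uniform(-0.5, 0.5, size=(5, 12))

mean = hand_designed.mean(axis=0)
_, _, components = np.linalg.svd(hand_designed - mean, full_matrices=False)

# Sample new expressions near the learned low-dimensional manifold.
coeffs = rng.normal(0, 0.2, size=(3, components.shape[0]))
generated = mean + coeffs @ components
print("generated expressions shape:", generated.shape)
```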
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- AAAI SSS-22 Symposium on Closing the Assessment Loop: Communicating Proficiency and Intent in Human-Robot Teaming [4.787322716745613]
How should a robot convey predicted ability on a new task?
How should a robot adapt its proficiency criteria based on human intentions and values?
There are no agreed-upon standards for evaluating proficiency and intent-based interactions.
arXiv Detail & Related papers (2022-04-05T18:28:01Z)
- Ergonomically Intelligent Physical Human-Robot Interaction: Postural Estimation, Assessment, and Optimization [3.681892767755111]
We show that we can estimate human posture solely from the trajectory of the interacting robot.
We propose DULA, a differentiable ergonomics model, and use it in gradient-free postural optimization for physical human-robot interaction tasks.
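A toy illustration of gradient-free postural optimization via random search; the ergonomic cost and joint limits below are stand-ins, not the DULA model.

```python
# Toy gradient-free postural optimization via random search.
# The ergonomic cost is a placeholder, NOT the DULA model from the paper.
import numpy as np

rng = np.random.default_rng(1)

def ergonomic_cost(q):
    """Hypothetical cost: penalize deviation from a neutral posture."""
    return np.sum(q ** 2) + 0.1 * np.sum(np.abs(q))

q = rng.uniform(-1.0, 1.0, size=4)  # 4 illustrative joint angles (rad)
best_cost = ergonomic_cost(q)
sigma = 0.2  # perturbation scale

for _ in range(500):
    candidate = np.clip(q + rng.normal(0, sigma, q.shape), -1.5, 1.5)
    cost = ergonomic_cost(candidate)
    if cost < best_cost:  # greedy accept: a simple gradient-free rule
        q, best_cost = candidate, cost

print("optimized posture:", np.round(q, 3), "cost:", round(best_cost, 4))
```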
arXiv Detail & Related papers (2021-08-12T21:13:06Z)
- Probabilistic Human Motion Prediction via A Bayesian Neural Network [71.16277790708529]
We propose a probabilistic model for human motion prediction in this paper.
Our model can generate several future motions given an observed motion sequence.
We extensively validate our approach on the large-scale benchmark dataset Human3.6M.
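One simple way to get several futures from a probabilistic model is to sample model parameters at inference time; below is a numpy sketch with an invented scalar transition weight, not the paper's Bayesian neural network.

```python
# Sketch: multiple future motions from one observed sequence by sampling
# model weights. The linear "network" and its weight posterior are invented.
import numpy as np

rng = np.random.default_rng(2)
obs = np.cumsum(rng.normal(0, 0.05, size=(10, 3)), axis=0)  # observed 3-D track

w_mean, w_std = 1.0, 0.05  # assumed posterior over a scalar transition weight

for k in range(5):          # five sampled futures
    w = rng.normal(w_mean, w_std)
    pose, future = obs[-1], []
    for _ in range(8):      # predict 8 steps ahead
        pose = w * pose + rng.normal(0, 0.02, 3)  # sampled transition + noise
        future.append(pose)
    print(f"sample {k}: final pose {np.round(future[-1], 3)}")
```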
arXiv Detail & Related papers (2021-07-14T09:05:33Z)
- AGENT: A Benchmark for Core Psychological Reasoning [60.35621718321559]
Intuitive psychology is the ability to reason about hidden mental variables that drive observable actions.
Despite recent interest in machine agents that reason about other agents, it is not clear if such agents learn or hold the core psychology principles that drive human reasoning.
We present AGENT, a benchmark of procedurally generated 3D animations structured around four scenarios.
arXiv Detail & Related papers (2021-02-24T14:58:23Z)
- Motion Planning Combines Psychological Safety and Motion Prediction for a Sense Motive Robot [2.14239637027446]
This paper addresses human safety by covering both physical and psychological aspects.
First, we introduce an adaptive robot velocity control and step-size adjustment method based on human facial expressions, so that the robot can adjust its movement to maintain safety when the human's emotional state is unusual.
Second, we predict human motion by detecting sudden changes in head pose and gaze direction, so that the robot can infer whether the human's attention is distracted, predict their next move, and build a repulsive force to avoid potential collisions.
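A compact sketch of the two mechanisms the summary describes: slowing the robot under an unusual emotion and applying a potential-field repulsion near the human. Thresholds, gains, and function names are illustrative assumptions, not the paper's controller.

```python
# Sketch: emotion-aware velocity scaling plus a classic potential-field
# repulsive force. All parameters and names are invented for illustration.
import numpy as np

def velocity_scale(emotion_unusual: bool, base_speed: float = 0.5) -> float:
    """Slow the robot when the detected facial emotion is unusual."""
    return 0.2 * base_speed if emotion_unusual else base_speed

def repulsive_force(robot_pos, human_pos, influence_radius=1.0, gain=2.0):
    """Potential-field repulsion that activates within an influence radius."""
    diff = robot_pos - human_pos
    d = np.linalg.norm(diff)
    if d >= influence_radius or d == 0.0:
        return np.zeros_like(diff)
    # Force grows as the human gets closer, pushing the robot away.
    return gain * (1.0 / d - 1.0 / influence_radius) * diff / d**3

robot, human = np.array([0.3, 0.0]), np.array([0.0, 0.0])
print("speed:", velocity_scale(emotion_unusual=True))
print("repulsion:", np.round(repulsive_force(robot, human), 3))
```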
arXiv Detail & Related papers (2020-09-29T04:19:53Z)
- Human Grasp Classification for Reactive Human-to-Robot Handovers [50.91803283297065]
We propose an approach for human-to-robot handovers in which the robot meets the human halfway.
We collect a human grasp dataset which covers typical ways of holding objects with various hand shapes and poses.
We present a planning and execution approach that takes the object from the human hand according to the detected grasp and hand position.
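A minimal sketch of the "detected grasp class to robot approach" mapping the summary describes; the grasp classes and offsets are invented placeholders, and the paper's actual pipeline classifies grasps with a learned model.

```python
# Sketch: map a detected human grasp class to a robot approach target for
# a handover. Grasp classes and offsets are illustrative placeholders.
import numpy as np

# Hypothetical approach offsets (metres) relative to the detected hand,
# one per grasp class, so the robot "meets the human halfway".
APPROACH_OFFSETS = {
    "pinch_top":  np.array([0.00, 0.00, 0.10]),   # approach from above
    "power_side": np.array([0.10, 0.00, 0.00]),   # approach from the side
    "open_palm":  np.array([0.00, -0.10, 0.05]),  # approach from the front
}

def plan_handover(grasp_class: str, hand_pos: np.ndarray) -> np.ndarray:
    """Return a grasp target for the robot given the detected grasp class."""
    return hand_pos + APPROACH_OFFSETS[grasp_class]

target = plan_handover("pinch_top", np.array([0.4, 0.1, 0.9]))
print("robot grasp target:", target)
```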
arXiv Detail & Related papers (2020-03-12T19:58:03Z)
- Human Perception of Intrinsically Motivated Autonomy in Human-Robot Interaction [2.485182034310304]
A challenge in using robots in human-inhabited environments is to design behavior that is engaging, yet robust to the perturbations induced by human interaction.
Our idea is to imbue the robot with intrinsic motivation (IM) so that it can handle new situations and appears as a genuine social other to humans.
This article presents a "robotologist" study design that allows comparing autonomously generated behaviors with each other.
arXiv Detail & Related papers (2020-02-14T09:49:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.