Singing the Body Electric: The Impact of Robot Embodiment on User
Expectations
- URL: http://arxiv.org/abs/2401.06977v1
- Date: Sat, 13 Jan 2024 04:42:48 GMT
- Title: Singing the Body Electric: The Impact of Robot Embodiment on User
Expectations
- Authors: Nathaniel Dennler, Stefanos Nikolaidis, Maja Matarić
- Abstract summary: Users develop mental models of robots to conceptualize what kind of interactions they can have with those robots.
These conceptualizations are often formed before interacting with the robot and are based only on observing the robot's physical design.
We propose to use multimodal features of robot embodiments to predict what kinds of expectations users will have about a given robot's social and physical capabilities.
- Score: 7.408858358967414
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Users develop mental models of robots to conceptualize what kind of
interactions they can have with those robots. The conceptualizations are often
formed before interactions with the robot and are based only on observing the
robot's physical design. As a result, understanding conceptualizations formed
from physical design is necessary to understand how users intend to interact
with the robot. We propose to use multimodal features of robot embodiments to
predict what kinds of expectations users will have about a given robot's social
and physical capabilities. We show that using such features provides
information about general mental models of the robots that generalize across
socially interactive robots. We describe how these models can be incorporated
into interaction design and physical design for researchers working with
socially interactive robots.
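The abstract proposes predicting users' expectations about a robot's social and physical capabilities from multimodal features of its embodiment. The snippet below is a minimal, hypothetical sketch of that idea, not the authors' actual pipeline: it assumes pre-extracted visual and textual embodiment features and synthetic expectation ratings, and fits a regularized linear regressor whose cross-validated fit indicates whether the feature-to-expectation mapping generalizes to unseen robots.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Illustrative setup: 40 robots, each described by pre-extracted multimodal
# embodiment features (e.g., an image embedding concatenated with a text
# embedding of the robot's description). Dimensions and data are synthetic.
rng = np.random.default_rng(0)
n_robots, img_dim, txt_dim = 40, 512, 384
X = np.hstack([rng.normal(size=(n_robots, img_dim)),   # visual features
               rng.normal(size=(n_robots, txt_dim))])  # textual features

# Hypothetical targets: mean user ratings (1-7 scale) for two expected
# capabilities (e.g., "can hold a conversation", "can carry objects"),
# simulated here as a noisy linear function of the embodiment features.
w = rng.normal(size=(img_dim + txt_dim, 2)) / np.sqrt(img_dim + txt_dim)
y = np.clip(4 + X @ w + 0.3 * rng.normal(size=(n_robots, 2)), 1, 7)

# A regularized linear model maps embodiment features to expected-capability
# ratings; cross-validation checks whether that mapping holds for robots
# that were not seen during training.
model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
```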
Related papers
- Survey of Design Paradigms for Social Robots [10.618592615516901]
Social robots leverage multimodal communication, incorporating speech, facial expressions, and gestures to enhance user engagement and emotional support.
Understanding the design paradigms of social robots is hindered by the complexity of these systems and the need to tune them to specific tasks.
This article provides a structured review of social robot design paradigms, categorizing them into cognitive architectures, role design models, linguistic models, communication flow, activity system models, and integrated design models.
arXiv Detail & Related papers (2024-07-30T05:22:31Z) - Unifying 3D Representation and Control of Diverse Robots with a Single Camera [48.279199537720714]
We introduce Neural Jacobian Fields, an architecture that autonomously learns to model and control robots from vision alone.
Our approach achieves accurate closed-loop control and recovers the causal dynamic structure of each robot.
arXiv Detail & Related papers (2024-07-11T17:55:49Z) - Mining multi-modal communication patterns in interaction with
explainable and non-explainable robots [0.138120109831448]
We investigate interaction patterns of humans interacting with explainable and non-explainable robots.
We video-recorded and analyzed human behavior during a board game in which 20 participants verbally instructed either an explainable or a non-explainable Pepper robot to move objects on the board.
arXiv Detail & Related papers (2023-12-22T12:12:55Z) - Social Assistive Robotics for Autistic Children [56.524774292536264]
The goal of the project is to test autistic children's interactions with the social robot NAO.
The innovative aspect of the project is that the child-robot interaction will take the children's emotions and individual characteristics into account.
arXiv Detail & Related papers (2022-09-25T18:28:19Z) - Robots with Different Embodiments Can Express and Influence Carefulness
in Object Manipulation [104.5440430194206]
This work investigates how object manipulations performed with communicative intent by two robots are perceived.
We designed the robots' movements to communicate either carefulness or the lack of it while transporting objects.
arXiv Detail & Related papers (2022-08-03T13:26:52Z) - Explain yourself! Effects of Explanations in Human-Robot Interaction [10.389325878657697]
Explanations of robot decisions could affect user perceptions, justify their reliability, and increase trust.
The effects on human perceptions of robots that explain their decisions have not been studied thoroughly.
This study demonstrates the need for and potential of explainable human-robot interaction.
arXiv Detail & Related papers (2022-04-09T15:54:27Z) - Synthesis and Execution of Communicative Robotic Movements with
Generative Adversarial Networks [59.098560311521034]
We focus on how to transfer to two different robotic platforms the same kinematic modulation that humans adopt when manipulating delicate objects.
We choose to modulate the velocity profile of the robots' end-effector, inspired by what humans do when transporting objects with different characteristics.
We exploit a novel Generative Adversarial Network architecture, trained with human kinematics examples, to generalize over them and generate new and meaningful velocity profiles (a toy illustration of this idea appears after this list).
arXiv Detail & Related papers (2022-03-29T15:03:05Z) - Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and
Robotics Together [68.44697646919515]
This paper presents several human-robot systems that utilize spatial computing to enable novel robot use cases.
The combination of spatial computing and egocentric sensing on mixed reality devices enables them to capture and understand human actions and translate these to actions with spatial meaning.
arXiv Detail & Related papers (2022-02-03T10:04:26Z) - Sensorimotor representation learning for an "active self" in robots: A
model survey [10.649413494649293]
In humans, these capabilities are thought to be related to our ability to perceive our body in space.
This paper reviews the developmental processes of the mechanisms underlying these abilities.
We propose a theoretical computational framework, which aims to allow the emergence of the sense of self in artificial agents.
arXiv Detail & Related papers (2020-11-25T16:31:01Z) - Affect-Driven Modelling of Robot Personality for Collaborative
Human-Robot Interactions [16.40684407420441]
Collaborative interactions require social robots to adapt to the dynamics of human affective behaviour.
We propose a novel framework for personality-driven behaviour generation in social robots.
arXiv Detail & Related papers (2020-10-14T16:34:14Z) - Joint Mind Modeling for Explanation Generation in Complex Human-Robot
Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the explanations generated by our approach significantly improve collaboration performance and user perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)
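The entry above on communicative robotic movements describes training a Generative Adversarial Network on human kinematics examples to generate end-effector velocity profiles. As a toy illustration only (the profile length, network sizes, and the bell-shaped synthetic stand-in for human data are assumptions, not details from that paper), a minimal 1D GAN in PyTorch could look like this:

```python
import math
import torch
import torch.nn as nn

# Assumed setup: fixed-length velocity profiles and small fully connected
# networks; real human kinematics recordings are replaced by smooth,
# bell-shaped (minimum-jerk-like) synthetic profiles.
T, NOISE_DIM, BATCH = 50, 16, 64
t = torch.linspace(0, 1, T)

def human_like_profiles(n):
    peaks = 0.5 + 0.5 * torch.rand(n, 1)          # peak speed per example
    return peaks * torch.sin(math.pi * t) ** 2    # smooth bell-shaped profile

generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 64), nn.ReLU(),
    nn.Linear(64, T), nn.Sigmoid(),               # velocity profile in [0, 1]
)
discriminator = nn.Sequential(
    nn.Linear(T, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1),                             # real/fake logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = human_like_profiles(BATCH)
    fake = generator(torch.randn(BATCH, NOISE_DIM))

    # Discriminator step: separate human-like profiles from generated ones.
    d_loss = bce(discriminator(real), torch.ones(BATCH, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(BATCH, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: produce profiles the discriminator accepts as human-like.
    g_loss = bce(discriminator(fake), torch.ones(BATCH, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# A newly sampled profile could then be retimed and sent to a robot's
# end-effector velocity controller.
new_profile = generator(torch.randn(1, NOISE_DIM)).detach().squeeze(0)
print(new_profile.shape)  # torch.Size([50])
```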
This list is automatically generated from the titles and abstracts of the papers on this site.