Nonverbal Cues in Human-Robot Interaction: A Communication Studies Perspective
- URL: http://arxiv.org/abs/2304.11293v1
- Date: Sat, 22 Apr 2023 02:15:48 GMT
- Title: Nonverbal Cues in Human-Robot Interaction: A Communication Studies Perspective
- Authors: Jacqueline Urakami, Katie Seaborn
- Abstract summary: Communication between people is characterized by a broad range of nonverbal cues.
We offer definitive nonverbal codes for human-robot interaction (HRI).
We argue that integrating robotic nonverbal codes in HRI will afford robots a feeling of "aliveness" or "social agency".
- Score: 19.67112387337872
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Communication between people is characterized by a broad range of nonverbal
cues. Transferring these cues into the design of robots and other artificial
agents that interact with people may foster more natural, inviting, and
accessible experiences. In this position paper, we offer a series of definitive
nonverbal codes for human-robot interaction (HRI) that address the five human
sensory systems (visual, auditory, haptic, olfactory, gustatory) drawn from the
field of communication studies. We discuss how these codes can be translated
into design patterns for HRI using a curated sample of the communication
studies and HRI literatures. As nonverbal codes are an essential mode in human
communication, we argue that integrating robotic nonverbal codes in HRI will
afford robots a feeling of "aliveness" or "social agency" that would otherwise
be missing. We end with suggestions for research directions to stimulate work
on nonverbal communication within the field of HRI and improve communication
between humans and robots.
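As a rough illustration of how such nonverbal codes might be organized for design work, here is a minimal Python sketch pairing codes with the sensory channel they address and a candidate robot design pattern. The `NonverbalCode` record and the example entries are illustrative assumptions, not the paper's actual taxonomy.

```python
from dataclasses import dataclass
from enum import Enum, auto

# The five human sensory systems the paper's nonverbal codes address.
class SensoryChannel(Enum):
    VISUAL = auto()
    AUDITORY = auto()
    HAPTIC = auto()
    OLFACTORY = auto()
    GUSTATORY = auto()

# Hypothetical record pairing a nonverbal code with a robot design pattern.
@dataclass
class NonverbalCode:
    name: str                 # e.g., "kinesics" (body movement)
    channel: SensoryChannel   # sensory system the cue is perceived through
    design_pattern: str       # how a robot might express the cue

# Illustrative placeholder entries, not the paper's curated set of codes.
CODES = [
    NonverbalCode("kinesics", SensoryChannel.VISUAL, "expressive gestures and posture"),
    NonverbalCode("vocalics", SensoryChannel.AUDITORY, "pitch and tempo of synthesized speech"),
    NonverbalCode("haptics", SensoryChannel.HAPTIC, "calibrated touch during handovers"),
]

for code in CODES:
    print(f"{code.name} ({code.channel.name.lower()}): {code.design_pattern}")
```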
Related papers
- Nonverbal Interaction Detection [83.40522919429337]
This work addresses a new challenge of understanding human nonverbal interaction in social contexts.
We contribute a novel large-scale dataset, called NVI, which is meticulously annotated to include bounding boxes for humans and corresponding social groups.
Second, we establish a new task, NVI-DET, for nonverbal interaction detection, formalized as identifying triplets of the form <individual, group, interaction> from images.
Third, we propose a nonverbal interaction detection hypergraph (NVI-DEHR), a new approach that explicitly models high-order nonverbal interactions using hypergraphs.
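To make the triplet formalization concrete, here is a minimal Python sketch of the output structure such a detector might produce. The `InteractionTriplet` record, its field names, and the example labels are illustrative assumptions, not the NVI-DET specification.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Bounding box as (x1, y1, x2, y2) in pixel coordinates.
BBox = Tuple[float, float, float, float]

# Hypothetical record for one detected nonverbal interaction triplet:
# the individual, the social group they belong to, and the interaction label.
@dataclass
class InteractionTriplet:
    individual_box: BBox   # bounding box of the person
    group_box: BBox        # bounding box enclosing their social group
    interaction: str       # e.g., "waving", "nodding" (labels are assumed)
    confidence: float      # detector score in [0, 1]

# Usage: the structure an NVI-DET-style detector might emit for one image.
detections: List[InteractionTriplet] = [
    InteractionTriplet((120, 40, 210, 320), (90, 30, 480, 330), "waving", 0.87),
]
for t in detections:
    print(f"{t.interaction} (conf {t.confidence:.2f}): person {t.individual_box} in group {t.group_box}")
```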
arXiv Detail & Related papers (2024-07-11T02:14:06Z)
- A Human-Robot Mutual Learning System with Affect-Grounded Language Acquisition and Differential Outcomes Training [0.1812164955222814]
The paper presents a novel human-robot interaction setup for identifying robot homeostatic needs.
We adopted a differential outcomes training (DOT) protocol whereby the robot provides feedback specific to its internal needs.
We found evidence that DOT can enhance the human's learning efficiency, which in turn enables more efficient robot language acquisition.
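As a rough illustration of the differential-outcomes idea in this entry, here is a minimal Python sketch in which each internal need is paired with its own distinctive feedback signal. The need names and the feedback mapping are illustrative assumptions, not the paper's setup.

```python
# Hypothetical internal needs paired, per DOT, with a feedback signal unique
# to each; under non-differential training every need shares one generic signal.
NEED_FEEDBACK = {
    "energy": "green light + rising tone",
    "social": "blue light + chime",
    "rest":   "amber light + low hum",
}

def give_feedback(need: str, learner_guess: str, differential: bool = True) -> str:
    """Return the robot's feedback for the learner's guess about its need."""
    if learner_guess != need:
        return "no feedback"                # incorrect guesses go unreinforced
    if differential:
        return NEED_FEEDBACK[need]          # outcome specific to this need (DOT)
    return "generic beep"                   # same outcome for every need

# Usage: a correct guess about the "social" need under DOT vs. without it.
print(give_feedback("social", "social"))                      # blue light + chime
print(give_feedback("social", "social", differential=False))  # generic beep
```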
arXiv Detail & Related papers (2023-10-20T09:41:31Z)
- Robots with Different Embodiments Can Express and Influence Carefulness in Object Manipulation [104.5440430194206]
This work investigates the perception of object manipulations performed with a communicative intent by two robots.
We designed the robots' movements to communicate either carefulness or its absence during the transportation of objects.
arXiv Detail & Related papers (2022-08-03T13:26:52Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- Forecasting Nonverbal Social Signals during Dyadic Interactions with Generative Adversarial Neural Networks [0.0]
Successful social interaction is closely coupled with the interplay between nonverbal perception and action mechanisms.
Nonverbal gestures are expected to endow social robots with the capability of emphasizing their speech or showing their intentions.
Our research sheds light on modeling human behaviors in social interactions, specifically forecasting human nonverbal social signals during dyadic interactions.
arXiv Detail & Related papers (2021-10-18T15:01:32Z)
- A MultiModal Social Robot Toward Personalized Emotion Interaction [1.2183405753834562]
This study demonstrates a multimodal human-robot interaction (HRI) framework with reinforcement learning to enhance the robotic interaction policy.
The goal is to apply this framework in social scenarios so that robots can generate more natural and engaging interactions.
arXiv Detail & Related papers (2021-10-08T00:35:44Z)
- Few-shot Language Coordination by Modeling Theory of Mind [95.54446989205117]
We study the task of few-shot language coordination.
We require the lead agent to coordinate with a population of agents with different linguistic abilities.
This requires the ability to model the partner's beliefs, a vital component of human communication.
arXiv Detail & Related papers (2021-07-12T19:26:11Z)
- Let's be friends! A rapport-building 3D embodied conversational agent for the Human Support Robot [0.0]
Partial subtle mirroring of nonverbal behaviors during conversations (also known as mimicking or parallel empathy) is essential for rapport building.
Our research question is whether integrating an ECA able to mirror its interlocutor's facial expressions and head movements with a human-service robot will improve the user's experience.
Our contribution is the integration of an expressive ECA that tracks its interlocutor's face and mirrors their facial expressions and head movements in real time with a human support robot.
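As a rough sketch of what real-time partial mirroring might look like, here is a minimal Python loop that attenuates and delays the interlocutor's observed expression before applying it to the agent. The attenuation factor, delay, and data format are illustrative assumptions, not the paper's implementation.

```python
from collections import deque

# Hypothetical mirroring parameters: mimic at reduced intensity, after a short
# delay, so the reflection reads as empathic rather than mocking.
ATTENUATION = 0.5    # fraction of the observed expression intensity to reproduce
DELAY_FRAMES = 15    # delay before mirroring (e.g., ~0.5 s at 30 fps)

def mirror_stream(observed_frames):
    """Yield attenuated, delayed copies of observed expression intensities.

    observed_frames: iterable of dicts mapping expression names
    (e.g., "smile", "head_nod") to intensities in [0, 1].
    """
    buffer = deque()
    for frame in observed_frames:
        buffer.append(frame)
        if len(buffer) > DELAY_FRAMES:
            delayed = buffer.popleft()
            yield {name: ATTENUATION * value for name, value in delayed.items()}
        else:
            yield {}  # nothing to mirror yet

# Usage: a brief smile is mirrored at half intensity, 15 frames later.
frames = [{"smile": 0.8}] * 20
for out in mirror_stream(frames):
    pass  # feed `out` to the ECA's animation controller
```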
arXiv Detail & Related papers (2021-03-08T01:02:41Z)
- Disambiguating Affective Stimulus Associations for Robot Perception and Dialogue [67.89143112645556]
We provide a NICO robot with the ability to learn the associations between a perceived auditory stimulus and an emotional expression.
NICO is able to do this for both individual subjects and specific stimuli, with the aid of an emotion-driven dialogue system.
The robot is then able to use this information to determine a subject's enjoyment of perceived auditory stimuli in a real HRI scenario.
arXiv Detail & Related papers (2021-03-05T20:55:48Z)
- Joint Mind Modeling for Explanation Generation in Complex Human-Robot Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the explanations generated by our approach significantly improve collaboration performance and the user's perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)
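As a rough illustration of the hierarchical mind model described in the entry above, here is a minimal Python sketch that emits an explanation whenever the robot's goal state diverges from its estimate of the human's belief. The `GoalNode` structure, the divergence trigger, and the message format are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical node in a hierarchical mind model: a (sub)goal, the robot's
# own state, and its estimate of what the human believes about that state.
@dataclass
class GoalNode:
    name: str
    robot_active: bool = False                    # is the robot pursuing this goal?
    human_believes_active: Optional[bool] = None  # None = belief unknown
    subgoals: List["GoalNode"] = field(default_factory=list)

def explain_divergences(node: GoalNode) -> List[str]:
    """Emit an explanation wherever the robot's state and the estimated
    human belief disagree (an assumed, simplified trigger)."""
    messages = []
    if node.human_believes_active is not None and node.human_believes_active != node.robot_active:
        verb = "now working on" if node.robot_active else "no longer working on"
        messages.append(f"I am {verb} '{node.name}'.")
    for sub in node.subgoals:
        messages.extend(explain_divergences(sub))
    return messages

# Usage: the robot has started boiling water, but the human thinks it has not.
task = GoalNode("brew tea", robot_active=True, subgoals=[
    GoalNode("boil water", robot_active=True, human_believes_active=False),
])
print(explain_divergences(task))  # ["I am now working on 'boil water'."]
```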
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.