The role of haptic communication in dyadic collaborative object manipulation tasks
- URL: http://arxiv.org/abs/2203.01287v1
- Date: Wed, 2 Mar 2022 18:13:54 GMT
- Title: The role of haptic communication in dyadic collaborative object manipulation tasks
- Authors: Yiming Liu, Raz Leib, William Dudley, Ali Shafti, A. Aldo Faisal, David W. Franklin
- Abstract summary: We investigate the role of haptics in human collaborative physical tasks.
We present a task to balance a ball at a target position on a board.
We find that humans can better coordinate with one another when haptic feedback is available.
- Score: 6.46682752231823
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Intuitive and efficient physical human-robot collaboration relies on the
mutual observability of the human and the robot, i.e. the two entities being
able to interpret each other's intentions and actions. This need is typically
addressed through a myriad of methods involving human sensing or intention
decoding, as well as human-robot turn-taking and sequential task planning.
However, the physical
interaction establishes a rich channel of communication through forces, torques
and haptics in general, which is often overlooked in industrial implementations
of human-robot interaction. In this work, we investigate the role of haptics in
human collaborative physical tasks, to identify how to integrate physical
communication in human-robot teams. We present a task to balance a ball at a
target position on a board either bimanually by one participant, or dyadically
by two participants, with and without haptic information. The task requires
that the two sides coordinate with each other, in real-time, to balance the
ball at the target. We found that, with training, the completion time and the
number of velocity peaks of the ball decreased, and that participants gradually
became consistent in their braking strategy. Moreover, we found that the
presence of haptic information improved performance (decreased completion time)
and led
to an increase in overall cooperative movements. Overall, our results show that
humans can better coordinate with one another when haptic feedback is
available. These results also highlight the likely importance of haptic
communication in human-robot physical interaction, both as a tool to infer
human intentions and to make the robot behaviour interpretable to humans.
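The two performance measures reported above, completion time and the number of velocity peaks of the ball, can be recovered from a sampled ball trajectory. The sketch below is a minimal illustration of how such metrics might be computed; the function names and thresholds (min_height, radius, dwell) are assumptions for illustration, not the authors' exact analysis criteria.

```python
import numpy as np

def velocity_peaks(position, dt, min_height=0.05):
    """Count local maxima in the ball's speed profile.

    position   : (T, 2) array of ball positions on the board [m]
    dt         : sampling interval [s]
    min_height : ignore peaks slower than this [m/s]; illustrative threshold
    """
    velocity = np.gradient(position, dt, axis=0)   # finite-difference velocity, (T, 2)
    speed = np.linalg.norm(velocity, axis=1)       # scalar speed profile, (T,)
    interior = speed[1:-1]
    # A sample is a peak if it exceeds both neighbours and the height threshold.
    is_peak = (interior > speed[:-2]) & (interior > speed[2:]) & (interior >= min_height)
    return int(is_peak.sum())

def completion_time(position, target, dt, radius=0.02, dwell=1.0):
    """Time until the ball first stays within `radius` of `target` for `dwell` seconds.

    All tolerances here are illustrative; the paper's exact success criteria may differ.
    """
    dist = np.linalg.norm(position - np.asarray(target), axis=1)
    needed = int(round(dwell / dt))
    run = 0
    for t, inside in enumerate(dist <= radius):
        run = run + 1 if inside else 0
        if run >= needed:
            return (t - needed + 1) * dt   # time at which the dwell period began
    return None  # ball was never held at the target long enough
```

Averaging such metrics across repeated trials, with and without haptic feedback, is one way the reported training and haptic effects could be quantified.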
Related papers
- Dreaming to Assist: Learning to Align with Human Objectives for Shared Control in High-Speed Racing [10.947581892636629]
Tight coordination is required for effective human-robot teams in domains involving fast dynamics and tactical decisions.
We present Dream2Assist, a framework that combines a rich world model able to infer human objectives and value functions.
We show that the combined human-robot team, when blending its actions with those of the human, outperforms the synthetic humans alone.
arXiv Detail & Related papers (2024-10-14T01:00:46Z)
- HARMONIC: Cognitive and Control Collaboration in Human-Robotic Teams [0.0]
We demonstrate a cognitive strategy for robots in human-robot teams that incorporates metacognition, natural language communication, and explainability.
The system is embodied using the HARMONIC architecture that flexibly integrates cognitive and control capabilities.
arXiv Detail & Related papers (2024-09-26T16:48:21Z)
- Habitat 3.0: A Co-Habitat for Humans, Avatars and Robots [119.55240471433302]
Habitat 3.0 is a simulation platform for studying collaborative human-robot tasks in home environments.
It addresses challenges in modeling complex deformable bodies and diversity in appearance and motion.
Human-in-the-loop infrastructure enables real human interaction with simulated robots via mouse/keyboard or a VR interface.
arXiv Detail & Related papers (2023-10-19T17:29:17Z)
- HODN: Disentangling Human-Object Feature for HOI Detection [51.48164941412871]
We propose a Human and Object Disentangling Network (HODN) to model the Human-Object Interaction (HOI) relationships explicitly.
Considering that human features are more contributive to interaction, we propose a Human-Guide Linking method to make sure the interaction decoder focuses on the human-centric regions.
Our proposed method achieves competitive performance on both the V-COCO and HICO-Det datasets.
arXiv Detail & Related papers (2023-08-20T04:12:50Z)
- Robots with Different Embodiments Can Express and Influence Carefulness in Object Manipulation [104.5440430194206]
This work investigates the perception of object manipulations performed with a communicative intent by two robots.
We designed the robots' movements to communicate carefulness or not during the transportation of objects.
arXiv Detail & Related papers (2022-08-03T13:26:52Z)
- Show Me What You Can Do: Capability Calibration on Reachable Workspace for Human-Robot Collaboration [83.4081612443128]
We show that a short calibration using REMP can effectively bridge the gap between what a non-expert user thinks a robot can reach and the ground truth.
We show that this calibration procedure not only results in better user perception, but also promotes more efficient human-robot collaborations.
arXiv Detail & Related papers (2021-03-06T09:14:30Z)
- Joint Mind Modeling for Explanation Generation in Complex Human-Robot Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the generated explanations of our approach significantly improve the collaboration performance and user perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)
- Human Grasp Classification for Reactive Human-to-Robot Handovers [50.91803283297065]
We propose an approach for human-to-robot handovers in which the robot meets the human halfway.
We collect a human grasp dataset which covers typical ways of holding objects with various hand shapes and poses.
We present a planning and execution approach that takes the object from the human hand according to the detected grasp and hand position.
arXiv Detail & Related papers (2020-03-12T19:58:03Z)
- Human-robot co-manipulation of extended objects: Data-driven models and control from analysis of human-human dyads [2.7036498789349244]
We use data from human-human dyad experiments to determine motion intent for a physical human-robot co-manipulation task.
We develop a deep neural network based on motion data from human-human trials to predict human intent based on past motion.
arXiv Detail & Related papers (2020-01-03T21:23:12Z)
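The last entry above describes a deep neural network that predicts human intent from past motion. As a hedged illustration only (that paper's actual architecture, state representation, and training setup are not specified here), a minimal sequence-to-one predictor in PyTorch might look like this:

```python
import torch
import torch.nn as nn

class IntentPredictor(nn.Module):
    """Toy sequence model: predict the next motion step from a window of past motion.

    An illustrative stand-in for the kind of network described in the
    co-manipulation paper, not its actual architecture.
    """
    def __init__(self, state_dim=6, hidden_dim=64):
        super().__init__()
        self.rnn = nn.LSTM(state_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, state_dim)

    def forward(self, past):               # past: (batch, window, state_dim)
        features, _ = self.rnn(past)       # encode the motion history
        return self.head(features[:, -1])  # predict the next state, (batch, state_dim)

# Usage sketch: windows of 50 past samples of a hypothetical 6-DoF motion state.
model = IntentPredictor()
past = torch.randn(8, 50, 6)               # batch of synthetic motion histories
next_state = model(past)                   # (8, 6) predicted next motion state
```

In a co-manipulation controller, such a prediction could serve as the robot's estimate of where the human intends to move the shared object next.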
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.