Motion Planning Combines Psychological Safety and Motion Prediction for a Sense Motive Robot
- URL: http://arxiv.org/abs/2010.11671v2
- Date: Fri, 23 Oct 2020 15:32:08 GMT
- Title: Motion Planning Combines Psychological Safety and Motion Prediction for a Sense Motive Robot
- Authors: Hejing Ling, Guoliang Liu, Guohui Tian
- Abstract summary: This paper addresses the human safety issue by covering both the physical safety and psychological safety aspects.
First, we introduce an adaptive robot velocity control and step size adjustment method based on human facial expressions, so that the robot can adjust its movement to maintain safety when the human's emotion is unusual.
Second, we predict human motion by detecting sudden changes in the human's head pose and gaze direction, so that the robot can infer whether the human's attention is distracted, predict the human's next move, and rebuild a repulsive force to avoid a potential collision.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Human safety is the most important requirement for human-robot
interaction and collaboration (HRIC); it refers not only to physical safety but
also to psychological safety. Although many robots with different
configurations have entered our living and working environments, human safety
remains an open research problem in human-robot coexistence scenarios. This
paper addresses the human safety issue by covering both the physical and
psychological safety aspects. First, we introduce an adaptive robot velocity
control and step size adjustment method based on human facial expressions, so
that the robot can adjust its movement to maintain safety when the human's
emotion is unusual. Second, we predict human motion by detecting sudden changes
in the human's head pose and gaze direction, so that the robot can infer
whether the human's attention is distracted, predict the human's next move, and
rebuild a repulsive force to avoid a potential collision. Finally, we
demonstrate our idea using a 7-DOF TIAGo robot in a dynamic HRIC environment,
which shows that the robot becomes sense motive and responds to human action
and emotion changes quickly and efficiently.
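The abstract describes the approach only at a high level, so the following Python sketch is purely an illustration of the two ideas it names, not the authors' implementation: (1) scaling the robot's commanded velocity and planning step size by an emotion-dependent factor, and (2) extrapolating the human's motion when a sudden head-pose or gaze change suggests distraction, then building an artificial-potential-field-style repulsive force from that prediction. All emotion labels, thresholds, gains, and function names below are hypothetical placeholders.

```python
import numpy as np

# Minimal sketch of the two ideas in the abstract. This is NOT the paper's actual
# implementation; emotion labels, thresholds, and gains are assumed placeholders.

EMOTION_SPEED_SCALE = {   # slow the robot down when the detected emotion is "unusual"
    "neutral": 1.0,
    "happy": 1.0,
    "surprised": 0.5,
    "fearful": 0.3,
    "angry": 0.3,
}

def adapt_velocity(nominal_velocity, nominal_step, emotion):
    """Scale the commanded velocity and planning step size by an emotion-derived factor."""
    scale = EMOTION_SPEED_SCALE.get(emotion, 0.5)   # unknown emotion -> be conservative
    return scale * nominal_velocity, scale * nominal_step

def predict_human_position(human_pos, human_vel, head_pose_rate, gaze_rate,
                           horizon=1.0, rate_threshold=0.8):
    """If the head pose or gaze direction changes abruptly, treat the human as
    distracted and extrapolate their position over a short horizon.
    Positions and velocities are assumed to be float numpy arrays."""
    distracted = (np.linalg.norm(head_pose_rate) > rate_threshold
                  or np.linalg.norm(gaze_rate) > rate_threshold)
    return human_pos + horizon * human_vel if distracted else human_pos

def repulsive_force(robot_pos, predicted_human_pos, influence_radius=1.0, gain=2.0):
    """Artificial-potential-field-style repulsion centered on the predicted human position."""
    offset = robot_pos - predicted_human_pos
    dist = float(np.linalg.norm(offset))
    if dist < 1e-9 or dist >= influence_radius:
        return np.zeros_like(robot_pos)
    # Classic APF repulsive magnitude: grows sharply as the robot enters the influence radius.
    magnitude = gain * (1.0 / dist - 1.0 / influence_radius) / dist**2
    return magnitude * (offset / dist)
```

In a complete planner this repulsive term would be summed with the usual attractive (goal) force before the next motion step is computed; the abstract only states that such a force is "rebuilt" from the predicted human motion, so the specific form above is an assumption.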
Related papers
- Real-Time Dynamic Robot-Assisted Hand-Object Interaction via Motion Primitives [45.256762954338704]
We propose an approach to enhancing physical HRI with a focus on dynamic robot-assisted hand-object interaction.
We employ a transformer-based algorithm to perform real-time 3D modeling of human hands from single RGB images.
The robot's action implementation is dynamically fine-tuned using the continuously updated 3D hand models.
arXiv Detail & Related papers (2024-05-29T21:20:16Z)
- Guessing human intentions to avoid dangerous situations in caregiving robots [1.3546242205182986]
We propose an algorithm that detects risky situations for humans, selecting a robot action that removes the danger in real time.
We use the simulation-based approach to ATM and adopt the 'like-me' policy to assign intentions and actions to people.
The algorithm has been implemented as part of an existing cognitive architecture and tested in simulation scenarios.
arXiv Detail & Related papers (2024-03-24T20:43:29Z)
- Habitat 3.0: A Co-Habitat for Humans, Avatars and Robots [119.55240471433302]
Habitat 3.0 is a simulation platform for studying collaborative human-robot tasks in home environments.
It addresses challenges in modeling complex deformable bodies and diversity in appearance and motion.
Human-in-the-loop infrastructure enables real human interaction with simulated robots via mouse/keyboard or a VR interface.
arXiv Detail & Related papers (2023-10-19T17:29:17Z)
- HERD: Continuous Human-to-Robot Evolution for Learning from Human Demonstration [57.045140028275036]
We show that manipulation skills can be transferred from a human to a robot through the use of micro-evolutionary reinforcement learning.
We propose an algorithm for multi-dimensional evolution path searching that allows joint optimization of both the robot evolution path and the policy.
arXiv Detail & Related papers (2022-12-08T15:56:13Z)
- Robots with Different Embodiments Can Express and Influence Carefulness in Object Manipulation [104.5440430194206]
This work investigates the perception of object manipulations performed with a communicative intent by two robots.
We designed the robots' movements to communicate carefulness or not during the transportation of objects.
arXiv Detail & Related papers (2022-08-03T13:26:52Z)
- From Movement Kinematics to Object Properties: Online Recognition of Human Carefulness [112.28757246103099]
We show how a robot can infer online, from vision alone, whether or not the human partner is careful when moving an object.
We demonstrated that a humanoid robot could perform this inference with high accuracy (up to 81.3%) even with a low-resolution camera.
The prompt recognition of movement carefulness from observing the partner's action will allow robots to adapt their actions on the object to show the same degree of care as their human partners.
arXiv Detail & Related papers (2021-09-01T16:03:13Z)
- Show Me What You Can Do: Capability Calibration on Reachable Workspace for Human-Robot Collaboration [83.4081612443128]
We show that a short calibration using REMP can effectively bridge the gap between what a non-expert user thinks a robot can reach and the ground-truth.
We show that this calibration procedure not only results in better user perception, but also promotes more efficient human-robot collaborations.
arXiv Detail & Related papers (2021-03-06T09:14:30Z)
- Sensorimotor representation learning for an "active self" in robots: A model survey [10.649413494649293]
In humans, these capabilities are thought to be related to our ability to perceive our body in space.
This paper reviews the developmental processes of underlying mechanisms of these abilities.
We propose a theoretical computational framework, which aims to allow the emergence of the sense of self in artificial agents.
arXiv Detail & Related papers (2020-11-25T16:31:01Z)
- Warmth and Competence to Predict Human Preference of Robot Behavior in Physical Human-Robot Interaction [0.8594140167290099]
Social cognition posits that the dimensions Warmth and Competence are central and universal dimensions characterizing other humans.
The Robotic Social Attribute Scale (RoSAS) proposes items for those dimensions suitable for HRI and validated them in a visual observation study.
We found that Warmth and Competence, among all RoSAS and Godspeed dimensions, are the most important predictors for human preferences between different robot behaviors.
arXiv Detail & Related papers (2020-08-13T10:19:47Z)
- Human Grasp Classification for Reactive Human-to-Robot Handovers [50.91803283297065]
We propose an approach for human-to-robot handovers in which the robot meets the human halfway.
We collect a human grasp dataset which covers typical ways of holding objects with various hand shapes and poses.
We present a planning and execution approach that takes the object from the human hand according to the detected grasp and hand position.
arXiv Detail & Related papers (2020-03-12T19:58:03Z)
- Human Perception of Intrinsically Motivated Autonomy in Human-Robot Interaction [2.485182034310304]
A challenge in using robots in human-inhabited environments is to design behavior that is engaging, yet robust to the perturbations induced by human interaction.
Our idea is to imbue the robot with intrinsic motivation (IM) so that it can handle new situations and appears as a genuine social other to humans.
This article presents a "robotologist" study design that allows comparing autonomously generated behaviors with each other.
arXiv Detail & Related papers (2020-02-14T09:49:36Z)