Semantic-Aware Environment Perception for Mobile Human-Robot Interaction
- URL: http://arxiv.org/abs/2211.03367v1
- Date: Mon, 7 Nov 2022 08:49:45 GMT
- Title: Semantic-Aware Environment Perception for Mobile Human-Robot Interaction
- Authors: Thorsten Hempel, Marc-André Fiedler, Aly Khalifa, Ayoub Al-Hamadi, Laslo Dinges
- Abstract summary: We present a vision-based system for mobile robots to enable semantic-aware environment perception without additional a priori knowledge.
We deploy our system on a mobile humanoid robot that enables us to test our methods in real-world applications.
- Score: 2.309914459672557
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Current technological advances open up new opportunities for bringing
human-machine interaction to a new level of human-centered cooperation. In this
context, a key issue is the semantic understanding of the environment, which
enables mobile robots to engage in more complex interactions and facilitates
communication with humans. Prerequisites are the vision-based registration of
semantic objects and humans, where the latter are further analyzed as
potential interaction partners. Despite significant research achievements, the
reliable and fast registration of semantic information remains a challenging
task for mobile robots in real-world scenarios. In this paper, we present a
vision-based system for mobile assistive robots that enables semantic-aware
environment perception without additional a priori knowledge. We deploy our
system on a mobile humanoid robot, which allows us to test our methods in
real-world applications.
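The abstract describes the pipeline only at a high level. As a rough illustration of the described flow (register semantic objects and humans, then screen detected humans as potential interaction partners), here is a minimal Python sketch; the detector, the head-pose estimator, and the facing threshold are hypothetical placeholders, not the authors' actual components.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # semantic class, e.g. "person" or "chair"
    box: tuple        # (x1, y1, x2, y2) in image coordinates
    score: float

def detect_objects(frame):
    """Dummy stand-in for the vision-based object detector (hypothetical)."""
    return [Detection("chair", (0, 0, 50, 80), 0.9),
            Detection("person", (60, 0, 120, 150), 0.95)]

def head_yaw_deg(frame, box):
    """Dummy stand-in for head-pose estimation (hypothetical): yaw of
    the person's head relative to the camera, in degrees."""
    return 12.0

def perceive(frame, facing_threshold_deg=30.0):
    """One perception cycle: register all semantic objects, then screen
    detected humans as potential interaction partners."""
    semantic_objects, partners = [], []
    for det in detect_objects(frame):
        semantic_objects.append(det)               # register every object
        if det.label == "person":
            yaw = head_yaw_deg(frame, det.box)     # analyze the human
            if abs(yaw) < facing_threshold_deg:    # roughly facing the robot
                partners.append(det)
    return semantic_objects, partners

objects, partners = perceive(frame=None)
print(len(objects), len(partners))   # 2 registered objects, 1 partner
```

In the real system the two placeholder functions would be the paper's own vision models; only the register-then-screen structure is taken from the abstract.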
Related papers
- Language-guided Robust Navigation for Mobile Robots in Dynamically-changing Environments [26.209402619114353]
We develop an embodied AI system for human-in-the-loop navigation with a wheeled mobile robot.
We propose a method of monitoring the robot's current plan to detect changes in the environment that impact the intended trajectory of the robot.
This work can support applications like precision agriculture and construction, where persistent monitoring of the environment provides a human with information about the environment state.
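The summary does not spell out how plan monitoring works. One plausible minimal reading, sketched below under that assumption, is to re-check the remaining waypoints of the intended trajectory against an occupancy grid updated from perception; the grid layout, threshold, and function names are invented for illustration.

```python
import numpy as np

def plan_blocked(occupancy, waypoints, threshold=0.5):
    """Return the index of the first waypoint whose grid cell is now
    occupied, or None if the current plan is still clear.

    occupancy -- 2D grid of occupancy probabilities, updated as the
                 robot perceives changes in the environment
    waypoints -- sequence of (row, col) cells along the intended path
    """
    for i, (r, c) in enumerate(waypoints):
        if occupancy[r, c] > threshold:   # environment changed under the plan
            return i
    return None

# Usage: a newly observed obstacle invalidates the remaining plan.
grid = np.zeros((10, 10))
grid[4, 5] = 0.9                          # obstacle detected mid-execution
path = [(4, 3), (4, 4), (4, 5), (4, 6)]
blocked = plan_blocked(grid, path)
if blocked is not None:
    print(f"Replan needed: waypoint {blocked} is blocked")  # -> waypoint 2
```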
arXiv Detail & Related papers (2024-09-28T21:30:23Z)
- Robot Interaction Behavior Generation based on Social Motion Forecasting for Human-Robot Interaction [9.806227900768926]
We propose to model social motion forecasting in a shared human-robot representation space.
ECHO operates in the aforementioned shared space to predict the future motions of the agents encountered in social scenarios.
We evaluate our model in multi-person and human-robot motion forecasting tasks and obtain state-of-the-art performance by a large margin.
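ECHO's architecture is not detailed here. The snippet below is only a schematic stand-in for the general idea of forecasting in a shared representation space (encode past poses of any agent, human or robot, into one latent space and decode future motion); all dimensions and module names are invented.

```python
import torch
import torch.nn as nn

class SharedSpaceForecaster(nn.Module):
    """Schematic forecaster: past poses of any agent (human or robot)
    are mapped into one shared latent space, from which future motion
    is decoded. Sizes are illustrative only, not ECHO's design."""

    def __init__(self, pose_dim=51, past=10, future=25, latent=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(past * pose_dim, latent), nn.ReLU(),
            nn.Linear(latent, latent),
        )
        self.decoder = nn.Linear(latent, future * pose_dim)
        self.future, self.pose_dim = future, pose_dim

    def forward(self, past_poses):               # (batch, past, pose_dim)
        z = self.encoder(past_poses.flatten(1))  # shared representation
        return self.decoder(z).view(-1, self.future, self.pose_dim)

model = SharedSpaceForecaster()
pred = model(torch.randn(4, 10, 51))             # -> (4, 25, 51)
```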
arXiv Detail & Related papers (2024-02-07T11:37:14Z)
- Habitat 3.0: A Co-Habitat for Humans, Avatars and Robots [119.55240471433302]
Habitat 3.0 is a simulation platform for studying collaborative human-robot tasks in home environments.
It addresses challenges in modeling complex deformable bodies and diversity in appearance and motion.
Human-in-the-loop infrastructure enables real human interaction with simulated robots via mouse/keyboard or a VR interface.
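To make the human-in-the-loop pattern concrete, the toy loop below pairs keyboard-style human commands with a robot policy inside a stub simulator; every class, method, and action name here is hypothetical and does not correspond to Habitat 3.0's real API.

```python
KEY_TO_ACTION = {"w": "move_forward", "a": "turn_left", "d": "turn_right"}

class ToySim:
    """Minimal stub standing in for a two-agent simulator."""
    def reset(self):
        self.t = 0
        return {"step": self.t}
    def step(self, human, robot):
        self.t += 1
        return {"step": self.t}, self.t >= 3      # done after 3 steps

def run_episode(sim, robot_policy, read_key, max_steps=500):
    obs = sim.reset()
    for _ in range(max_steps):
        human_action = KEY_TO_ACTION.get(read_key(), "stop")  # human input
        robot_action = robot_policy(obs)                      # robot policy
        obs, done = sim.step(human=human_action, robot=robot_action)
        if done:
            break

run_episode(ToySim(), robot_policy=lambda obs: "move_forward",
            read_key=lambda: "w")
```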
arXiv Detail & Related papers (2023-10-19T17:29:17Z)
- HandMeThat: Human-Robot Communication in Physical and Social Environments [73.91355172754717]
HandMeThat is a benchmark for a holistic evaluation of instruction understanding and following in physical and social environments.
HandMeThat contains 10,000 episodes of human-robot interactions.
We show that both offline and online reinforcement learning algorithms perform poorly on HandMeThat.
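The evaluation behind that result can be pictured as a plain success-rate loop over episodes. The toy sketch below assumes a hypothetical episode interface (pick the object the human meant out of several candidates); it is not HandMeThat's actual API.

```python
import random

class ToyEpisode:
    """Toy stand-in for a HandMeThat-style episode (the real episode
    API is an assumption here): the agent must pick the object the
    human meant out of n candidates."""
    def __init__(self, n_candidates=5):
        self.target = random.randrange(n_candidates)
        self.n = n_candidates
    def reset(self):
        return {"candidates": self.n}             # observation
    def step(self, action):
        self.success = (action == self.target)
        return None, True                         # single-step toy task
    def succeeded(self):
        return self.success

def evaluate(policy, episodes):
    """Success rate of `policy` over a set of episodes."""
    wins = 0
    for ep in episodes:
        obs, done = ep.reset(), False
        while not done:
            obs, done = ep.step(policy(obs))
        wins += ep.succeeded()
    return wins / len(episodes)

random_policy = lambda obs: random.randrange(obs["candidates"])
print(evaluate(random_policy, [ToyEpisode() for _ in range(1000)]))  # ~0.2
```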
arXiv Detail & Related papers (2023-10-05T16:14:46Z)
- Perception for Humanoid Robots [10.560498559084449]
This review summarizes the recent developments and trends in the field of perception in humanoid robots.
Three main areas of application are identified, namely, internal state estimation, external environment estimation, and human-robot interaction.
arXiv Detail & Related papers (2023-09-27T12:32:11Z)
- HERD: Continuous Human-to-Robot Evolution for Learning from Human Demonstration [57.045140028275036]
We show that manipulation skills can be transferred from a human to a robot through the use of micro-evolutionary reinforcement learning.
We propose an algorithm for multi-dimensional evolution path searching that allows joint optimization of both the robot evolution path and the policy.
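As a toy picture of multi-dimensional evolution path searching, the greedy sketch below treats the morphology as a vector interpolating from human-like (0) to robot-like (1) and at each step advances whichever dimension costs the least performance; this is an invented simplification, not HERD's algorithm.

```python
import numpy as np

def greedy_evolution_path(evaluate, dims=4, steps_per_dim=5):
    """Toy multi-dimensional evolution-path search.

    morphology[i] in [0, 1]: 0 = human-like, 1 = robot-like.
    `evaluate(morphology)` returns the task performance of a policy
    adapted to that intermediate body (a black box here). Greedily
    advance the dimension that loses the least performance.
    """
    m = np.zeros(dims)
    path = [m.copy()]
    step = 1.0 / steps_per_dim
    while m.min() < 1.0:
        candidates = []
        for i in range(dims):
            if m[i] < 1.0:
                trial = m.copy()
                trial[i] = min(1.0, trial[i] + step)
                candidates.append((evaluate(trial), trial))
        _, m = max(candidates, key=lambda c: c[0])
        path.append(m.copy())
    return path

# Toy objective: performance drops as the body diverges from a
# human-friendly configuration, with some dimensions costlier.
cost = np.array([0.1, 0.4, 0.2, 0.3])
perf = lambda m: 1.0 - float(cost @ m)
print(len(greedy_evolution_path(perf)))   # number of path steps
```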
arXiv Detail & Related papers (2022-12-08T15:56:13Z)
- Aligning Robot Representations with Humans [5.482532589225552]
A key question is how to best transfer knowledge learned in one environment to another, where shifting constraints and human preferences render adaptation challenging.
We postulate that because humans will be the ultimate evaluator of system success in the world, they are best suited to communicating the aspects of the tasks that matter to the robot.
We highlight three areas where we can use this approach to build interactive systems and offer future directions of work to better create advanced collaborative robots.
arXiv Detail & Related papers (2022-05-15T15:51:05Z)
- Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and Robotics Together [68.44697646919515]
This paper presents several human-robot systems that utilize spatial computing to enable novel robot use cases.
The combination of spatial computing and egocentric sensing on mixed reality devices enables them to capture and understand human actions and translate these to actions with spatial meaning.
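Translating egocentric observations into "actions with spatial meaning" ultimately comes down to chaining coordinate transforms. The sketch below re-expresses a point indicated in a headset's frame in the robot's frame via homogeneous matrices; the concrete poses are made-up example values.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Example poses (invented values): headset and robot, both expressed in
# a shared world frame that spatial computing on the MR device provides.
T_world_headset = make_T(np.eye(3), [1.0, 0.5, 1.6])
T_world_robot = make_T(np.eye(3), [3.0, 0.0, 0.0])

# A point the human indicates, given in the headset (egocentric) frame.
p_headset = np.array([0.4, 0.0, 1.2, 1.0])     # homogeneous coordinates

# Re-express it in the robot's frame: robot <- world <- headset.
p_robot = np.linalg.inv(T_world_robot) @ T_world_headset @ p_headset
print(p_robot[:3])   # a target the robot can now act on spatially
```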
arXiv Detail & Related papers (2022-02-03T10:04:26Z) - Semantics for Robotic Mapping, Perception and Interaction: A Survey [93.93587844202534]
The study of understanding, referred to as semantics, dictates what the world "means" to a robot.
With humans and robots increasingly operating in the same world, the prospects of human-robot interaction also bring semantics into the picture.
Driven by need, as well as by enablers like increasing availability of training data and computational resources, semantics is a rapidly growing research area in robotics.
arXiv Detail & Related papers (2021-01-02T12:34:39Z) - SAPIEN: A SimulAted Part-based Interactive ENvironment [77.4739790629284]
SAPIEN is a realistic and physics-rich simulated environment that hosts a large-scale set of articulated objects.
We evaluate state-of-the-art vision algorithms for part detection and motion attribute recognition as well as demonstrate robotic interaction tasks.
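Part detection evaluation of the kind mentioned above typically reduces to box overlap. The minimal intersection-over-union check below is a generic sketch of such a metric, not SAPIEN's evaluation code.

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda box: (box[2] - box[0]) * (box[3] - box[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

# A predicted part box counts as correct if it overlaps the ground
# truth enough (0.5 is the usual detection threshold).
pred, gt = (10, 10, 50, 50), (12, 8, 48, 52)
print(iou(pred, gt) >= 0.5)   # True
```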
arXiv Detail & Related papers (2020-03-19T00:11:34Z)