Perception for Humanoid Robots
- URL: http://arxiv.org/abs/2309.15616v1
- Date: Wed, 27 Sep 2023 12:32:11 GMT
- Title: Perception for Humanoid Robots
- Authors: Arindam Roychoudhury, Shahram Khorshidi, Subham Agrawal, Maren
Bennewitz
- Abstract summary: This review summarizes the recent developments and trends in the field of perception in humanoid robots.
Three main areas of application are identified, namely, internal state estimation, external environment estimation, and human-robot interaction.
- Score: 10.560498559084449
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Purpose of Review: In the field of humanoid robotics, perception plays a
fundamental role in enabling robots to interact seamlessly with humans and
their surroundings, leading to improved safety, efficiency, and user
experience. This scientific study investigates various perception modalities
and techniques employed in humanoid robots, including visual, auditory, and
tactile sensing by exploring recent state-of-the-art approaches for perceiving
and understanding the internal state, the environment, objects, and human
activities.
Recent Findings: Internal state estimation makes extensive use of Bayesian
filtering methods and optimization techniques based on maximum a-posteriori
formulation by utilizing proprioceptive sensing. In the area of external
environment understanding, with an emphasis on robustness and adaptability to
dynamic, unforeseen environmental changes, the recent research discussed in
this study has focused largely on multi-sensor fusion and machine learning,
in contrast to hand-crafted, rule-based systems. Human-robot
interaction methods have established the importance of contextual information
representation and memory for understanding human intentions.
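The Bayesian filtering approach mentioned above for internal state estimation can be illustrated with a minimal one-dimensional Kalman filter, the simplest instance of this family. This is an illustrative sketch only: the state, measurement model, and noise values are hypothetical placeholders, not estimators from any of the surveyed works.

```python
import numpy as np

def kalman_step(x, P, z, q=0.01, r=0.1):
    """One predict-update cycle of a 1-D Kalman filter.

    x, P : prior state estimate (e.g. a joint velocity) and its variance
    z    : new proprioceptive measurement (e.g. a noisy encoder reading)
    q, r : process and measurement noise variances (assumed values)
    """
    # Predict: the state is modeled as constant; uncertainty grows by q.
    x_pred, P_pred = x, P + q
    # Update: fuse the prediction with the measurement via the Kalman gain.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z - x_pred)   # posterior mean
    P_new = (1.0 - K) * P_pred          # posterior variance
    return x_new, P_new

# Fuse 50 noisy readings of a hypothetical true velocity of 1.0 rad/s.
rng = np.random.default_rng(0)
x, P = 0.0, 1.0
for _ in range(50):
    z = 1.0 + rng.normal(0.0, 0.3)
    x, P = kalman_step(x, P, z)
# The posterior mean drifts toward 1.0 and the variance shrinks well below
# the single-measurement noise, which is the point of recursive fusion.
```

The maximum a-posteriori formulations cited in the survey generalize this same predict-update structure to full-body dynamics over a window of measurements.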
Summary: This review summarizes the recent developments and trends in the
field of perception in humanoid robots. Three main areas of application are
identified, namely, internal state estimation, external environment estimation,
and human-robot interaction. The applications of diverse sensor modalities in
each of these areas are considered and recent significant works are discussed.
Related papers
- Improving Visual Perception of a Social Robot for Controlled and
In-the-wild Human-robot Interaction [10.260966795508569]
It is unclear how the objective interaction performance and subjective user experience will be influenced when a social robot adopts a deep-learning-based visual perception model.
We employ state-of-the-art human perception and tracking models to improve the visual perception function of the Pepper robot.
arXiv Detail & Related papers (2024-03-04T06:47:06Z)
- Real-time Addressee Estimation: Deployment of a Deep-Learning Model on
the iCub Robot [52.277579221741746]
Addressee Estimation is a skill essential for social robots to interact smoothly with humans.
Inspired by human perceptual skills, a deep-learning model for Addressee Estimation is designed, trained, and deployed on an iCub robot.
The study presents the procedure of such implementation and the performance of the model deployed in real-time human-robot interaction.
arXiv Detail & Related papers (2023-11-09T13:01:21Z)
- Embodied Agents for Efficient Exploration and Smart Scene Description [47.82947878753809]
We tackle a setting for visual navigation in which an autonomous agent needs to explore and map an unseen indoor environment.
We propose and evaluate an approach that combines recent advances in visual robotic exploration and image captioning.
Our approach can generate smart scene descriptions that maximize semantic knowledge of the environment and avoid repetitions.
arXiv Detail & Related papers (2023-01-17T19:28:01Z)
- Semantic-Aware Environment Perception for Mobile Human-Robot Interaction [2.309914459672557]
We present a vision-based system for mobile robots that enables semantic-aware environment perception without additional a-priori knowledge.
We deploy our system on a mobile humanoid robot that enables us to test our methods in real-world applications.
arXiv Detail & Related papers (2022-11-07T08:49:45Z)
- Causal Discovery of Dynamic Models for Predicting Human Spatial
Interactions [5.742409080817885]
We propose an application of causal discovery methods to model human-robot spatial interactions.
New methods and practical solutions are discussed to exploit, for the first time, a state-of-the-art causal discovery algorithm.
arXiv Detail & Related papers (2022-10-29T08:56:48Z)
- Data-driven emotional body language generation for social robotics [58.88028813371423]
In social robotics, endowing humanoid robots with the ability to generate bodily expressions of affect can improve human-robot interaction and collaboration.
We implement a deep learning data-driven framework that learns from a few hand-designed robotic bodily expressions.
The evaluation study found that the anthropomorphism and animacy of the generated expressions are not perceived differently from the hand-designed ones.
arXiv Detail & Related papers (2022-05-02T09:21:39Z)
- Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and
Robotics Together [68.44697646919515]
This paper presents several human-robot systems that utilize spatial computing to enable novel robot use cases.
The combination of spatial computing and egocentric sensing on mixed reality devices enables them to capture and understand human actions and translate these to actions with spatial meaning.
arXiv Detail & Related papers (2022-02-03T10:04:26Z)
- Human-Robot Collaboration and Machine Learning: A Systematic Review of
Recent Research [69.48907856390834]
Human-robot collaboration (HRC) is the approach that explores the interaction between a human and a robot.
This paper proposes a thorough literature review of the use of machine learning techniques in the context of HRC.
arXiv Detail & Related papers (2021-10-14T15:14:33Z)
- Semantics for Robotic Mapping, Perception and Interaction: A Survey [93.93587844202534]
The study of semantics addresses what the world "means" to a robot.
With humans and robots increasingly operating in the same world, the prospects of human-robot interaction also bring semantics into the picture.
Driven by need, as well as by enablers like increasing availability of training data and computational resources, semantics is a rapidly growing research area in robotics.
arXiv Detail & Related papers (2021-01-02T12:34:39Z)
- Towards hybrid primary intersubjectivity: a neural robotics library for
human science [4.232614032390374]
We study primary intersubjectivity as a second person perspective experience characterized by predictive engagement.
We propose an open-source methodology named neural robotics library (NRL) for experimental human-robot interaction.
We discuss some ways human-robot (hybrid) intersubjectivity can contribute to human science research.
arXiv Detail & Related papers (2020-06-29T11:35:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.