Natural Language Instructions for Intuitive Human Interaction with
Robotic Assistants in Field Construction Work
- URL: http://arxiv.org/abs/2307.04195v2
- Date: Tue, 11 Jul 2023 22:34:51 GMT
- Title: Natural Language Instructions for Intuitive Human Interaction with
Robotic Assistants in Field Construction Work
- Authors: Somin Park, Xi Wang, Carol C. Menassa, Vineet R. Kamat, Joyce Y. Chai
- Abstract summary: This paper proposes a framework to allow human workers to interact with construction robots based on natural language instructions.
The proposed method consists of three stages: Natural Language Understanding (NLU), Information Mapping (IM), and Robot Control (RC).
- Score: 4.223718588030052
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The introduction of robots is widely considered to have significant
potential to alleviate the worker shortage and stagnant productivity that
afflict the construction industry. However, it is challenging to use fully
automated robots in complex and unstructured construction sites. Human-Robot
Collaboration (HRC) has shown promise in combining human workers' flexibility
and robot assistants' physical abilities to jointly address the uncertainties
inherent in construction work. When introducing HRC in construction, it is
critical to recognize the importance of teamwork and supervision in field
construction and establish a natural and intuitive communication system for the
human workers and robotic assistants. Natural language-based interaction can
enable intuitive and familiar communication with robots for human workers who
are non-experts in robot programming. However, limited research has been
conducted on this topic in construction. This paper proposes a framework to
allow human workers to interact with construction robots based on natural
language instructions. The proposed method consists of three stages: Natural
Language Understanding (NLU), Information Mapping (IM), and Robot Control (RC).
Natural language instructions are input to a language model to predict a tag
for each word in the NLU module. The IM module uses the result of the NLU
module and building component information to generate the final instructional
output essential for a robot to acknowledge and perform the construction task.
A case study for drywall installation is conducted to evaluate the proposed
approach. The obtained results highlight the potential of using natural
language-based interaction to replicate the communication that occurs between
human workers within the context of human-robot teams.
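
The three-stage pipeline described above (NLU tags each word of an instruction, IM grounds those tags against building component information, and RC executes the resulting command) can be made concrete with a short sketch. The Python below is a minimal toy rendering of that flow, not the paper's implementation: the tag names, the Component record, and the function names are illustrative assumptions, and the keyword lookup stands in for the trained language model the paper uses in its NLU stage.

```python
# Toy sketch of the NLU -> IM -> RC pipeline from the abstract.
# Tag names, the Component record, and all functions are illustrative
# assumptions, not the paper's actual implementation.
from dataclasses import dataclass

@dataclass
class Component:
    """Hypothetical building-component record (e.g., from a building model)."""
    name: str
    position: tuple  # (x, y, z) in site coordinates

def nlu_tag(words):
    """Stage 1 (NLU): assign a tag to each word. The paper uses a trained
    language model for this; a keyword lookup stands in here."""
    keywords = {"install": "ACTION", "pick": "ACTION",
                "drywall": "OBJECT", "panel": "OBJECT",
                "left": "LOCATION", "corner": "LOCATION"}
    return [(w, keywords.get(w.lower(), "O")) for w in words]

def information_mapping(tagged, components):
    """Stage 2 (IM): combine word tags with building-component information
    to produce an instructional output the robot can act on."""
    action = next((w for w, t in tagged if t == "ACTION"), None)
    obj = next((w for w, t in tagged if t == "OBJECT"), None)
    target = next((c for c in components if obj and obj in c.name.lower()), None)
    if not (action and target):
        return None  # instruction incomplete; the worker must clarify
    return {"action": action.lower(), "component": target.name,
            "goal_pose": target.position}

def robot_control(instruction):
    """Stage 3 (RC): stand-in for the controller executing the instruction."""
    print(f"Executing {instruction['action']} on {instruction['component']} "
          f"at {instruction['goal_pose']}")

# Drywall-installation example in the spirit of the paper's case study.
components = [Component("drywall_panel_03", (4.2, 1.0, 0.0))]
instruction = information_mapping(
    nlu_tag("Install the drywall panel at the left corner".split()),
    components)
if instruction:
    robot_control(instruction)
```

In the paper's drywall-installation case study, the word tags would come from a trained language model and the component data from the building model; the toy versions here only show how each stage hands its output to the next.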
Related papers
- $π_0$: A Vision-Language-Action Flow Model for General Robot Control [77.32743739202543]
We propose a novel flow matching architecture built on top of a pre-trained vision-language model (VLM) to inherit Internet-scale semantic knowledge.
We evaluate our model on its ability to perform tasks zero-shot after pre-training, to follow language instructions from people, and to acquire new skills via fine-tuning.
arXiv Detail & Related papers (2024-10-31T17:22:30Z)
- HARMONIC: Cognitive and Control Collaboration in Human-Robotic Teams [0.0]
We demonstrate a cognitive strategy for robots in human-robot teams that incorporates metacognition, natural language communication, and explainability.
The system is embodied using the HARMONIC architecture that flexibly integrates cognitive and control capabilities.
arXiv Detail & Related papers (2024-09-26T16:48:21Z)
- Towards Human-Centered Construction Robotics: A Reinforcement Learning-Driven Companion Robot for Contextually Assisting Carpentry Workers [11.843554918145983]
This paper introduces a human-centered approach with a "work companion rover" designed to assist construction workers within their existing practices.
We conduct an in-depth study on deploying a robotic system in carpentry formwork, showcasing a prototype that emphasizes mobility, safety, and comfortable worker-robot collaboration.
arXiv Detail & Related papers (2024-03-27T23:55:02Z)
- RoboScript: Code Generation for Free-Form Manipulation Tasks across Real and Simulation [77.41969287400977]
This paper presents RobotScript, a platform for a deployable robot manipulation pipeline powered by code generation.
We also present a benchmark for code generation on robot manipulation tasks specified in free-form natural language.
We demonstrate the adaptability of our code generation framework across multiple robot embodiments, including the Franka and UR5 robot arms.
arXiv Detail & Related papers (2024-02-22T15:12:00Z)
- Exploring Large Language Models to Facilitate Variable Autonomy for Human-Robot Teaming [4.779196219827508]
We introduce a novel framework for a GPT-powered multi-robot testbed environment, based on a Unity Virtual Reality (VR) setting.
This system allows users to interact with robot agents through natural language, each powered by individual GPT cores.
A user study with 12 participants explores the effectiveness of GPT-4 and, more importantly, the strategies users adopt when given the opportunity to converse in natural language within a multi-robot environment.
arXiv Detail & Related papers (2023-12-12T12:26:48Z)
- A Human-Robot Mutual Learning System with Affect-Grounded Language Acquisition and Differential Outcomes Training [0.1812164955222814]
The paper presents a novel human-robot interaction setup for identifying robot homeostatic needs.
We adopted a differential outcomes training (DOT) protocol whereby the robot provides feedback specific to its internal needs.
We found evidence that DOT can enhance the human's learning efficiency, which in turn enables more efficient robot language acquisition.
arXiv Detail & Related papers (2023-10-20T09:41:31Z)
- Improved Trust in Human-Robot Collaboration with ChatGPT [1.086544864007391]
The paper explores the impact of ChatGPT on trust in a human-robot collaboration assembly task.
A human-subject experiment showed that incorporating ChatGPT in robots significantly increased trust in human-robot collaboration.
The findings of this study have significant implications for the development of human-robot collaboration systems.
arXiv Detail & Related papers (2023-04-25T02:48:35Z)
- "No, to the Right" -- Online Language Corrections for Robotic Manipulation via Shared Autonomy [70.45420918526926]
We present LILAC, a framework for incorporating and adapting to natural language corrections online during execution.
Instead of discrete turn-taking between a human and robot, LILAC splits agency between the human and robot.
We show that our corrections-aware approach obtains higher task completion rates, and is subjectively preferred by users.
arXiv Detail & Related papers (2023-01-06T15:03:27Z)
- Show Me What You Can Do: Capability Calibration on Reachable Workspace for Human-Robot Collaboration [83.4081612443128]
We show that a short calibration using REMP can effectively bridge the gap between what a non-expert user thinks a robot can reach and the ground truth.
We show that this calibration procedure not only results in better user perception, but also promotes more efficient human-robot collaborations.
arXiv Detail & Related papers (2021-03-06T09:14:30Z)
- Self-supervised reinforcement learning for speaker localisation with the iCub humanoid robot [58.2026611111328]
Looking at a person's face is one of the mechanisms that humans rely on when it comes to filtering speech in noisy environments.
Having a robot that can look toward a speaker could benefit automatic speech recognition (ASR) performance in challenging environments.
We propose a self-supervised reinforcement learning-based framework inspired by the early development of humans.
arXiv Detail & Related papers (2020-11-12T18:02:15Z)
- Joint Mind Modeling for Explanation Generation in Complex Human-Robot Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the explanations generated by our approach significantly improve collaboration performance and user perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)