ROS-PyBullet Interface: A Framework for Reliable Contact Simulation and
Human-Robot Interaction
- URL: http://arxiv.org/abs/2210.06887v1
- Date: Thu, 13 Oct 2022 10:31:36 GMT
- Title: ROS-PyBullet Interface: A Framework for Reliable Contact Simulation and
Human-Robot Interaction
- Authors: Christopher E. Mower, Theodoros Stouraitis, João Moura, Christian
Rauch, Lei Yan, Nazanin Zamani Behabadi, Michael Gienger, Tom Vercauteren,
Christos Bergeles, Sethu Vijayakumar
- Abstract summary: We present the ROS-PyBullet Interface, a framework that provides a bridge between the reliable contact/impact simulator PyBullet and the Robot Operating System (ROS).
Furthermore, we provide additional utilities for facilitating Human-Robot Interaction (HRI) in the simulated environment.
- Score: 17.093672006793984
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Reliable contact simulation plays a key role in the development of
(semi-)autonomous robots, especially when dealing with contact-rich
manipulation scenarios, an active robotics research topic. Besides simulation,
components such as sensing, perception, data collection, robot hardware
control, human interfaces, etc. are all key enablers towards applying machine
learning algorithms or model-based approaches in real world systems. However,
there is a lack of software connecting reliable contact simulation with the
larger robotics ecosystem (i.e. ROS, Orocos), for a more seamless application
of novel approaches, found in the literature, to existing robotic hardware. In
this paper, we present the ROS-PyBullet Interface, a framework that provides a
bridge between the reliable contact/impact simulator PyBullet and the Robot
Operating System (ROS). Furthermore, we provide additional utilities for
facilitating Human-Robot Interaction (HRI) in the simulated environment. We
also present several use-cases that highlight the capabilities and usefulness
of our framework. Please check our video, source code, and examples included in
the supplementary material. Our full code base is open source and can be found
at https://github.com/cmower/ros_pybullet_interface.
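To make the bridging idea concrete, the sketch below shows a minimal, hypothetical PyBullet-to-ROS node: it steps a headless PyBullet simulation inside a ROS loop and publishes the resulting joint states as sensor_msgs/JointState messages. This is not the ros_pybullet_interface API; the robot URDF, topic name, and loop rate are assumptions chosen purely for illustration.
```python
#!/usr/bin/env python3
# Minimal sketch of a PyBullet-to-ROS bridge node (NOT the
# ros_pybullet_interface API). It steps a headless simulation and
# publishes joint states; URDF, topic name, and rate are assumptions.
import pybullet as pb
import pybullet_data
import rospy
from sensor_msgs.msg import JointState


def main():
    rospy.init_node("pybullet_bridge_sketch")

    # Start a headless (DIRECT) PyBullet instance and load a sample robot
    # shipped with pybullet_data.
    pb.connect(pb.DIRECT)
    pb.setAdditionalSearchPath(pybullet_data.getDataPath())
    pb.setGravity(0, 0, -9.81)
    robot = pb.loadURDF("kuka_iiwa/model.urdf", useFixedBase=True)

    n_joints = pb.getNumJoints(robot)
    joint_names = [
        pb.getJointInfo(robot, i)[1].decode("utf-8") for i in range(n_joints)
    ]

    pub = rospy.Publisher("joint_states", JointState, queue_size=1)
    rate = rospy.Rate(240)  # match PyBullet's default 240 Hz time step

    while not rospy.is_shutdown():
        pb.stepSimulation()

        # Publish the current joint positions/velocities/torques to ROS.
        msg = JointState()
        msg.header.stamp = rospy.Time.now()
        msg.name = joint_names
        states = [pb.getJointState(robot, i) for i in range(n_joints)]
        msg.position = [s[0] for s in states]
        msg.velocity = [s[1] for s in states]
        msg.effort = [s[3] for s in states]  # applied joint motor torque
        pub.publish(msg)

        rate.sleep()


if __name__ == "__main__":
    main()
```
The actual framework adds, among other things, hardware control hooks and HRI utilities on top of this kind of loop; see the repository above for the supported interfaces.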
Related papers
- Polaris: Open-ended Interactive Robotic Manipulation via Syn2Real Visual Grounding and Large Language Models [53.22792173053473]
We introduce an interactive robotic manipulation framework called Polaris.
Polaris integrates perception and interaction by utilizing GPT-4 alongside grounded vision models.
We propose a novel Synthetic-to-Real (Syn2Real) pose estimation pipeline.
arXiv Detail & Related papers (2024-08-15T06:40:38Z)
- ROS-LLM: A ROS framework for embodied AI with task feedback and structured reasoning [74.58666091522198]
We present a framework for intuitive robot programming by non-experts.
We leverage natural language prompts and contextual information from the Robot Operating System (ROS).
Our system integrates large language models (LLMs), enabling non-experts to articulate task requirements to the system through a chat interface.
arXiv Detail & Related papers (2024-06-28T08:28:38Z)
- RoboScript: Code Generation for Free-Form Manipulation Tasks across Real and Simulation [77.41969287400977]
This paper presents RoboScript, a platform for a deployable robot manipulation pipeline powered by code generation.
We also present a benchmark for code generation for robot manipulation tasks specified in free-form natural language.
We demonstrate the adaptability of our code generation framework across multiple robot embodiments, including the Franka and UR5 robot arms.
arXiv Detail & Related papers (2024-02-22T15:12:00Z)
- HomeRobot: Open-Vocabulary Mobile Manipulation [107.05702777141178]
Open-Vocabulary Mobile Manipulation (OVMM) is the problem of picking any object in any unseen environment, and placing it in a commanded location.
HomeRobot has two components: a simulation component, which uses a large and diverse curated object set in new, high-quality multi-room home environments; and a real-world component, providing a software stack for the low-cost Hello Robot Stretch.
arXiv Detail & Related papers (2023-06-20T14:30:32Z)
- Factory: Fast Contact for Robotic Assembly [29.948128168543114]
Factory is a set of physics simulation methods and robot learning tools.
We achieve real-time or faster simulation of a wide range of contact-rich scenes.
We provide 60 carefully designed part models, 3 robotic assembly environments, and 7 robot controllers for training and testing virtual robots.
arXiv Detail & Related papers (2022-05-07T03:27:30Z)
- Model Predictive Control for Fluid Human-to-Robot Handovers [50.72520769938633]
Motion planning that takes human comfort into account is typically not part of the human-robot handover process.
We propose to generate smooth motions via an efficient model-predictive control framework.
We conduct human-to-robot handover experiments on a diverse set of objects with several users.
arXiv Detail & Related papers (2022-03-31T23:08:20Z)
- Open-VICO: An Open-Source Gazebo Toolkit for Multi-Camera-based Skeleton Tracking in Human-Robot Collaboration [0.0]
This work presents Open-VICO, an open-source toolkit to integrate virtual human models in Gazebo.
In particular, Open-VICO allows combining realistic human kinematic models, multi-camera vision setups, and human-tracking techniques in the same simulation environment.
arXiv Detail & Related papers (2022-03-28T13:21:32Z)
- A Framework for Learning Predator-prey Agents from Simulation to Real World [0.0]
We propose an evolutionary predator-prey robot system that can be implemented generally from simulation to the real world.
Both the predators and prey are co-evolved by NeuroEvolution of Augmenting Topologies (NEAT) to learn the expected behaviours.
For the convenience of users, the source code and videos of the simulated and real-world experiments are published on GitHub.
arXiv Detail & Related papers (2020-10-29T17:33:38Z)
- robo-gym -- An Open Source Toolkit for Distributed Deep Reinforcement Learning on Real and Simulated Robots [0.5161531917413708]
We propose robo-gym, an open-source toolkit to increase the use of deep reinforcement learning with real robots.
We demonstrate a unified setup for simulation and real environments which enables a seamless transfer from training in simulation to application on the robot.
We showcase the capabilities and the effectiveness of the framework with two real world applications featuring industrial robots.
arXiv Detail & Related papers (2020-07-06T13:51:33Z)
- SAPIEN: A SimulAted Part-based Interactive ENvironment [77.4739790629284]
SAPIEN is a realistic and physics-rich simulated environment that hosts a large-scale set of articulated objects.
We evaluate state-of-the-art vision algorithms for part detection and motion attribute recognition as well as demonstrate robotic interaction tasks.
arXiv Detail & Related papers (2020-03-19T00:11:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.