Factory: Fast Contact for Robotic Assembly
- URL: http://arxiv.org/abs/2205.03532v1
- Date: Sat, 7 May 2022 03:27:30 GMT
- Title: Factory: Fast Contact for Robotic Assembly
- Authors: Yashraj Narang, Kier Storey, Iretiayo Akinola, Miles Macklin, Philipp
Reist, Lukasz Wawrzyniak, Yunrong Guo, Adam Moravanszky, Gavriel State,
Michelle Lu, Ankur Handa, Dieter Fox
- Abstract summary: Factory is a set of physics simulation methods and robot learning tools.
We achieve real-time or faster simulation of a wide range of contact-rich scenes.
We provide 60 carefully-designed part models, 3 robotic assembly environments, and 7 robot controllers for training and testing virtual robots.
- Score: 29.948128168543114
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Robotic assembly is one of the oldest and most challenging applications of
robotics. In other areas of robotics, such as perception and grasping,
simulation has rapidly accelerated research progress, particularly when
combined with modern deep learning. However, accurately, efficiently, and
robustly simulating the range of contact-rich interactions in assembly remains
a longstanding challenge. In this work, we present Factory, a set of physics
simulation methods and robot learning tools for such applications. We achieve
real-time or faster simulation of a wide range of contact-rich scenes,
including simultaneous simulation of 1000 nut-and-bolt interactions. We provide
60 carefully-designed part models, 3 robotic assembly environments, and 7
robot controllers for training and testing virtual robots. Finally, we train
and evaluate proof-of-concept reinforcement learning policies for nut-and-bolt
assembly. We aim for Factory to open the doors to using simulation for robotic
assembly, as well as many other contact-rich applications in robotics. Please
see https://sites.google.com/nvidia.com/factory for supplementary content,
including videos.
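As a rough illustration of the kind of training setup the abstract describes (many contact-rich assembly scenes simulated in parallel, with reinforcement-learning policies trained on top), the sketch below shows what a rollout loop over 1000 parallel nut-and-bolt environments might look like. The NutBoltEnv class, its observation/action sizes, and the toy policy are hypothetical placeholders for this sketch, not Factory's actual API.

    # Minimal sketch of a rollout over many parallel assembly environments,
    # in the spirit of the 1000 simultaneous nut-and-bolt scenes mentioned
    # above. NutBoltEnv and all sizes are hypothetical placeholders, not
    # Factory's actual API.
    import torch

    class NutBoltEnv:
        """Stand-in for a GPU-parallel contact-rich assembly environment."""
        def __init__(self, num_envs: int, device: str):
            self.num_envs, self.device = num_envs, device
            self.obs_dim, self.act_dim = 32, 12  # placeholder sizes

        def reset(self) -> torch.Tensor:
            return torch.zeros(self.num_envs, self.obs_dim, device=self.device)

        def step(self, actions: torch.Tensor):
            # A real environment would step the physics and compute rewards;
            # zeros keep this sketch runnable without a simulator.
            obs = torch.zeros(self.num_envs, self.obs_dim, device=self.device)
            reward = torch.zeros(self.num_envs, device=self.device)
            done = torch.zeros(self.num_envs, dtype=torch.bool, device=self.device)
            return obs, reward, done

    device = "cuda" if torch.cuda.is_available() else "cpu"
    env = NutBoltEnv(num_envs=1000, device=device)
    policy = torch.nn.Sequential(  # toy policy; real training would use e.g. PPO
        torch.nn.Linear(32, 256), torch.nn.ELU(), torch.nn.Linear(256, 12),
    ).to(device)

    obs = env.reset()
    for _ in range(8):  # short random-weight rollout
        with torch.no_grad():
            actions = policy(obs)
        obs, reward, done = env.step(actions)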
Related papers
- Unifying 3D Representation and Control of Diverse Robots with a Single Camera [48.279199537720714]
We introduce Neural Jacobian Fields, an architecture that autonomously learns to model and control robots from vision alone.
Our approach achieves accurate closed-loop control and recovers the causal dynamic structure of each robot.
arXiv Detail & Related papers (2024-07-11T17:55:49Z)
- RoboCasa: Large-Scale Simulation of Everyday Tasks for Generalist Robots [25.650235551519952]
We present RoboCasa, a large-scale simulation framework for training generalist robots in everyday environments.
We provide thousands of 3D assets across over 150 object categories and dozens of interactable furniture and appliances.
Our experiments show a clear scaling trend in using synthetically generated robot data for large-scale imitation learning.
arXiv Detail & Related papers (2024-06-04T17:41:31Z)
- RoboScript: Code Generation for Free-Form Manipulation Tasks across Real and Simulation [77.41969287400977]
This paper presents RobotScript, a platform for a deployable robot manipulation pipeline powered by code generation.
We also present a benchmark for code generation for robot manipulation tasks specified in free-form natural language.
We demonstrate the adaptability of our code generation framework across multiple robot embodiments, including the Franka and UR5 robot arms.
arXiv Detail & Related papers (2024-02-22T15:12:00Z)
- Dynamic Handover: Throw and Catch with Bimanual Hands [30.206469112964033]
We design a system with two multi-finger hands attached to robot arms to solve this throw-and-catch problem.
We train our system using Multi-Agent Reinforcement Learning in simulation and perform Sim2Real transfer to deploy on the real robots.
To overcome the Sim2Real gap, we provide multiple novel algorithm designs including learning a trajectory prediction model for the object.
arXiv Detail & Related papers (2023-09-11T17:49:25Z)
- HomeRobot: Open-Vocabulary Mobile Manipulation [107.05702777141178]
Open-Vocabulary Mobile Manipulation (OVMM) is the problem of picking any object in any unseen environment, and placing it in a commanded location.
HomeRobot has two components: a simulation component, which uses a large and diverse curated object set in new, high-quality multi-room home environments; and a real-world component, providing a software stack for the low-cost Hello Robot Stretch.
arXiv Detail & Related papers (2023-06-20T14:30:32Z)
- ROS-PyBullet Interface: A Framework for Reliable Contact Simulation and Human-Robot Interaction [17.093672006793984]
We present the ROS-PyBullet Interface, a framework that provides a bridge between the reliable contact/impact simulator PyBullet and the Robot Operating System (ROS).
Furthermore, we provide additional utilities for facilitating Human-Robot Interaction (HRI) in the simulated environment.
arXiv Detail & Related papers (2022-10-13T10:31:36Z)
- GenLoco: Generalized Locomotion Controllers for Quadrupedal Robots [87.32145104894754]
We introduce a framework for training generalized locomotion (GenLoco) controllers for quadrupedal robots.
Our framework synthesizes general-purpose locomotion controllers that can be deployed on a large variety of quadrupedal robots.
We show that our models acquire more general control strategies that can be directly transferred to novel simulated and real-world robots.
arXiv Detail & Related papers (2022-09-12T15:14:32Z)
- V-MAO: Generative Modeling for Multi-Arm Manipulation of Articulated Objects [51.79035249464852]
We present a framework for learning multi-arm manipulation of articulated objects.
Our framework includes a variational generative model that learns contact point distribution over object rigid parts for each robot arm.
arXiv Detail & Related papers (2021-11-07T02:31:09Z)
- robo-gym -- An Open Source Toolkit for Distributed Deep Reinforcement Learning on Real and Simulated Robots [0.5161531917413708]
We propose robo-gym, an open-source toolkit that aims to increase the use of Deep Reinforcement Learning with real robots.
We demonstrate a unified setup for simulation and real environments which enables a seamless transfer from training in simulation to application on the robot.
We showcase the capabilities and the effectiveness of the framework with two real world applications featuring industrial robots.
arXiv Detail & Related papers (2020-07-06T13:51:33Z)
- SAPIEN: A SimulAted Part-based Interactive ENvironment [77.4739790629284]
SAPIEN is a realistic and physics-rich simulated environment that hosts a large-scale set of articulated objects.
We evaluate state-of-the-art vision algorithms for part detection and motion attribute recognition as well as demonstrate robotic interaction tasks.
arXiv Detail & Related papers (2020-03-19T00:11:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.