MimicGen: A Data Generation System for Scalable Robot Learning using
Human Demonstrations
- URL: http://arxiv.org/abs/2310.17596v1
- Date: Thu, 26 Oct 2023 17:17:31 GMT
- Authors: Ajay Mandlekar, Soroush Nasiriany, Bowen Wen, Iretiayo Akinola,
Yashraj Narang, Linxi Fan, Yuke Zhu, Dieter Fox
- Abstract summary: MimicGen is a system for automatically synthesizing large-scale, rich datasets from only a small number of human demonstrations.
We show that robot agents can be effectively trained on this generated dataset by imitation learning to achieve strong performance in long-horizon and high-precision tasks.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Imitation learning from a large set of human demonstrations has proved to be
an effective paradigm for building capable robot agents. However, the
demonstrations can be extremely costly and time-consuming to collect. We
introduce MimicGen, a system for automatically synthesizing large-scale, rich
datasets from only a small number of human demonstrations by adapting them to
new contexts. We use MimicGen to generate over 50K demonstrations across 18
tasks with diverse scene configurations, object instances, and robot arms from
just ~200 human demonstrations. We show that robot agents can be effectively
trained on this generated dataset by imitation learning to achieve strong
performance in long-horizon and high-precision tasks, such as multi-part
assembly and coffee preparation, across broad initial state distributions. We
further demonstrate that the effectiveness and utility of MimicGen data compare
favorably to collecting additional human demonstrations, making it a powerful
and economical approach towards scaling up robot learning. Datasets, simulation
environments, videos, and more at https://mimicgen.github.io.
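
To make the core adaptation step concrete, here is a minimal sketch of how a demo segment recorded relative to one object pose can be retargeted to a new scene. It illustrates the general object-frame transform idea under assumed 4x4 homogeneous pose matrices; the function names are hypothetical, and the actual MimicGen pipeline additionally interpolates between segments and keeps only rollouts that complete the task.

```python
# Hypothetical sketch of object-centric demo adaptation; not the
# actual MimicGen API. Poses are 4x4 homogeneous matrices.
import numpy as np

def transform_segment(src_eef_poses, src_obj_pose, new_obj_pose):
    """Retarget end-effector poses recorded against an object at
    src_obj_pose so they apply to the same object at new_obj_pose."""
    # Express each pose in the source object's frame, then map it
    # into the world frame implied by the new object pose.
    src_obj_inv = np.linalg.inv(src_obj_pose)
    return [new_obj_pose @ src_obj_inv @ T for T in src_eef_poses]

def generate_demo(segments, scene):
    """Stitch retargeted segments into one candidate trajectory.

    segments: (eef_poses, src_obj_pose, obj_name) tuples from one
        human demo, split into object-centric subtasks.
    scene: dict mapping object names to their poses in the new scene.
    """
    trajectory = []
    for eef_poses, src_obj_pose, obj_name in segments:
        trajectory += transform_segment(eef_poses, src_obj_pose,
                                        scene[obj_name])
    # A real system would execute this and keep it only on success.
    return trajectory
```

Because the transform depends only on relative poses, one source demo can seed many generated demos across new scene configurations, which is how a few hundred human demos can be amplified into tens of thousands.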
Related papers
- DexMimicGen: Automated Data Generation for Bimanual Dexterous Manipulation via Imitation Learning
We present a large-scale automated data generation system that synthesizes trajectories from human demonstrations for humanoid robots with dexterous hands.
We generate 21K demos across a suite of simulated tasks from just 60 source human demos.
We also present a real-to-sim-to-real pipeline and deploy it on a real-world humanoid can sorting task.
arXiv Detail & Related papers (2024-10-31T17:48:45Z)
- SkillMimicGen: Automated Demonstration Generation for Efficient Skill Learning and Deployment
We propose SkillMimicGen, an automated system for generating demonstration datasets from a few human demos.
SkillGen segments human demos into manipulation skills, adapts these skills to new contexts, and stitches them together through free-space transit and transfer motion.
We demonstrate the efficacy of SkillGen by generating over 24K demonstrations across 18 task variants in simulation from just 60 human demonstrations.
arXiv Detail & Related papers (2024-10-24T16:59:26Z)
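
The segment-and-stitch recipe above lends itself to a short sketch. The segmentation heuristic (splitting at gripper open/close events) and the planner hook below are illustrative stand-ins, not the SkillMimicGen implementation; demos are assumed to be logged as dicts with eef_pose and gripper fields.

```python
# Hedged sketch of segmenting a demo into skills and stitching the
# adapted skills with free-space transit; all names are hypothetical.

def segment_by_gripper(demo):
    """Split a demo into segments at gripper open/close events, a
    simple stand-in for manipulation-skill segmentation."""
    segments, current = [], [demo[0]]
    for prev, step in zip(demo, demo[1:]):
        if step["gripper"] != prev["gripper"]:  # grasp state changed
            segments.append(current)
            current = []
        current.append(step)
    segments.append(current)
    return segments

def stitch(skill_segments, plan_free_space_motion):
    """Connect skill segments with collision-free transit motions.
    plan_free_space_motion(start_pose, goal_pose) stands in for a
    motion planner returning a list of intermediate steps."""
    trajectory = list(skill_segments[0])
    for seg in skill_segments[1:]:
        transit = plan_free_space_motion(trajectory[-1]["eef_pose"],
                                         seg[0]["eef_pose"])
        trajectory += transit + list(seg)
    return trajectory
```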
- RoboCasa: Large-Scale Simulation of Everyday Tasks for Generalist Robots
We present RoboCasa, a large-scale simulation framework for training generalist robots in everyday environments.
We provide thousands of 3D assets across over 150 object categories and dozens of interactable furniture and appliances.
Our experiments show a clear scaling trend in using synthetically generated robot data for large-scale imitation learning.
arXiv Detail & Related papers (2024-06-04T17:41:31Z)
- DiffGen: Robot Demonstration Generation via Differentiable Physics Simulation, Differentiable Rendering, and Vision-Language Model
DiffGen is a novel framework that integrates differentiable physics simulation, differentiable rendering, and a vision-language model.
It can generate realistic robot demonstrations by minimizing the distance between the embedding of the language instruction and the embedding of the simulated observation.
Experiments demonstrate that DiffGen can efficiently and effectively generate robot data with minimal human effort or training time.
arXiv Detail & Related papers (2024-05-12T15:38:17Z)
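
DiffGen's objective, matching a rendered rollout to a language instruction in a shared embedding space, can be sketched as a gradient-based loop. Everything below is a stand-in under stated assumptions: simulate, render, and image_encoder represent a differentiable physics engine, a differentiable renderer, and a CLIP-style vision-language encoder; none of these names come from the paper.

```python
# Hedged sketch of optimizing actions so the rendered rollout matches
# a language instruction embedding; stand-in components, not DiffGen's
# actual code.
import torch
import torch.nn.functional as F

def optimize_demo(actions, text_embed, simulate, render, image_encoder,
                  steps=200, lr=1e-2):
    """Gradient descent on an action sequence through a differentiable
    simulator and renderer, guided by a vision-language embedding."""
    actions = actions.detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([actions], lr=lr)
    for _ in range(steps):
        state = simulate(actions)         # differentiable physics
        image = render(state)             # differentiable rendering
        img_embed = image_encoder(image)  # VLM observation embedding
        # Distance between instruction and observation embeddings.
        loss = 1.0 - F.cosine_similarity(img_embed, text_embed,
                                         dim=-1).mean()
        opt.zero_grad()
        loss.backward()   # gradients flow back through renderer & sim
        opt.step()
    return actions.detach()
```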
- AdaDemo: Data-Efficient Demonstration Expansion for Generalist Robotic Agent
In this study, we aim to scale up demonstrations in a data-efficient way to facilitate the learning of generalist robotic agents.
AdaDemo is a framework designed to improve multi-task policy learning by actively and continually expanding the demonstration dataset.
arXiv Detail & Related papers (2024-04-11T01:59:29Z)
- RoboGen: Towards Unleashing Infinite Data for Automated Robot Learning via Generative Simulation
RoboGen is a generative robotic agent that automatically learns diverse robotic skills at scale via generative simulation.
Our work attempts to extract the extensive and versatile knowledge embedded in large-scale models and transfer it to the field of robotics.
arXiv Detail & Related papers (2023-11-02T17:59:21Z)
- Visual Imitation Made Easy
We present an alternate interface for imitation that simplifies the data collection process while allowing for easy transfer to robots.
We use a commercially available reacher-grabber assistive tool both as a data collection device and as the robot's end-effector.
We experimentally evaluate on two challenging tasks: non-prehensile pushing and prehensile stacking, with 1000 diverse demonstrations for each task.
arXiv Detail & Related papers (2020-08-11T17:58:50Z)
- Learning Predictive Models From Observation and Interaction
Learning predictive models from interaction with the world allows an agent, such as a robot, to learn about how the world works.
However, learning a model that captures the dynamics of complex skills represents a major challenge.
We propose a method to augment the training set with observational data of other agents, such as humans.
arXiv Detail & Related papers (2019-12-30T01:10:41Z)