Lyceum: An efficient and scalable ecosystem for robot learning
- URL: http://arxiv.org/abs/2001.07343v1
- Date: Tue, 21 Jan 2020 05:03:04 GMT
- Title: Lyceum: An efficient and scalable ecosystem for robot learning
- Authors: Colin Summers, Kendall Lowrey, Aravind Rajeswaran, Siddhartha
Srinivasa, Emanuel Todorov
- Abstract summary: Lyceum is a high-performance computational ecosystem for robot learning.
It is built on top of the Julia programming language and the MuJoCo physics simulator.
It is 5-30x faster than other popular abstractions like OpenAI's Gym and DeepMind's dm-control.
- Score: 11.859894139914754
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Lyceum, a high-performance computational ecosystem for robot
learning. Lyceum is built on top of the Julia programming language and the
MuJoCo physics simulator, combining the ease-of-use of a high-level programming
language with the performance of native C. In addition, Lyceum has a
straightforward API to support parallel computation across multiple cores and
machines. Overall, depending on the complexity of the environment, Lyceum is
5-30x faster compared to other popular abstractions like OpenAI's Gym and
DeepMind's dm-control. This substantially reduces training time for various
reinforcement learning algorithms and is also fast enough to support real-time
model predictive control through MuJoCo. The code, tutorials, and demonstration
videos can be found at: www.lyceum.ml.
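The abstract's "straightforward API to support parallel computation across multiple cores" describes a map-over-rollouts pattern. Below is a minimal, language-agnostic sketch of that pattern in Python; Lyceum itself exposes this in Julia, and none of the names here (`rollout`, the toy dynamics) are Lyceum's actual API. A thread pool is used for brevity; process- and machine-level parallelism follow the same map shape.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def rollout(seed):
    """One hypothetical episode: toy linear dynamics over a fixed horizon.
    (Stand-in for a MuJoCo rollout; not Lyceum's actual API.)"""
    rng = random.Random(seed)
    state, total_reward = 0.0, 0.0
    for _ in range(100):
        action = rng.uniform(-1.0, 1.0)
        state = 0.9 * state + action      # toy dynamics
        total_reward += -state * state    # reward for staying near zero
    return total_reward

# Map independent rollouts over a worker pool, one seed per episode.
with ThreadPoolExecutor(max_workers=4) as pool:
    returns = list(pool.map(rollout, range(8)))

print(len(returns))
```

Because episodes share no state, the same `map` call scales from one core to a cluster; only the executor changes.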
Related papers
- Enabling High-Sparsity Foundational Llama Models with Efficient Pretraining and Deployment [56.44025052765861]
Large language models (LLMs) have revolutionized Natural Language Processing (NLP), but their size creates computational bottlenecks.
We introduce a novel approach to create accurate, sparse foundational versions of performant LLMs.
We show a total speedup on CPUs for sparse-quantized LLaMA models of up to 8.6x.
arXiv Detail & Related papers (2024-05-06T16:03:32Z)
- YAMLE: Yet Another Machine Learning Environment [4.985768723667417]
YAMLE is an open-source framework that facilitates rapid prototyping and experimentation with machine learning (ML) models and methods.
YAMLE includes a command-line interface and integrations with popular and well-maintained PyTorch-based libraries.
The ambition for YAMLE is to grow into a shared ecosystem where researchers and practitioners can quickly build on and compare existing implementations.
arXiv Detail & Related papers (2024-02-09T09:34:36Z)
- JaxMARL: Multi-Agent RL Environments and Algorithms in JAX [105.343918678781]
We present JaxMARL, the first open-source, Python-based library that combines GPU-enabled efficiency with support for a large number of commonly used MARL environments.
Our experiments show that, in terms of wall clock time, our JAX-based training pipeline is around 14 times faster than existing approaches.
We also introduce and benchmark SMAX, a JAX-based approximate reimplementation of the popular StarCraft Multi-Agent Challenge.
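The wall-clock speedups reported for JAX-based pipelines come largely from stepping whole batches of environments in one vectorized call. A minimal NumPy sketch of that batched-stepping idea, with toy dynamics that are in no way JaxMARL's actual API:

```python
import numpy as np

def step_batch(states, actions):
    """Advance a whole batch of toy environments in one vectorized call.
    (Illustrates batched stepping; toy dynamics, not JaxMARL's API.)"""
    next_states = 0.9 * states + actions          # same dynamics per env
    rewards = -(next_states ** 2).sum(axis=-1)    # one reward per env
    return next_states, rewards

rng = np.random.default_rng(0)
states = np.zeros((1024, 3))                      # 1024 envs, 3-dim state
for _ in range(10):
    actions = rng.uniform(-1.0, 1.0, size=states.shape)
    states, rewards = step_batch(states, actions)

print(rewards.shape)
```

On a GPU, a JAX `vmap`/`jit` version of the same function advances thousands of environments per call, which is where the reported ~14x training speedup originates.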
arXiv Detail & Related papers (2023-11-16T18:58:43Z)
- Accelerate Multi-Agent Reinforcement Learning in Zero-Sum Games with
Subgame Curriculum Learning [65.36326734799587]
We present a novel subgame curriculum learning framework for zero-sum games.
It adopts an adaptive initial state distribution by resetting agents to some previously visited states.
We derive a subgame selection metric that approximates the squared distance to NE values.
arXiv Detail & Related papers (2023-10-07T13:09:37Z)
- CoLA: Exploiting Compositional Structure for Automatic and Efficient
Numerical Linear Algebra [62.37017125812101]
We propose a simple but general framework for large-scale linear algebra problems in machine learning, named CoLA.
By combining a linear operator abstraction with compositional dispatch rules, CoLA automatically constructs memory and runtime efficient numerical algorithms.
We showcase its efficacy across a broad range of applications, including partial differential equations, Gaussian processes, equivariant model construction, and unsupervised learning.
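The "linear operator abstraction with compositional dispatch rules" can be sketched in a few lines: when an operator's structure is known, a specialized routine is dispatched instead of a dense one. This toy Python sketch is not CoLA's actual API; the class and method names are made up for illustration.

```python
import numpy as np

class Diagonal:
    """Linear operator with known diagonal structure."""
    def __init__(self, d):
        self.d = np.asarray(d, dtype=float)
    def matvec(self, x):
        return self.d * x
    def solve(self, b):
        return b / self.d          # O(n): structure makes inversion cheap

class Product:
    """Composition A @ B of two operators."""
    def __init__(self, A, B):
        self.A, self.B = A, B
    def matvec(self, x):
        return self.A.matvec(self.B.matvec(x))
    def solve(self, b):
        # Dispatch rule: (A B)^{-1} b = B^{-1} (A^{-1} b),
        # so a composition's solve reduces to its parts' solves.
        return self.B.solve(self.A.solve(b))

A = Diagonal([2.0, 4.0])
B = Diagonal([1.0, 0.5])
x = Product(A, B).solve(np.array([2.0, 2.0]))
print(x)   # solves (A @ B) x = b using only cheap diagonal solves
```

Adding a new structure (circulant, Kronecker, low-rank) means adding one class with its own `solve`; every composition containing it gets the fast path automatically.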
arXiv Detail & Related papers (2023-09-06T14:59:38Z)
- CaiRL: A High-Performance Reinforcement Learning Environment Toolkit [9.432068833600884]
CaiRL Environment Toolkit is an efficient, compatible, and more sustainable alternative for training learning agents.
We demonstrate the effectiveness of CaiRL in the classic control benchmark, comparing the execution speed to OpenAI Gym.
arXiv Detail & Related papers (2022-10-03T21:24:04Z)
- VRKitchen2.0-IndoorKit: A Tutorial for Augmented Indoor Scene Building
in Omniverse [77.52012928882928]
INDOORKIT is a built-in toolkit for NVIDIA OMNIVERSE.
It provides flexible pipelines for indoor scene building, scene randomizing, and animation controls.
arXiv Detail & Related papers (2022-06-23T17:53:33Z)
- Lettuce: PyTorch-based Lattice Boltzmann Framework [0.0]
The lattice Boltzmann method (LBM) is an efficient simulation technique for computational fluid mechanics and beyond.
Here, we introduce Lettuce, a PyTorch-based LBM code with a threefold aim.
arXiv Detail & Related papers (2021-06-24T11:44:21Z)
- Extending Python for Quantum-Classical Computing via Quantum
Just-in-Time Compilation [78.8942067357231]
Python is a popular programming language known for its flexibility, usability, readability, and focus on developer productivity.
We present a language extension to Python that enables heterogeneous quantum-classical computing via a robust C++ infrastructure for quantum just-in-time compilation.
arXiv Detail & Related papers (2021-05-10T21:11:21Z)
- Accelerating GMRES with Deep Learning in Real-Time [0.0]
We show a real-time machine learning algorithm that can be used to accelerate the time-to-solution for GMRES.
Our framework is novel in that it integrates the deep learning algorithm in an in situ fashion.
arXiv Detail & Related papers (2021-03-19T18:21:38Z)
- Julia Language in Machine Learning: Algorithms, Applications, and Open
Issues [5.666843255747851]
Machine learning is driving development across many fields in science and engineering.
Currently, the programming languages most commonly used to develop machine learning algorithms include Python and C/C++.
This paper summarizes the related research work and developments in the application of the Julia language in machine learning.
arXiv Detail & Related papers (2020-03-23T09:31:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.