Advances in Transformers for Robotic Applications: A Review
- URL: http://arxiv.org/abs/2412.10599v1
- Date: Fri, 13 Dec 2024 23:02:15 GMT
- Title: Advances in Transformers for Robotic Applications: A Review
- Authors: Nikunj Sanghai, Nik Bear Brown
- Abstract summary: We go through recent advances and trends in Transformers in Robotics.
We examine their integration into robotic perception, planning, and control for autonomous systems.
We discuss how different Transformer variants are being adapted in robotics for reliable planning and perception.
- Abstract: The introduction of the Transformer architecture has brought about significant breakthroughs in Deep Learning (DL), particularly within Natural Language Processing (NLP). Since their inception, Transformers have outperformed many traditional neural network architectures due to their "self-attention" mechanism and their scalability across various applications. In this paper, we cover the use of Transformers in Robotics. We go through recent advances and trends in Transformer architectures and examine their integration into robotic perception, planning, and control for autonomous systems. Furthermore, we review past work and recent research on the use of Transformers in Robotics as pre-trained foundation models and the integration of Transformers with Deep Reinforcement Learning (DRL) for autonomous systems. We discuss how different Transformer variants are being adapted in robotics for reliable planning and perception, improving human-robot interaction, long-horizon decision-making, and generalization. Finally, we address limitations and challenges, offering insights and suggestions for future research directions.
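As a rough illustration of the "self-attention" mechanism the abstract credits for the Transformer's success, the following minimal NumPy sketch computes single-head scaled dot-product attention. Variable names, dimensions, and the random projections are illustrative, not taken from any paper discussed here:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings.

    X:          (seq_len, d_model) input embeddings
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # each token: weighted mix of values

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))                       # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because every token attends to every other token in one matrix product, the computation parallelizes across the sequence, which is the scalability property the abstract highlights.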
Related papers
- Solving Multi-Goal Robotic Tasks with Decision Transformer [0.0]
We introduce a novel adaptation of the decision transformer architecture for offline multi-goal reinforcement learning in robotics.
Our approach integrates goal-specific information into the decision transformer, allowing it to handle complex tasks in an offline setting.
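A minimal sketch of what "integrating goal-specific information" into a decision-transformer input sequence could look like, assuming the common (return-to-go, state, action) token layout with a goal token prepended. The function name and token layout are hypothetical illustrations, not the paper's actual implementation:

```python
import numpy as np

def build_goal_conditioned_tokens(states, actions, returns_to_go, goal):
    """Prepend a goal token, then interleave (return-to-go, state, action) triples,
    mimicking a goal-conditioned decision-transformer input sequence.

    states:        (T, d) array of observations
    actions:       (T, d) array of actions (padded to the state dim for simplicity)
    returns_to_go: (T,)   scalar returns remaining at each step
    goal:          (d,)   target the policy should reach
    """
    T, d = states.shape
    tokens = [goal]                                   # goal token conditions the sequence
    for t in range(T):
        tokens.append(np.full(d, returns_to_go[t]))   # return-to-go token
        tokens.append(states[t])                      # state token
        tokens.append(actions[t])                     # action token
    return np.stack(tokens)                           # (1 + 3T, d)

seq = build_goal_conditioned_tokens(
    states=np.zeros((5, 8)), actions=np.zeros((5, 8)),
    returns_to_go=np.linspace(10, 0, 5), goal=np.ones(8))
print(seq.shape)  # (16, 8)
```

In an offline setting, sequences like this are assembled from a fixed dataset and the transformer is trained to predict the action tokens conditioned on everything before them.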
arXiv Detail & Related papers (2024-10-08T20:35:30Z)
- Body Transformer: Leveraging Robot Embodiment for Policy Learning [51.531793239586165]
Body Transformer (BoT) is an architecture that leverages the robot embodiment by providing an inductive bias that guides the learning process.
We represent the robot body as a graph of sensors and actuators, and rely on masked attention to pool information throughout the architecture.
The resulting architecture outperforms the vanilla transformer, as well as the classical multilayer perceptron, in terms of task completion, scaling properties, and computational efficiency.
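The masked-attention pooling described above can be sketched as self-attention restricted by the body graph's adjacency: each sensor/actuator node attends only to itself and its neighbours. Projections are omitted for brevity, and the three-link chain is an illustrative stand-in, not the paper's architecture:

```python
import numpy as np

def masked_self_attention(X, adjacency):
    """Self-attention where each body part attends only to itself and its
    graph neighbours (a rough sketch of attention masked by a robot's
    sensor/actuator graph; learned projections omitted).

    X:         (n_parts, d) per-part feature vectors
    adjacency: (n_parts, n_parts) boolean matrix; True = edge (incl. self-loops)
    """
    scores = X @ X.T / np.sqrt(X.shape[-1])
    scores = np.where(adjacency, scores, -np.inf)     # block non-neighbour attention
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over neighbours only
    return weights @ X

# A 3-link chain: torso(0) - arm(1) - gripper(2), with self-loops on the diagonal.
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]], dtype=bool)
out = masked_self_attention(np.eye(3), adj)
print(out.shape)  # (3, 3)
```

The mask is the inductive bias: information propagates along the kinematic structure rather than between arbitrary pairs of body parts, which is what distinguishes BoT from a vanilla transformer here.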
arXiv Detail & Related papers (2024-08-12T17:31:28Z)
- The Progression of Transformers from Language to Vision to MOT: A Literature Review on Multi-Object Tracking with Transformers [0.0]
The transformer neural network architecture allows for autoregressive sequence-to-sequence modeling.
Transformers have also been applied across a wide variety of pattern recognition tasks, particularly in computer vision.
arXiv Detail & Related papers (2024-06-24T16:45:28Z)
- RoboScript: Code Generation for Free-Form Manipulation Tasks across Real and Simulation [77.41969287400977]
This paper presents RobotScript, a platform for a deployable robot manipulation pipeline powered by code generation.
We also present a benchmark for code generation for robot manipulation tasks specified in free-form natural language.
We demonstrate the adaptability of our code generation framework across multiple robot embodiments, including the Franka and UR5 robot arms.
arXiv Detail & Related papers (2024-02-22T15:12:00Z)
- SARA-RT: Scaling up Robotics Transformers with Self-Adaptive Robust Attention [34.276999297736346]
Self-Adaptive Robust Attention for Robotics Transformers (SARA-RT) is a new paradigm for addressing the emerging challenge of scaling up Robotics Transformers (RT) for on-robot deployment.
arXiv Detail & Related papers (2023-12-04T16:08:47Z)
- Introduction to Transformers: an NLP Perspective [59.0241868728732]
We introduce basic concepts of Transformers and present key techniques that form the recent advances of these models.
This includes a description of the standard Transformer architecture, a series of model refinements, and common applications.
arXiv Detail & Related papers (2023-11-29T13:51:04Z)
- A Survey on Transformers in Reinforcement Learning [66.23773284875843]
The Transformer has been considered the dominant neural architecture in NLP and CV, mostly under supervised settings.
Recently, a similar surge of using Transformers has appeared in the domain of reinforcement learning (RL), but it is faced with unique design choices and challenges brought by the nature of RL.
This paper systematically reviews motivations and progress on using Transformers in RL, provides a taxonomy of existing works, discusses each sub-field, and summarizes future prospects.
arXiv Detail & Related papers (2023-01-08T14:04:26Z)
- MetaMorph: Learning Universal Controllers with Transformers [45.478223199658785]
In robotics, we primarily train a single robot for a single task.
Modular robot systems now allow for the flexible combination of general-purpose building blocks into task-optimized morphologies.
We propose MetaMorph, a Transformer-based approach to learning a universal controller over a modular robot design space.
arXiv Detail & Related papers (2022-03-22T17:58:31Z)
- Transformers in Vision: A Survey [101.07348618962111]
Transformers enable modeling long dependencies between input sequence elements and support parallel processing of sequences.
Transformers require minimal inductive biases for their design and are naturally suited as set-functions.
This survey aims to provide a comprehensive overview of the Transformer models in the computer vision discipline.
arXiv Detail & Related papers (2021-01-04T18:57:24Z)
- A Survey on Visual Transformer [126.56860258176324]
The Transformer is a type of deep neural network based mainly on the self-attention mechanism.
In this paper, we review these vision transformer models by categorizing them in different tasks and analyzing their advantages and disadvantages.
arXiv Detail & Related papers (2020-12-23T09:37:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences arising from its use.