Graph learning in robotics: a survey
- URL: http://arxiv.org/abs/2310.04294v1
- Date: Fri, 6 Oct 2023 14:52:25 GMT
- Title: Graph learning in robotics: a survey
- Authors: Francesca Pistilli and Giuseppe Averta
- Abstract summary: The paper covers the fundamentals of graph-based models, including their architecture, training procedures, and applications.
It also discusses recent advancements and challenges that arise in applied settings.
The paper provides an extensive review of various robotic applications that benefit from learning on graph structures.
- Score: 2.5726566614123874
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Deep neural networks for graphs have emerged as a powerful tool for learning
on complex non-Euclidean data, which is becoming increasingly common for a
variety of different applications. Yet, although their potential has been
widely recognised in the machine learning community, graph learning is largely
unexplored for downstream tasks such as robotics applications. Hence, to fully
unlock their potential, we propose a review of graph neural architectures from
a robotics perspective. The paper covers the fundamentals of graph-based
models, including their architecture, training procedures, and applications. It
also discusses recent advancements and challenges that arise in applied
settings, related for example to the integration of perception,
decision-making, and control. Finally, the paper provides an extensive review
of various robotic applications that benefit from learning on graph structures,
such as bodies and contacts modelling, robotic manipulation, action
recognition, fleet motion planning, and many more. This survey aims to provide
readers with a thorough understanding of the capabilities and limitations of
graph neural architectures in robotics, and to highlight potential avenues for
future research.
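The fundamentals the survey covers centre on neighbourhood aggregation, the core operation of graph neural architectures. As a hedged illustration (not code from the survey; all names such as `gnn_layer` are invented for this sketch), one message-passing layer over a small graph can be written as:

```python
import numpy as np

def gnn_layer(node_feats, adj, weight_self, weight_neigh):
    """One round of message passing: each node averages its neighbours'
    features, then combines them with its own via a linear map + ReLU."""
    agg = np.zeros_like(node_feats)
    for i in range(node_feats.shape[0]):
        neighbours = adj[i]
        if neighbours:  # isolated nodes keep a zero aggregate
            agg[i] = node_feats[neighbours].mean(axis=0)
    out = node_feats @ weight_self + agg @ weight_neigh
    return np.maximum(out, 0.0)  # ReLU nonlinearity

# Toy graph: three nodes in a path 0-1-2, one-hot node features.
adj = {0: [1], 1: [0, 2], 2: [1]}
x = np.eye(3)
rng = np.random.default_rng(0)
w_self = rng.normal(size=(3, 4))
w_neigh = rng.normal(size=(3, 4))
h = gnn_layer(x, adj, w_self, w_neigh)
print(h.shape)  # one 4-dimensional embedding per node
```

Stacking several such layers lets information propagate over multi-hop neighbourhoods, which is what makes the representation useful for relational robotics data such as body/contact graphs or robot fleets.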
Related papers
- A Survey of Embodied Learning for Object-Centric Robotic Manipulation [27.569063968870868]
Embodied learning for object-centric robotic manipulation is a rapidly developing and challenging area in AI.
Unlike data-driven machine learning methods, embodied learning focuses on robot learning through physical interaction with the environment.
arXiv Detail & Related papers (2024-08-21T11:32:09Z) - Machine Learning on Dynamic Graphs: A Survey on Applications [0.0]
We present a review of lesser-explored applications of dynamic graph learning.
This study revealed the potential of machine learning on dynamic graphs in addressing challenges across diverse domains.
arXiv Detail & Related papers (2024-01-16T06:40:24Z) - RH20T: A Comprehensive Robotic Dataset for Learning Diverse Skills in
One-Shot [56.130215236125224]
A key challenge in robotic manipulation in open domains is how to acquire diverse and generalizable skills for robots.
Recent research in one-shot imitation learning has shown promise in transferring trained policies to new tasks based on demonstrations.
This paper aims to unlock the potential for an agent to generalize to hundreds of real-world skills with multi-modal perception.
arXiv Detail & Related papers (2023-07-02T15:33:31Z) - State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z) - RT-1: Robotics Transformer for Real-World Control at Scale [98.09428483862165]
We present a model class, dubbed Robotics Transformer, that exhibits promising scalable model properties.
We verify our conclusions in a study of different model classes and their ability to generalize as a function of data size, model size, and data diversity, based on a large-scale dataset collected from real robots performing real-world tasks.
arXiv Detail & Related papers (2022-12-13T18:55:15Z) - OG-SGG: Ontology-Guided Scene Graph Generation. A Case Study in Transfer
Learning for Telepresence Robotics [124.08684545010664]
Scene graph generation from images is a task of great interest to applications such as robotics.
We propose an initial approximation to a framework called Ontology-Guided Scene Graph Generation (OG-SGG).
arXiv Detail & Related papers (2022-02-21T13:23:15Z) - Automated Graph Machine Learning: Approaches, Libraries, Benchmarks and Directions [58.220137936626315]
This paper extensively discusses automated graph machine learning approaches.
We introduce AutoGL, the first dedicated open-source library for automated graph machine learning.
Also, we describe a tailored benchmark that supports unified, reproducible, and efficient evaluations.
arXiv Detail & Related papers (2022-01-04T18:31:31Z) - An energy-based model for neuro-symbolic reasoning on knowledge graphs [0.0]
We propose an energy-based graph embedding algorithm to characterize industrial automation systems.
By combining knowledge from multiple domains, the learned model is capable of making context-aware predictions.
The presented model is mappable to a biologically-inspired neural architecture, serving as a first bridge between graph embedding methods and neuromorphic computing.
arXiv Detail & Related papers (2021-10-04T18:02:36Z) - Actionable Models: Unsupervised Offline Reinforcement Learning of
Robotic Skills [93.12417203541948]
We propose the objective of learning a functional understanding of the environment by learning to reach any goal state in a given dataset.
We find that our method can operate on high-dimensional camera images and learn a variety of skills on real robots that generalize to previously unseen scenes and objects.
arXiv Detail & Related papers (2021-04-15T20:10:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences arising from its use.