A novel integrated industrial approach with cobots in the age of
industry 4.0 through conversational interaction and computer vision
- URL: http://arxiv.org/abs/2402.10553v1
- Date: Fri, 16 Feb 2024 10:35:01 GMT
- Title: A novel integrated industrial approach with cobots in the age of
industry 4.0 through conversational interaction and computer vision
- Authors: Andrea Pazienza and Nicola Macchiarulo and Felice Vitulano and Antonio
Fiorentini and Marco Cammisa and Leonardo Rigutini and Ernesto Di Iorio and
Achille Globo and Antonio Trevisi
- Abstract summary: From robots that replace workers to robots that serve as helpful colleagues, the field of robotic automation is experiencing a new trend.
- Score: 1.2848575793946582
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: From robots that replace workers to robots that serve as helpful colleagues,
the field of robotic automation is experiencing a new trend that represents a
huge challenge for component manufacturers. The contribution starts from an
innovative vision of ever-closer collaboration between cobots, which can perform
specific physical jobs with precision; AI, which can analyze information and
support decision-making; and humans, who provide a strategic vision of the future.
Related papers
- $π_0$: A Vision-Language-Action Flow Model for General Robot Control [77.32743739202543]
We propose a novel flow matching architecture built on top of a pre-trained vision-language model (VLM) to inherit Internet-scale semantic knowledge.
We evaluate our model in terms of its ability to perform tasks zero-shot after pre-training, to follow language instructions from people, and to acquire new skills via fine-tuning.
arXiv Detail & Related papers (2024-10-31T17:22:30Z)
- HARMONIC: Cognitive and Control Collaboration in Human-Robotic Teams [0.0]
We demonstrate a cognitive strategy for robots in human-robot teams that incorporates metacognition, natural language communication, and explainability.
The system is embodied using the HARMONIC architecture that flexibly integrates cognitive and control capabilities.
arXiv Detail & Related papers (2024-09-26T16:48:21Z)
- Metarobotics for Industry and Society: Vision, Technologies, and Opportunities [0.0]
Metarobotics aims to combine next generation wireless communication, multi-sense immersion, and collective intelligence.
Students enrolled in robotics courses will be taught under authentic industrial conditions in real-time.
Potentials for self-determination, self-efficacy, and work-life-flexibility in robotics-related applications in Society 5.0, Industry 4.0, and Industry 5.0 are outlined.
arXiv Detail & Related papers (2024-03-31T20:59:58Z)
- Extended Reality for Enhanced Human-Robot Collaboration: a Human-in-the-Loop Approach [2.336967926255341]
Human-robot collaboration attempts to tackle these challenges by combining the strength and precision of machines with human ingenuity and perceptual understanding.
We propose an implementation framework for an autonomous, machine learning-based manipulator that incorporates human-in-the-loop principles.
The conceptual framework foresees human involvement directly in the robot learning process, resulting in higher adaptability and task generalization.
arXiv Detail & Related papers (2024-03-21T17:50:22Z)
- Giving Robots a Hand: Learning Generalizable Manipulation with Eye-in-Hand Human Video Demonstrations [66.47064743686953]
Eye-in-hand cameras have shown promise in enabling greater sample efficiency and generalization in vision-based robotic manipulation.
Videos of humans performing tasks, on the other hand, are much cheaper to collect since they eliminate the need for expertise in robotic teleoperation.
In this work, we augment narrow robotic imitation datasets with broad unlabeled human video demonstrations to greatly enhance the generalization of eye-in-hand visuomotor policies.
arXiv Detail & Related papers (2023-07-12T07:04:53Z)
- Human in the AI loop via xAI and Active Learning for Visual Inspection [2.261815118231329]
Industrial revolutions have disrupted manufacturing by introducing automation into production.
Advances in robotics and artificial intelligence open new frontiers of human-machine collaboration.
The present work first describes Industry 5.0, human-machine collaboration, and the state of the art in quality inspection.
arXiv Detail & Related papers (2023-07-03T17:23:23Z)
- HERD: Continuous Human-to-Robot Evolution for Learning from Human Demonstration [57.045140028275036]
We show that manipulation skills can be transferred from a human to a robot through the use of micro-evolutionary reinforcement learning.
We propose an algorithm for multi-dimensional evolution path searching that allows joint optimization of both the robot evolution path and the policy.
arXiv Detail & Related papers (2022-12-08T15:56:13Z)
- Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and Robotics Together [68.44697646919515]
This paper presents several human-robot systems that utilize spatial computing to enable novel robot use cases.
The combination of spatial computing and egocentric sensing on mixed reality devices enables them to capture and understand human actions and translate these to actions with spatial meaning.
arXiv Detail & Related papers (2022-02-03T10:04:26Z)
- An Embarrassingly Pragmatic Introduction to Vision-based Autonomous Robots [0.0]
We develop a small-scale autonomous vehicle capable of understanding the scene using only visual information.
We discuss the current state of Robotics and autonomous driving and the technological and ethical limitations that we can find in this field.
arXiv Detail & Related papers (2021-11-15T01:31:28Z)
- Show Me What You Can Do: Capability Calibration on Reachable Workspace for Human-Robot Collaboration [83.4081612443128]
We show that a short calibration using REMP can effectively bridge the gap between what a non-expert user thinks a robot can reach and the ground truth.
We show that this calibration procedure not only results in better user perception, but also promotes more efficient human-robot collaborations.
arXiv Detail & Related papers (2021-03-06T09:14:30Z)
- Joint Mind Modeling for Explanation Generation in Complex Human-Robot Collaborative Tasks [83.37025218216888]
We propose a novel explainable AI (XAI) framework for achieving human-like communication in human-robot collaborations.
The robot builds a hierarchical mind model of the human user and generates explanations of its own mind as a form of communication.
Results show that the explanations generated by our approach significantly improve collaboration performance and user perception of the robot.
arXiv Detail & Related papers (2020-07-24T23:35:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.