Amplifying robotics capacities with a human touch: An immersive
low-latency panoramic remote system
- URL: http://arxiv.org/abs/2401.03398v2
- Date: Tue, 9 Jan 2024 04:09:56 GMT
- Title: Amplifying robotics capacities with a human touch: An immersive
low-latency panoramic remote system
- Authors: Junjie Li, Kang Li, Dewei Han, Jian Xu and Zhaoyuan Ma
- Abstract summary: The "Avatar" system is an immersive low-latency panoramic human-robot interaction platform.
Under favorable network conditions, we achieved a low-latency high-definition panoramic visual experience with a delay of 357ms.
The system enables remote control over vast physical distances, spanning campuses, provinces, countries, and even continents.
- Score: 16.97496024217201
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: AI and robotics technologies have witnessed remarkable advancements in the
past decade, revolutionizing work patterns and opportunities in various
domains. The application of these technologies has propelled society towards an
era of symbiosis between humans and machines. To facilitate efficient
communication between humans and intelligent robots, we propose the "Avatar"
system, an immersive low-latency panoramic human-robot interaction platform. We
have designed and tested a prototype of a rugged mobile platform integrated
with edge computing units, panoramic video capture devices, power batteries,
robot arms, and network communication equipment. Under favorable network
conditions, we achieved a low-latency high-definition panoramic visual
experience with a delay of 357ms. Operators can utilize VR headsets and
controllers for real-time immersive control of robots and devices. The system
enables remote control over vast physical distances, spanning campuses,
provinces, countries, and even continents (New York to Shenzhen). Additionally,
the system incorporates visual SLAM technology for map and trajectory
recording, providing autonomous navigation capabilities. We believe that this
intuitive system platform can enhance efficiency and situational experience in
human-robot collaboration, and with further advancements in related
technologies, it will become a versatile tool for efficient and symbiotic
cooperation between AI and humans.
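The abstract only outlines the teleoperation pipeline, so the following minimal Python sketch illustrates the kind of pose-forwarding and delay-estimation loop such a platform requires. The port number, robot address, and wire format are hypothetical assumptions for exposition; the paper does not specify them.

```python
# Illustrative sketch only: the port, address, and wire format below are
# assumptions for exposition and are not taken from the "Avatar" paper.
import socket
import struct
import time

CONTROL_PORT = 9000  # hypothetical UDP port carrying VR controller poses

def send_controller_pose(sock: socket.socket, robot_addr, pose):
    """Pack a 6-DoF controller pose (x, y, z, roll, pitch, yaw) together with
    a send timestamp and forward it to the robot over UDP."""
    payload = struct.pack("!d6f", time.time(), *pose)
    sock.sendto(payload, robot_addr)

def estimate_one_way_delay(sock: socket.socket) -> float:
    """Estimate one-way delay from a timestamp the robot echoes back,
    assuming a roughly symmetric link (half the round-trip time)."""
    data, _ = sock.recvfrom(64)
    (sent_at,) = struct.unpack("!d", data[:8])
    return (time.time() - sent_at) / 2.0

if __name__ == "__main__":
    robot_addr = ("192.0.2.10", CONTROL_PORT)  # placeholder robot address
    ctrl_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # A fixed neutral pose; a real operator loop would instead poll the VR
    # runtime (e.g. an OpenXR session) at 60-90 Hz and stream every update.
    send_controller_pose(ctrl_sock, robot_addr, (0.0, 0.0, 0.3, 0.0, 0.0, 0.0))
```

Note that the reported 357 ms refers to the panoramic visual experience, i.e., the chain from capture through encoding, transmission, decoding, and VR display, so a faithful measurement would timestamp video frames at the camera rather than only control packets as in this sketch.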
Related papers
- Unifying 3D Representation and Control of Diverse Robots with a Single Camera [48.279199537720714]
We introduce Neural Jacobian Fields, an architecture that autonomously learns to model and control robots from vision alone.
Our approach achieves accurate closed-loop control and recovers the causal dynamic structure of each robot.
arXiv Detail & Related papers (2024-07-11T17:55:49Z)
- Open-TeleVision: Teleoperation with Immersive Active Visual Feedback [17.505318269362512]
Open-TeleVision allows operators to actively perceive the robot's surroundings in a stereoscopic manner.
The system mirrors the operator's arm and hand movements on the robot, creating an immersive experience.
We validate the effectiveness of our system by collecting data and training imitation learning policies on four long-horizon, precise tasks.
arXiv Detail & Related papers (2024-07-01T17:55:35Z)
- Giving Robots a Hand: Learning Generalizable Manipulation with Eye-in-Hand Human Video Demonstrations [66.47064743686953]
Eye-in-hand cameras have shown promise in enabling greater sample efficiency and generalization in vision-based robotic manipulation.
Videos of humans performing tasks, on the other hand, are much cheaper to collect since they eliminate the need for expertise in robotic teleoperation.
In this work, we augment narrow robotic imitation datasets with broad unlabeled human video demonstrations to greatly enhance the generalization of eye-in-hand visuomotor policies.
arXiv Detail & Related papers (2023-07-12T07:04:53Z)
- AnyTeleop: A General Vision-Based Dexterous Robot Arm-Hand Teleoperation System [51.48191418148764]
Vision-based teleoperation can endow robots with human-level intelligence to interact with the environment.
Current vision-based teleoperation systems are designed and engineered towards a particular robot model and deployment environment.
We propose AnyTeleop, a unified and general teleoperation system to support multiple different arms, hands, realities, and camera configurations within a single system.
arXiv Detail & Related papers (2023-07-10T14:11:07Z)
- See, Hear, and Feel: Smart Sensory Fusion for Robotic Manipulation [49.925499720323806]
We study how visual, auditory, and tactile perception can jointly help robots to solve complex manipulation tasks.
We build a robot system that can see with a camera, hear with a contact microphone, and feel with a vision-based tactile sensor.
arXiv Detail & Related papers (2022-12-07T18:55:53Z)
- A Perspective on Robotic Telepresence and Teleoperation using Cognition: Are we there yet? [0.0]
With the Artificial Intelligence (AI) revolution already underway, a wide range of robotic applications is being realized.
These technologies find significant application in health care, education, surveillance, disaster recovery, and corporate/government sectors.
But questions still remain about their maturity, security, and safety levels.
arXiv Detail & Related papers (2022-03-06T13:10:00Z)
- Spatial Computing and Intuitive Interaction: Bringing Mixed Reality and Robotics Together [68.44697646919515]
This paper presents several human-robot systems that utilize spatial computing to enable novel robot use cases.
The combination of spatial computing and egocentric sensing on mixed reality devices enables them to capture and understand human actions and translate these to actions with spatial meaning.
arXiv Detail & Related papers (2022-02-03T10:04:26Z)
- OpenBot: Turning Smartphones into Robots [95.94432031144716]
Current robots are either expensive or make significant compromises on sensory richness, computational power, and communication capabilities.
We propose to leverage smartphones to equip robots with extensive sensor suites, powerful computational abilities, state-of-the-art communication channels, and access to a thriving software ecosystem.
We design a small electric vehicle that costs $50 and serves as a robot body for standard Android smartphones.
arXiv Detail & Related papers (2020-08-24T18:04:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.