All the Feels: A dexterous hand with large-area tactile sensing
- URL: http://arxiv.org/abs/2210.15658v3
- Date: Tue, 27 Feb 2024 15:43:00 GMT
- Title: All the Feels: A dexterous hand with large-area tactile sensing
- Authors: Raunaq Bhirangi, Abigail DeFranco, Jacob Adkins, Carmel Majidi,
Abhinav Gupta, Tess Hellebrekers, Vikash Kumar
- Abstract summary: High cost and lack of reliability have precluded the widespread adoption of dexterous hands in robotics.
The lack of a viable tactile sensor capable of sensing over the entire area of the hand impedes the rich, low-level feedback that would improve learning of dexterous manipulation skills.
This paper introduces an inexpensive, modular, robust, and scalable platform -- the DManus.
- Score: 23.631099756265996
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: High cost and lack of reliability have precluded the widespread adoption of
dexterous hands in robotics. Furthermore, the lack of a viable tactile sensor
capable of sensing over the entire area of the hand impedes the rich, low-level
feedback that would improve learning of dexterous manipulation skills. This
paper introduces an inexpensive, modular, robust, and scalable platform -- the
DManus -- aimed at resolving these challenges while satisfying the large-scale
data collection capabilities demanded by deep robot learning paradigms. Studies
on human manipulation point to the criticality of low-level tactile feedback in
performing everyday dexterous tasks. The DManus comes with ReSkin sensing on
the entire surface of the palm as well as the fingertips. We demonstrate
the effectiveness of the fully integrated system in a tactile-aware task -- bin
picking and sorting. Code, documentation, design files, detailed assembly
instructions, trained models, task videos, and all supplementary materials
required to recreate the setup can be found on
https://sites.google.com/view/roboticsbenchmarks/platforms/dmanus.
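Since ReSkin sensing is magnetometer-based, a minimal, illustrative sketch of how per-taxel contact detection might be layered on top of ReSkin-style readings during a bin-picking grasp check is given below. This is not the authors' released code: the `read_tactile_frame` placeholder, array shapes, and the deviation threshold are assumptions chosen purely for illustration.

```python
import numpy as np

# Hypothetical placeholder: returns one frame of magnetometer readings for all
# taxels as a (num_taxels, 3) array (x, y, z field components per element).
def read_tactile_frame(num_taxels: int = 20) -> np.ndarray:
    return np.zeros((num_taxels, 3))  # replace with the real sensor interface

def calibrate_baseline(num_frames: int = 50) -> np.ndarray:
    """Average several no-contact frames to get a per-taxel baseline."""
    frames = np.stack([read_tactile_frame() for _ in range(num_frames)])
    return frames.mean(axis=0)

def contact_mask(frame: np.ndarray, baseline: np.ndarray,
                 threshold: float = 5.0) -> np.ndarray:
    """Flag taxels whose field deviation from baseline exceeds a threshold.

    The threshold is an assumed, sensor-specific constant; in practice it
    would be tuned per taxel from calibration data.
    """
    deviation = np.linalg.norm(frame - baseline, axis=1)
    return deviation > threshold

if __name__ == "__main__":
    baseline = calibrate_baseline()
    mask = contact_mask(read_tactile_frame(), baseline)
    # A simple grasp-success heuristic for bin picking: enough taxels in contact.
    print("object contacted:", mask.sum() >= 3)
```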
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z) - 3D-ViTac: Learning Fine-Grained Manipulation with Visuo-Tactile Sensing [18.189782619503074]
This paper introduces 3D-ViTac, a multi-modal sensing and learning system for robots.
Our system features tactile sensors equipped with dense sensing units, each covering an area of 3 mm².
We show that even low-cost robots can perform precise manipulations and significantly outperform vision-only policies.
arXiv Detail & Related papers (2024-10-31T16:22:53Z) - Learning Visuotactile Skills with Two Multifingered Hands [80.99370364907278]
We explore learning from human demonstrations using a bimanual system with multifingered hands and visuotactile data.
Our results mark a promising step forward in bimanual multifingered manipulation from visuotactile data.
arXiv Detail & Related papers (2024-04-25T17:59:41Z) - A model-free approach to fingertip slip and disturbance detection for
grasp stability inference [0.0]
We propose a method for assessing grasp stability using tactile sensing.
We use highly sensitive uSkin tactile sensors mounted on an Allegro hand to test and validate our method.
arXiv Detail & Related papers (2023-11-22T09:04:26Z) - RoboPianist: Dexterous Piano Playing with Deep Reinforcement Learning [61.10744686260994]
We introduce RoboPianist, a system that enables simulated anthropomorphic hands to learn an extensive repertoire of 150 piano pieces.
We additionally introduce an open-sourced environment, benchmark of tasks, interpretable evaluation metrics, and open challenges for future study.
arXiv Detail & Related papers (2023-04-09T03:53:05Z) - Learning to Detect Slip with Barometric Tactile Sensors and a Temporal
Convolutional Neural Network [7.346580429118843]
We present a learning-based method to detect slip using barometric tactile sensors.
We train a temporal convolutional neural network to detect slip, achieving high detection accuracies (a minimal illustrative sketch of this kind of classifier appears after this list).
We argue that barometric tactile sensing technology, combined with data-driven learning, is suitable for many manipulation tasks such as slip compensation.
arXiv Detail & Related papers (2022-02-19T08:21:56Z) - Bayesian Imitation Learning for End-to-End Mobile Manipulation [80.47771322489422]
Augmenting policies with additional sensor inputs, such as RGB + depth cameras, is a straightforward approach to improving robot perception capabilities.
We show that using the Variational Information Bottleneck to regularize convolutional neural networks improves generalization to held-out domains.
We demonstrate that our method is able to help close the sim-to-real gap and successfully fuse RGB and depth modalities.
arXiv Detail & Related papers (2022-02-15T17:38:30Z) - Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors [7.35805050004643]
We present a learning-based method to detect slip using barometric tactile sensors.
We are able to achieve slip detection accuracies of greater than 91%.
We show that barometric tactile sensing technology, combined with data-driven learning, is potentially suitable for many complex manipulation tasks.
arXiv Detail & Related papers (2021-03-24T19:29:03Z) - Task-relevant Representation Learning for Networked Robotic Perception [74.0215744125845]
This paper presents an algorithm to learn task-relevant representations of sensory data that are co-designed with a pre-trained robotic perception model's ultimate objective.
Our algorithm aggressively compresses robotic sensory data by up to 11x more than competing methods.
arXiv Detail & Related papers (2020-11-06T07:39:08Z) - OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
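Two of the related entries above describe training a temporal convolutional network on barometric tactile signals to detect slip. The following is a minimal, hypothetical sketch of such a classifier; the layer sizes, window length, and number of pressure channels are assumptions, not the published architecture.

```python
import torch
import torch.nn as nn

class SlipTCN(nn.Module):
    """Minimal 1-D temporal convolutional classifier for slip detection.

    Input: a window of barometric tactile readings shaped
    (batch, channels, time), e.g. 8 pressure taxels over 64 timesteps.
    Output: logits for two classes (no-slip, slip).
    """

    def __init__(self, channels: int = 8, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2, dilation=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over the time dimension
        )
        self.head = nn.Linear(hidden, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.net(x).squeeze(-1)  # (batch, hidden)
        return self.head(features)

if __name__ == "__main__":
    model = SlipTCN()
    window = torch.randn(4, 8, 64)  # 4 windows, 8 taxels, 64 timesteps
    logits = model(window)
    print(logits.shape)  # torch.Size([4, 2])
```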
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.