DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile
Sensor with Application to In-Hand Manipulation
- URL: http://arxiv.org/abs/2005.14679v1
- Date: Fri, 29 May 2020 17:07:54 GMT
- Authors: Mike Lambeta and Po-Wei Chou and Stephen Tian and Brian Yang and
Benjamin Maloon and Victoria Rose Most and Dave Stroud and Raymond Santos and
Ahmad Byagowi and Gregg Kammerer and Dinesh Jayaraman and Roberto Calandra
- Abstract summary: General purpose in-hand manipulation remains one of the unsolved challenges of robotics.
We introduce DIGIT, an inexpensive, compact, and high-resolution tactile sensor geared towards in-hand manipulation.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite decades of research, general purpose in-hand manipulation remains one
of the unsolved challenges of robotics. One of the contributing factors that
limit current robotic manipulation systems is the difficulty of precisely
sensing contact forces -- sensing and reasoning about contact forces are
crucial to accurately control interactions with the environment. As a step
towards enabling better robotic manipulation, we introduce DIGIT, an
inexpensive, compact, and high-resolution tactile sensor geared towards in-hand
manipulation. DIGIT improves upon past vision-based tactile sensors by
miniaturizing the form factor to be mountable on multi-fingered hands, and by
providing several design improvements that result in an easier, more repeatable
manufacturing process, and enhanced reliability. We demonstrate the
capabilities of the DIGIT sensor by training deep neural network model-based
controllers to manipulate glass marbles in-hand with a multi-finger robotic
hand. To provide the robotic community access to reliable and low-cost tactile
sensors, we open-source the DIGIT design at https://digit.ml/.
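As a rough illustration of how the image stream from a vision-based tactile sensor such as DIGIT might be consumed, the sketch below detects contact by differencing a live frame against a no-contact reference frame. This is a hypothetical minimal example, not the paper's method: frame acquisition via the open-sourced DIGIT drivers is omitted, frames are modeled as plain 2D grayscale lists, and the threshold and pixel-count values are made-up parameters.

```python
# Hypothetical sketch: contact detection on a vision-based tactile sensor
# by differencing the current frame against a no-contact reference frame.
# Frames are 2D lists of grayscale values; real DIGIT frames would come
# from the sensor's camera stream instead.

def contact_mask(frame, reference, threshold=20):
    """Binary mask marking pixels that deviate noticeably from the reference."""
    return [
        [1 if abs(p - r) > threshold else 0 for p, r in zip(row, ref_row)]
        for row, ref_row in zip(frame, reference)
    ]

def in_contact(frame, reference, threshold=20, min_pixels=5):
    """Declare contact when enough pixels differ from the reference frame."""
    mask = contact_mask(frame, reference, threshold)
    return sum(sum(row) for row in mask) >= min_pixels

# Synthetic example: a 4x4 sensor with a pressed region in one corner.
reference = [[100] * 4 for _ in range(4)]
pressed = [row[:] for row in reference]
for i in range(2):
    for j in range(3):
        pressed[i][j] = 160  # gel deformation brightens these 6 pixels
```

In practice, frame differencing like this is only a crude first step; the paper's controllers instead learn directly from the raw tactile images.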
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Learning Visuotactile Skills with Two Multifingered Hands [80.99370364907278]
We explore learning from human demonstrations using a bimanual system with multifingered hands and visuotactile data.
Our results mark a promising step forward in bimanual multifingered manipulation from visuotactile data.
arXiv Detail & Related papers (2024-04-25T17:59:41Z)
- Recognizing Complex Gestures on Minimalistic Knitted Sensors: Toward Real-World Interactive Systems [0.13048920509133805]
Our digitally-knitted capacitive active sensors can be manufactured at scale with little human intervention.
This work advances the capabilities of such sensors by creating the foundation for an interactive gesture recognition system.
arXiv Detail & Related papers (2023-03-18T04:57:46Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Dexterous Manipulation from Images: Autonomous Real-World RL via Substep Guidance [71.36749876465618]
We describe a system for vision-based dexterous manipulation that provides a "programming-free" approach for users to define new tasks.
Our system includes a framework for users to define a final task and intermediate sub-tasks with image examples.
We present experimental results with a four-finger robotic hand learning multi-stage object manipulation tasks directly in the real world.
arXiv Detail & Related papers (2022-12-19T22:50:40Z)
- Exploiting High Quality Tactile Sensors for Simplified Grasping [1.713291434132985]
We present a detailed analysis of the use of two different types of commercially available robotic fingertips.
We conclude that a simple algorithm based on a proportional controller will suffice for many grasping applications.
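The proportional-control claim above can be sketched as follows. This is a toy illustration under an assumed linear contact model, not the cited paper's implementation; the gain, stiffness, and target force are made-up parameters chosen only to show the feedback loop converging.

```python
# Hypothetical sketch of a proportional grasp controller: adjust the grip
# command each control tick in proportion to the force error. The plant
# model (force = stiffness * grip) is an illustrative assumption.

def p_controller_step(force_measured, force_target, grip_cmd, kp=0.005):
    """One proportional update of the grip command toward the target force."""
    error = force_target - force_measured
    return grip_cmd + kp * error

def simulate_grasp(force_target=2.0, steps=50, stiffness=100.0):
    """Toy closed loop: contact force assumed linear in grip closure."""
    grip = 0.0
    for _ in range(steps):
        force = stiffness * grip  # assumed linear contact model
        grip = p_controller_step(force, force_target, grip)
    return stiffness * grip  # converges toward force_target
```

With these values the loop is a contraction (each step halves the force error), so the measured force settles at the target after a few dozen ticks.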
arXiv Detail & Related papers (2022-07-25T17:19:37Z)
- Bayesian Imitation Learning for End-to-End Mobile Manipulation [80.47771322489422]
Augmenting policies with additional sensor inputs, such as RGB + depth cameras, is a straightforward approach to improving robot perception capabilities.
We show that using the Variational Information Bottleneck to regularize convolutional neural networks improves generalization to held-out domains.
We demonstrate that our method is able to help close the sim-to-real gap and successfully fuse RGB and depth modalities.
arXiv Detail & Related papers (2022-02-15T17:38:30Z)
- AuraSense: Robot Collision Avoidance by Full Surface Proximity Detection [3.9770080498150224]
AuraSense is the first system to realize no-dead-spot proximity sensing for robot arms.
It requires only a single pair of piezoelectric transducers, and can easily be applied to off-the-shelf robots.
arXiv Detail & Related papers (2021-08-10T18:37:54Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.