OmniTact: A Multi-Directional High Resolution Touch Sensor
- URL: http://arxiv.org/abs/2003.06965v1
- Date: Mon, 16 Mar 2020 01:31:29 GMT
- Title: OmniTact: A Multi-Directional High Resolution Touch Sensor
- Authors: Akhil Padmanabha, Frederik Ebert, Stephen Tian, Roberto Calandra,
Chelsea Finn, Sergey Levine
- Abstract summary: Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
- Score: 109.28703530853542
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Incorporating touch as a sensing modality for robots can enable finer and
more robust manipulation skills. Existing tactile sensors are either flat, have
small sensitive fields or only provide low-resolution signals. In this paper,
we introduce OmniTact, a multi-directional high-resolution tactile sensor.
OmniTact is designed to be used as a fingertip for robotic manipulation with
robotic hands, and uses multiple micro-cameras to detect multi-directional
deformations of a gel-based skin. This provides a rich signal from which a
variety of different contact state variables can be inferred using modern image
processing and computer vision methods. We evaluate the capabilities of
OmniTact on a challenging robotic control task that requires inserting an
electrical connector into an outlet, as well as a state estimation problem that
is representative of those typically encountered in dexterous robotic
manipulation, where the goal is to infer the angle of contact of a curved
finger pressing against an object. Both tasks are performed using only touch
sensing and deep convolutional neural networks to process images from the
sensor's cameras. We compare with a state-of-the-art tactile sensor that is
only sensitive on one side, as well as a state-of-the-art multi-directional
tactile sensor, and find that OmniTact's combination of high-resolution and
multi-directional sensing is crucial for reliably inserting the electrical
connector and allows for higher accuracy in the state estimation task. Videos
and supplementary material can be found at
https://sites.google.com/berkeley.edu/omnitact
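Both evaluation tasks in the abstract reduce to feeding the images from the sensor's cameras through a deep convolutional network. Below is a minimal, hypothetical PyTorch sketch of the state-estimation variant (regressing the contact angle from stacked camera images); the camera count, image resolution, and architecture are assumptions for illustration, not the authors' model.

```python
# Minimal sketch (not the paper's model): regress contact angle from the
# images of a multi-camera tactile sensor by stacking them along channels.
import torch
import torch.nn as nn

NUM_CAMERAS = 5   # assumed number of micro-cameras
IMG_SIZE = 64     # assumed per-camera image resolution

class ContactAngleCNN(nn.Module):
    def __init__(self, num_cameras: int = NUM_CAMERAS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3 * num_cameras, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, 1)  # single scalar: contact angle

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        # images: (batch, num_cameras, 3, H, W) -> stack cameras along channels
        b, c, ch, h, w = images.shape
        x = images.reshape(b, c * ch, h, w)
        return self.head(self.features(x).flatten(1)).squeeze(-1)

# Example training step with an L1 regression loss on labeled contact angles.
model = ContactAngleCNN()
images = torch.randn(8, NUM_CAMERAS, 3, IMG_SIZE, IMG_SIZE)  # fake batch
angles = torch.rand(8) * 90.0                                # fake labels
loss = nn.functional.l1_loss(model(images), angles)
loss.backward()
```

Stacking the per-camera images along the channel dimension is only one simple way to fuse the multi-directional views; the abstract does not prescribe this particular fusion.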
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Binding Touch to Everything: Learning Unified Multimodal Tactile Representations [29.76008953177392]
We introduce UniTouch, a unified model for vision-based touch sensors connected to multiple modalities.
We achieve this by aligning our UniTouch embeddings to pretrained image embeddings already associated with a variety of other modalities.
We further propose learnable sensor-specific tokens, allowing the model to learn from a set of heterogeneous tactile sensors.
arXiv Detail & Related papers (2024-01-31T18:59:57Z)
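The UniTouch entry above rests on aligning touch embeddings with pretrained image embeddings. A minimal sketch of that kind of alignment, assuming a generic InfoNCE-style objective and illustrative names (not the UniTouch code), might look like this:

```python
# Hedged sketch of embedding alignment (illustrative, not the UniTouch code):
# pull each touch embedding toward the frozen image embedding of its paired view.
import torch
import torch.nn.functional as F

def alignment_loss(touch_emb: torch.Tensor, image_emb: torch.Tensor,
                   temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style loss between paired touch and (frozen) image embeddings."""
    touch_emb = F.normalize(touch_emb, dim=-1)
    image_emb = F.normalize(image_emb, dim=-1)
    logits = touch_emb @ image_emb.t() / temperature  # (B, B) similarities
    targets = torch.arange(touch_emb.size(0))         # diagonal pairs match
    return F.cross_entropy(logits, targets)

# Sensor-specific tokens could be modeled as learnable vectors added to the
# touch encoder input, one per sensor type (again, an assumption).
sensor_tokens = torch.nn.Embedding(num_embeddings=4, embedding_dim=256)
```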
- Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation [20.713880984921385]
Evetac is an event-based optical tactile sensor.
We develop touch processing algorithms to process its measurements online at 1000 Hz.
Evetac's output and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models.
arXiv Detail & Related papers (2023-12-02T22:01:49Z)
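The Evetac entry pairs 1000 Hz event-based measurements and marker tracking with learned slip detection. As a purely illustrative stand-in for such a detector (not the paper's learned model), one can threshold the average marker displacement rate between consecutive tracking updates:

```python
# Toy slip heuristic (an assumption, not Evetac's learned detector):
# flag slip when tracked gel markers move faster than a threshold.
import numpy as np

def detect_slip(prev_markers: np.ndarray, curr_markers: np.ndarray,
                dt: float = 1e-3, speed_threshold: float = 50.0) -> bool:
    """prev_markers/curr_markers: (N, 2) marker positions in pixels; dt in seconds."""
    speeds = np.linalg.norm(curr_markers - prev_markers, axis=1) / dt  # px/s
    return float(np.mean(speeds)) > speed_threshold

# Example at a 1000 Hz update rate (dt = 1 ms).
prev = np.random.rand(30, 2) * 100
curr = prev + np.random.randn(30, 2) * 0.01
print(detect_slip(prev, curr))
```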
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- See, Hear, and Feel: Smart Sensory Fusion for Robotic Manipulation [49.925499720323806]
We study how visual, auditory, and tactile perception can jointly help robots to solve complex manipulation tasks.
We build a robot system that can see with a camera, hear with a contact microphone, and feel with a vision-based tactile sensor.
arXiv Detail & Related papers (2022-12-07T18:55:53Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
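The EIP entry models the tactile sensor as coordinated particles whose deformation is regulated by elasticity. A heavily reduced sketch of that idea, assuming a linear spring pulling each particle back to its rest position (not the paper's formulation), is shown below:

```python
# Reduced sketch of elastic particle regulation (not the EIP implementation):
# each particle is pulled back toward its rest position by a linear spring.
import numpy as np

def elastic_step(positions: np.ndarray, rest_positions: np.ndarray,
                 external_force: np.ndarray, stiffness: float = 10.0,
                 dt: float = 1e-2) -> np.ndarray:
    """One quasi-static integration step from zero velocity (unit mass assumed)."""
    spring_force = -stiffness * (positions - rest_positions)
    acceleration = spring_force + external_force
    return positions + 0.5 * acceleration * dt ** 2

# A flat patch of gel particles indented by a downward contact force.
rest = np.stack(np.meshgrid(np.arange(5.0), np.arange(5.0), indexing="ij"),
                axis=-1).reshape(-1, 2)
contact = np.zeros_like(rest)
contact[12] = [0.0, -1.0]                 # push the center particle
deformed = elastic_step(rest.copy(), rest, contact)
```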
- PyTouch: A Machine Learning Library for Touch Processing [68.32055581488557]
We present PyTouch, the first machine learning library dedicated to the processing of touch sensing signals.
PyTouch is designed to be modular and easy to use, and it provides state-of-the-art touch processing capabilities as a service.
We evaluate PyTouch on real-world data from several tactile sensors on touch processing tasks such as touch detection, slip detection, and object pose estimation.
arXiv Detail & Related papers (2021-05-26T18:55:18Z)
- DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor with Application to In-Hand Manipulation [16.54834671357377]
General purpose in-hand manipulation remains one of the unsolved challenges of robotics.
We introduce DIGIT, an inexpensive, compact, and high-resolution tactile sensor geared towards in-hand manipulation.
arXiv Detail & Related papers (2020-05-29T17:07:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.