Attention for Robot Touch: Tactile Saliency Prediction for Robust
Sim-to-Real Tactile Control
- URL: http://arxiv.org/abs/2307.14510v2
- Date: Wed, 2 Aug 2023 09:42:58 GMT
- Title: Attention for Robot Touch: Tactile Saliency Prediction for Robust
Sim-to-Real Tactile Control
- Authors: Yijiong Lin, Mauro Comi, Alex Church, Dandan Zhang, Nathan F. Lepora
- Abstract summary: High-resolution tactile sensing can provide accurate information about local contact in contact-rich robotic tasks.
We study a new concept: tactile saliency for robot touch, inspired by the human touch attention mechanism from neuroscience.
- Score: 12.302685367517718
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: High-resolution tactile sensing can provide accurate information about local
contact in contact-rich robotic tasks. However, the deployment of such tasks in
unstructured environments remains under-investigated. To improve the robustness
of tactile robot control in unstructured environments, we propose and study a
new concept: tactile saliency for robot touch, inspired by the human
touch attention mechanism from neuroscience and the visual saliency prediction
problem from computer vision. In analogy to visual saliency, this concept
involves identifying key information in tactile images captured by a tactile
sensor. While visual saliency datasets are commonly annotated by humans,
manually labelling tactile images is challenging due to their counterintuitive
patterns. To address this challenge, we propose a novel approach comprised of
three interrelated networks: 1) a Contact Depth Network (ConDepNet), which
generates a contact depth map to localize deformation in a real tactile image
that contains target and noise features; 2) a Tactile Saliency Network
(TacSalNet), which predicts a tactile saliency map to describe the target areas
for an input contact depth map; and 3) a Tactile Noise Generator (TacNGen),
which generates noise features to train the TacSalNet. Experimental results in
contact pose estimation and edge-following in the presence of distractors
showcase the accurate prediction of target features from real tactile images.
Overall, our tactile saliency prediction approach gives robust sim-to-real
tactile control in environments with unknown distractors. Project page:
https://sites.google.com/view/tactile-saliency/.
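The abstract describes a three-stage composition: ConDepNet maps a real tactile image to a contact depth map, TacSalNet maps that depth map to a saliency map of the target features, and TacNGen synthesizes distractor noise used to train TacSalNet. The sketch below illustrates this composition only; the network names follow the paper, but the tiny encoder-decoder layers, the 128x128 single-channel images, and the Gaussian-blob noise model are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the three-network pipeline named in the abstract:
# ConDepNet -> contact depth map, TacSalNet -> saliency map, TacNGen -> synthetic
# distractor noise for training TacSalNet. All sizes and layers are assumptions.
import torch
import torch.nn as nn


def small_encoder_decoder(in_ch: int = 1, out_ch: int = 1) -> nn.Sequential:
    """Tiny fully-convolutional stand-in; the abstract does not specify architectures."""
    return nn.Sequential(
        nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, out_ch, 3, padding=1), nn.Sigmoid(),
    )


condepnet = small_encoder_decoder()   # 1) real tactile image -> contact depth map
tacsalnet = small_encoder_decoder()   # 2) contact depth map  -> tactile saliency map


def tacngen(clean_depth: torch.Tensor, n_blobs: int = 3) -> torch.Tensor:
    """3) Stand-in noise generator: add Gaussian 'distractor' blobs to a clean
    (e.g. simulated) contact depth map, giving noisy training inputs for TacSalNet."""
    noisy = clean_depth.clone()
    _, _, h, w = noisy.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    for _ in range(n_blobs):
        cy = torch.randint(0, h, (1,)).item()
        cx = torch.randint(0, w, (1,)).item()
        blob = torch.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * 5.0 ** 2))
        noisy = noisy + 0.5 * blob        # spurious contact from a distractor
    return noisy.clamp(0.0, 1.0)


def predict_saliency(real_tactile_image: torch.Tensor) -> torch.Tensor:
    """Inference path: tactile image -> contact depth map -> saliency map."""
    return tacsalnet(condepnet(real_tactile_image))


if __name__ == "__main__":
    img = torch.rand(1, 1, 128, 128)       # placeholder tactile image
    print(predict_saliency(img).shape)     # torch.Size([1, 1, 128, 128])
```

In this reading, TacNGen matters only at training time: TacSalNet learns to map noisy depth maps back to the clean target-only maps, which is what lets it suppress unknown distractors at deployment.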
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- DexTouch: Learning to Seek and Manipulate Objects with Tactile Dexterity [12.508332341279177]
We introduce a multi-finger robot system designed to search for and manipulate objects using the sense of touch.
To achieve this, binary tactile sensors are implemented on one side of the robot hand to minimize the Sim2Real gap.
We demonstrate that object search and manipulation using tactile sensors is possible even in an environment without vision information.
arXiv Detail & Related papers (2024-01-23T05:37:32Z)
- Controllable Visual-Tactile Synthesis [28.03469909285511]
We develop a conditional generative model that synthesizes both visual and tactile outputs from a single sketch.
We then introduce a pipeline to render high-quality visual and tactile outputs on an electroadhesion-based haptic device.
arXiv Detail & Related papers (2023-05-04T17:59:51Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Touch and Go: Learning from Human-Collected Vision and Touch [16.139106833276]
We propose a dataset with paired visual and tactile data called Touch and Go.
Human data collectors probe objects in natural environments using tactile sensors.
Our dataset spans a large number of "in the wild" objects and scenes.
arXiv Detail & Related papers (2022-11-22T18:59:32Z)
- Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, with an elastic property that regulates the deformation of the particles during contact (a toy particle-grid sketch of this idea follows the list below).
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- Active 3D Shape Reconstruction from Vision and Touch [66.08432412497443]
Humans build 3D understandings of the world through active object exploration, using jointly their senses of vision and touch.
In 3D shape reconstruction, most recent progress has relied on static datasets of limited sensory data such as RGB images, depth maps or haptic readings.
We introduce a system composed of: 1) a haptic simulator leveraging high spatial resolution vision-based tactile sensors for active touching of 3D objects; 2) a mesh-based 3D shape reconstruction model that relies on tactile or visuotactile priors to guide the shape exploration; and 3) a set of data-driven solutions with either tactile or visuotactile priors.
arXiv Detail & Related papers (2021-07-20T15:56:52Z)
- Spatio-temporal Attention Model for Tactile Texture Recognition [25.06942319117782]
We propose a novel Spatio-Temporal Attention Model (STAM) for tactile texture recognition.
The proposed STAM attends to both the spatial focus within each single tactile texture and the temporal correlation across a tactile sequence.
In experiments discriminating 100 different fabric textures, the spatially and temporally selective attention significantly improves the recognition accuracy.
arXiv Detail & Related papers (2020-08-10T22:32:34Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
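As referenced in the Elastic Tactile Simulation entry above, the toy sketch below illustrates the general idea of treating the sensor surface as a grid of coordinated particles whose deformation under an indenter is regulated by an elastic coupling between neighbours. The grid size, hemispherical indenter, and stiffness value are assumptions for illustration; this is not the EIP method itself.

```python
# Toy particle-grid view of a tactile membrane: an indenter pushes particles down,
# and an elastic coupling between neighbours regulates how the deformation spreads.
import numpy as np

H, W = 32, 32
z = np.zeros((H, W))                               # particle heights (0 = rest)

yy, xx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
r2 = (yy - H / 2) ** 2 + (xx - W / 2) ** 2
indent = np.maximum(0.0, 3.0 - r2 / 20.0)          # how deep the indenter intrudes

k = 0.25                                            # elastic coupling strength (assumed)
for _ in range(200):                                # simple relaxation loop
    z = np.minimum(z, -indent)                      # contact: particles stay below indenter
    neighbour_mean = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
                      np.roll(z, 1, 1) + np.roll(z, -1, 1)) / 4.0
    z = (1 - k) * z + k * neighbour_mean            # elastic smoothing between particles
    z[0, :] = z[-1, :] = z[:, 0] = z[:, -1] = 0.0   # membrane clamped at its edges

print("max indentation depth:", -z.min())           # -z plays the role of a depth image
```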