Spatio-temporal Attention Model for Tactile Texture Recognition
- URL: http://arxiv.org/abs/2008.04442v1
- Date: Mon, 10 Aug 2020 22:32:34 GMT
- Title: Spatio-temporal Attention Model for Tactile Texture Recognition
- Authors: Guanqun Cao, Yi Zhou, Danushka Bollegala and Shan Luo
- Abstract summary: We propose a novel Spatio-Temporal Attention Model (STAM) for tactile texture recognition.
The proposed STAM attends to both the spatial focus of each single tactile texture and the temporal correlation across a tactile sequence.
In experiments discriminating 100 different fabric textures, the spatially and temporally selective attention significantly improves recognition accuracy.
- Score: 25.06942319117782
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, tactile sensing has attracted great interest in robotics,
especially for facilitating exploration of unstructured environments and
effective manipulation. A detailed understanding of surface textures via
tactile sensing is essential for many of these tasks. Previous works on
texture recognition using camera-based tactile sensors have treated all
regions of a tactile image, or all samples in a tactile sequence, equally,
thereby including much irrelevant or redundant information. In this paper, we
propose a novel Spatio-Temporal Attention Model (STAM) for tactile texture
recognition, which, to the best of our knowledge, is the first of its kind.
The proposed STAM attends to both the spatial focus of each single tactile
texture and the temporal correlation across a tactile sequence. In experiments
discriminating 100 different fabric textures, the spatially and temporally
selective attention improves recognition accuracy by up to 18.8% compared to
non-attention based models. Specifically, after introducing noisy data
collected before contact happens, the proposed STAM learns the salient
features efficiently and its accuracy increases by 15.23% on average over the
CNN-based baseline approach. The improved tactile texture perception can be
applied to facilitate robot tasks such as grasping and manipulation.
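To make the idea concrete, below is a minimal sketch in PyTorch of how spatial attention over locations in each tactile image can be combined with temporal attention over the frames of a touch sequence. This is not the authors' implementation: the module name SpatioTemporalAttention, the encoder sizes, the class count, and the input shape are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatioTemporalAttention(nn.Module):
    """Illustrative sketch: CNN frame encoder + spatial and temporal attention."""

    def __init__(self, channels: int = 32, num_classes: int = 100):
        super().__init__()
        # Small CNN encoder producing a feature map for each tactile frame.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        self.spatial_score = nn.Conv2d(channels, 1, kernel_size=1)  # score per location
        self.temporal_score = nn.Linear(channels, 1)                # score per frame
        self.classifier = nn.Linear(channels, num_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 3, H, W) tactile image sequence.
        b, t, c, h, w = frames.shape
        feats = self.encoder(frames.reshape(b * t, c, h, w))        # (b*t, C, h', w')
        # Spatial attention: softmax over locations within each frame,
        # then pool the feature map with those weights.
        scores = self.spatial_score(feats)                          # (b*t, 1, h', w')
        weights = F.softmax(scores.flatten(2), dim=-1).reshape_as(scores)
        frame_emb = (feats * weights).sum(dim=(2, 3)).reshape(b, t, -1)  # (b, t, C)
        # Temporal attention: softmax over frames, down-weighting
        # uninformative (e.g. pre-contact) samples in the sequence.
        alpha = F.softmax(self.temporal_score(frame_emb), dim=1)    # (b, t, 1)
        seq_emb = (frame_emb * alpha).sum(dim=1)                    # (b, C)
        return self.classifier(seq_emb)                             # (b, num_classes)

# Example: 2 sequences of 8 frames of 64x64 tactile images, 100 fabric classes.
logits = SpatioTemporalAttention()(torch.randn(2, 8, 3, 64, 64))

Under this reading, the temporal softmax is what would let such a model suppress uninformative pre-contact frames, which is consistent with the robustness to pre-contact noise reported in the abstract.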
Related papers
- What Matters for Active Texture Recognition With Vision-Based Tactile Sensors [17.019982978396122]
We formalize the active sampling problem in the context of tactile fabric recognition.
We investigate which components are crucial for quick and reliable texture recognition.
Our best approach reaches 90.0% in under 5 touches, highlighting that vision-based tactile sensors are highly effective for fabric texture recognition.
arXiv Detail & Related papers (2024-03-20T16:06:01Z)
- Keypoint Description by Symmetry Assessment -- Applications in Biometrics [49.547569925407814]
We present a model-based feature extractor to describe neighborhoods around keypoints by finite expansion.
The iso-curves of such functions are highly symmetric w.r.t. the origin (a keypoint) and the estimated parameters have well defined geometric interpretations.
arXiv Detail & Related papers (2023-11-03T00:49:25Z)
- Attention for Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control [12.302685367517718]
High-resolution tactile sensing can provide accurate information about local contact in contact-rich robotic tasks.
We study a new concept: tactile saliency for robot touch, inspired by the human touch attention mechanism from neuroscience.
arXiv Detail & Related papers (2023-07-26T21:19:45Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- VisTaNet: Attention Guided Deep Fusion for Surface Roughness Classification [0.0]
This paper presents a visual dataset that augments an existing tactile dataset.
We propose a novel deep fusion architecture that fuses visual and tactile data using four types of fusion strategies.
Our model achieves a significant improvement in surface roughness classification accuracy (97.22%) over tactile-only models.
arXiv Detail & Related papers (2022-09-18T09:37:06Z)
- Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- Active 3D Shape Reconstruction from Vision and Touch [66.08432412497443]
Humans build 3D understandings of the world through active object exploration, using jointly their senses of vision and touch.
In 3D shape reconstruction, most recent progress has relied on static datasets of limited sensory data such as RGB images, depth maps or haptic readings.
We introduce a system composed of: 1) a haptic simulator leveraging high spatial resolution vision-based tactile sensors for active touching of 3D objects; 2) a mesh-based 3D shape reconstruction model that relies on tactile or visuotactile priors to guide the shape exploration; and 3) a set of data-driven solutions with either tactile or visuotactile priors.
arXiv Detail & Related papers (2021-07-20T15:56:52Z)
- TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile Object Recognition [17.37142241982902]
New advances in flexible, event-driven, electronic skins may soon endow robots with touch perception capabilities similar to those of humans.
The unique features of such skins may render current deep learning approaches, such as convolutional feature extractors, unsuitable for tactile learning.
We propose a novel spiking graph neural network for event-based tactile object recognition.
arXiv Detail & Related papers (2020-08-01T03:35:15Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.