Learning to Detect Slip with Barometric Tactile Sensors and a Temporal
Convolutional Neural Network
- URL: http://arxiv.org/abs/2202.09549v1
- Date: Sat, 19 Feb 2022 08:21:56 GMT
- Title: Learning to Detect Slip with Barometric Tactile Sensors and a Temporal
Convolutional Neural Network
- Authors: Abhinav Grover and Philippe Nadeau and Christopher Grebe and Jonathan
Kelly
- Abstract summary: We present a learning-based method to detect slip using barometric tactile sensors.
We train a temporal convolutional neural network to detect slip, achieving high detection accuracies.
We argue that barometric tactile sensing technology, combined with data-driven learning, is suitable for many manipulation tasks such as slip compensation.
- Score: 7.346580429118843
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The ability to perceive object slip via tactile feedback enables humans to
accomplish complex manipulation tasks including maintaining a stable grasp.
Despite the utility of tactile information for many applications, tactile
sensors have yet to be widely deployed in industrial robotics settings; part of
the challenge lies in identifying slip and other events from the tactile data
stream. In this paper, we present a learning-based method to detect slip using
barometric tactile sensors. These sensors have many desirable properties
including high durability and reliability, and are built from inexpensive,
off-the-shelf components. We train a temporal convolutional neural network to
detect slip, achieving high detection accuracies while displaying robustness to
the speed and direction of the slip motion. Further, we test our detector on
two manipulation tasks involving a variety of common objects and demonstrate
successful generalization to real-world scenarios not seen during training. We
argue that barometric tactile sensing technology, combined with data-driven
learning, is suitable for many manipulation tasks such as slip compensation.
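As a concrete illustration of the approach the abstract describes, the sketch below shows how a small temporal convolutional network (TCN) might classify windows of barometric taxel pressure readings as slip or no-slip. This is a minimal sketch under stated assumptions, not the authors' architecture: the taxel count, layer sizes, dilation schedule, window length, and pooling choice are all illustrative.

```python
# Hypothetical TCN slip detector: stacked dilated 1-D convolutions over a
# window of barometric taxel readings, followed by a binary classifier head.
import torch
import torch.nn as nn

class TCNSlipDetector(nn.Module):
    def __init__(self, num_taxels: int = 8, hidden: int = 32):
        super().__init__()
        # Increasing dilation grows the temporal receptive field while
        # keeping the sequence length unchanged (padding matches dilation).
        self.net = nn.Sequential(
            nn.Conv1d(num_taxels, hidden, kernel_size=3, dilation=1, padding=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, dilation=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, dilation=4, padding=4),
            nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 2)  # logits: [no-slip, slip]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_taxels, window_length) normalized pressure readings
        h = self.net(x)          # (batch, hidden, window_length)
        h = h.mean(dim=-1)       # average-pool over the time axis
        return self.head(h)      # (batch, 2)

# Example: classify a batch of 100-sample windows from 8 barometric taxels.
model = TCNSlipDetector()
window = torch.randn(4, 8, 100)  # stand-in for real sensor data
slip_prob = model(window).softmax(dim=-1)[:, 1]
```

In practice such a detector would be trained on labeled slip / no-slip windows with cross-entropy loss; the pooling over time makes it tolerant of where within the window the slip event occurs.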
Related papers
- Learning Visuotactile Skills with Two Multifingered Hands [80.99370364907278]
We explore learning from human demonstrations using a bimanual system with multifingered hands and visuotactile data.
Our results mark a promising step forward in bimanual multifingered manipulation from visuotactile data.
arXiv Detail & Related papers (2024-04-25T17:59:41Z) - A model-free approach to fingertip slip and disturbance detection for
grasp stability inference [0.0]
We propose a method for assessing grasp stability using tactile sensing.
We use highly sensitive uSkin tactile sensors mounted on an Allegro hand to test and validate our method.
arXiv Detail & Related papers (2023-11-22T09:04:26Z) - Agile gesture recognition for capacitive sensing devices: adapting
on-the-job [55.40855017016652]
We demonstrate a hand gesture recognition system that uses signals from capacitive sensors embedded into the etee hand controller.
The controller generates real-time signals from each of the wearer's five fingers.
We use a machine learning technique to analyse the time-series signals and identify three features that can represent the five fingers within 500 ms (a rough sketch of this windowing step follows this entry).
arXiv Detail & Related papers (2023-05-12T17:24:02Z) - Recognizing Complex Gestures on Minimalistic Knitted Sensors: Toward
- Recognizing Complex Gestures on Minimalistic Knitted Sensors: Toward Real-World Interactive Systems [0.13048920509133805]
Our digitally-knitted capacitive active sensors can be manufactured at scale with little human intervention.
This work advances the capabilities of such sensors by creating the foundation for an interactive gesture recognition system.
arXiv Detail & Related papers (2023-03-18T04:57:46Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Learning to Detect Slip through Tactile Estimation of the Contact Force Field and its Entropy [6.739132519488627]
We introduce a physics-informed, data-driven approach to detect slip continuously in real time.
We employ the GelSight Mini, an optical tactile sensor, attached to custom-designed grippers to gather tactile data.
Our results show that the best classification algorithm achieves a high average accuracy of 95.61% (a sketch of the entropy feature follows this entry).
arXiv Detail & Related papers (2023-03-02T03:16:21Z) - Leveraging distributed contact force measurements for slip detection: a
- Leveraging distributed contact force measurements for slip detection: a physics-based approach enabled by a data-driven tactile sensor [5.027571997864706]
This paper describes a novel model-based slip detection pipeline that can predict possibly failing grasps in real-time.
A vision-based tactile sensor that accurately estimates distributed forces was integrated into a grasping setup composed of a six degrees-of-freedom cobot and a two-finger gripper.
Results show that the system can reliably predict slip while manipulating objects of different shapes, materials, and weights.
arXiv Detail & Related papers (2021-09-23T17:12:46Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors [7.35805050004643]
We present a learning-based method to detect slip using barometric tactile sensors.
We are able to achieve slip detection accuracies of greater than 91%.
We show that barometric tactile sensing technology, combined with data-driven learning, is potentially suitable for many complex manipulation tasks.
arXiv Detail & Related papers (2021-03-24T19:29:03Z) - Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision
Action Recognition [131.6328804788164]
We propose a framework, named Semantics-aware Adaptive Knowledge Distillation Networks (SAKDN), to enhance action recognition in the vision-sensor modality (videos).
The SAKDN uses multiple wearable sensors as teacher modalities and RGB videos as the student modality.
arXiv Detail & Related papers (2020-09-01T03:38:31Z) - OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.