Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors
- URL: http://arxiv.org/abs/2103.13460v1
- Date: Wed, 24 Mar 2021 19:29:03 GMT
- Title: Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors
- Authors: Abhinav Grover, Christopher Grebe, Philippe Nadeau, Jonathan Kelly
- Abstract summary: We present a learning-based method to detect slip using barometric tactile sensors.
We are able to achieve slip detection accuracies of greater than 91%.
We show that barometric tactile sensing technology, combined with data-driven learning, is potentially suitable for many complex manipulation tasks.
- Score: 7.35805050004643
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The ability to perceive object slip through tactile feedback allows humans to
accomplish complex manipulation tasks including maintaining a stable grasp.
Despite the utility of tactile information for many robotics applications,
tactile sensors have yet to be widely deployed in industrial settings -- part
of the challenge lies in identifying slip and other key events from the tactile
data stream. In this paper, we present a learning-based method to detect slip
using barometric tactile sensors. These sensors have many desirable properties
including high reliability and durability, and are built from very inexpensive
components. We are able to achieve slip detection accuracies of greater than
91% while displaying robustness to the speed and direction of the slip motion.
Further, we test our detector on two robot manipulation tasks involving a
variety of common objects and demonstrate successful generalization to
real-world scenarios not seen during training. We show that barometric tactile
sensing technology, combined with data-driven learning, is potentially suitable
for many complex manipulation tasks such as slip compensation.
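The core recipe in the abstract — learn a classifier that maps a window of barometric pressure readings to a slip/no-slip label — can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's method: the synthetic signal model (slip adds high-frequency pressure jitter), the hand-crafted feature, and the threshold "learning" stand in for the paper's trained network and real sensor data.

```python
import random
import statistics

random.seed(0)

def pressure_signal(slip: bool, n: int = 100) -> list[float]:
    """Synthetic barometric reading (kPa). Assumption: a stable grasp
    gives a near-constant pressure, while slip adds high-frequency
    fluctuations on top of the baseline."""
    base = 101.3
    return [
        base
        + random.gauss(0, 0.02)                        # sensor noise
        + (0.5 * random.gauss(0, 1) if slip else 0.0)  # slip-induced jitter
        for _ in range(n)
    ]

def hf_energy(sig: list[float]) -> float:
    """Mean absolute first difference: a crude proxy for the
    high-frequency content that slip induces in the pressure stream."""
    return statistics.mean(abs(b - a) for a, b in zip(sig, sig[1:]))

# "Training": place a decision threshold midway between the class means,
# standing in for the paper's learned classifier.
stable = [hf_energy(pressure_signal(False)) for _ in range(50)]
slipping = [hf_energy(pressure_signal(True)) for _ in range(50)]
threshold = (statistics.mean(stable) + statistics.mean(slipping)) / 2

def detect_slip(sig: list[float]) -> bool:
    return hf_energy(sig) > threshold

# Evaluate on fresh, held-out samples.
tests = [(pressure_signal(s), s) for s in (False, True) * 25]
acc = statistics.mean(detect_slip(sig) == label for sig, label in tests)
print(f"accuracy: {acc:.2f}")
```

On this toy signal model the two classes are trivially separable; the paper's contribution is achieving comparable separation on real barometric sensor data across slip speeds and directions, where a learned temporal model replaces the single hand-picked feature above.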
Related papers
- Learning Visuotactile Skills with Two Multifingered Hands [80.99370364907278]
We explore learning from human demonstrations using a bimanual system with multifingered hands and visuotactile data.
Our results mark a promising step forward in bimanual multifingered manipulation from visuotactile data.
arXiv Detail & Related papers (2024-04-25T17:59:41Z)
- A model-free approach to fingertip slip and disturbance detection for grasp stability inference [0.0]
We propose a method for assessing grasp stability using tactile sensing.
We use highly sensitive Uskin tactile sensors mounted on an Allegro hand to test and validate our method.
arXiv Detail & Related papers (2023-11-22T09:04:26Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are now widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Learning to Detect Slip through Tactile Estimation of the Contact Force Field and its Entropy [6.739132519488627]
We introduce a physics-informed, data-driven approach to detect slip continuously in real time.
We employ the GelSight Mini, an optical tactile sensor, attached to custom-designed grippers to gather tactile data.
Our results show that the best classification algorithm achieves a high average accuracy of 95.61%.
arXiv Detail & Related papers (2023-03-02T03:16:21Z)
- TANDEM: Learning Joint Exploration and Decision Making with Tactile Sensors [15.418884994244996]
We focus on the process of guiding tactile exploration, and its interplay with task-related decision making.
We propose TANDEM, an architecture to learn efficient exploration strategies in conjunction with decision making.
We demonstrate this method on a tactile object recognition task, where a robot equipped with a touch sensor must explore and identify an object from a known set based on tactile feedback alone.
arXiv Detail & Related papers (2022-03-01T23:55:09Z)
- Learning to Detect Slip with Barometric Tactile Sensors and a Temporal Convolutional Neural Network [7.346580429118843]
We present a learning-based method to detect slip using barometric tactile sensors.
We train a temporal convolutional neural network to detect slip, achieving high detection accuracies.
We argue that barometric tactile sensing technology, combined with data-driven learning, is suitable for many manipulation tasks such as slip compensation.
arXiv Detail & Related papers (2022-02-19T08:21:56Z)
- Bayesian Imitation Learning for End-to-End Mobile Manipulation [80.47771322489422]
Augmenting policies with additional sensor inputs, such as RGB + depth cameras, is a straightforward approach to improving robot perception capabilities.
We show that using the Variational Information Bottleneck to regularize convolutional neural networks improves generalization to held-out domains.
We demonstrate that our method is able to help close the sim-to-real gap and successfully fuse RGB and depth modalities.
arXiv Detail & Related papers (2022-02-15T17:38:30Z)
- Leveraging distributed contact force measurements for slip detection: a physics-based approach enabled by a data-driven tactile sensor [5.027571997864706]
This paper describes a novel model-based slip detection pipeline that can predict possibly failing grasps in real-time.
A vision-based tactile sensor that accurately estimates distributed forces was integrated into a grasping setup composed of a six degrees-of-freedom cobot and a two-finger gripper.
Results show that the system can reliably predict slip while manipulating objects of different shapes, materials, and weights.
arXiv Detail & Related papers (2021-09-23T17:12:46Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- PyTouch: A Machine Learning Library for Touch Processing [68.32055581488557]
We present PyTouch, the first machine learning library dedicated to the processing of touch sensing signals.
PyTouch is designed to be modular, easy-to-use and provides state-of-the-art touch processing capabilities as a service.
We evaluate PyTouch on real-world data from several tactile sensors on touch processing tasks such as touch detection, slip and object pose estimations.
arXiv Detail & Related papers (2021-05-26T18:55:18Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.