Exploiting High Quality Tactile Sensors for Simplified Grasping
- URL: http://arxiv.org/abs/2207.12360v1
- Date: Mon, 25 Jul 2022 17:19:37 GMT
- Title: Exploiting High Quality Tactile Sensors for Simplified Grasping
- Authors: Pedro Machado, T.M. McGinnity
- Abstract summary: We present a detailed analysis of the use of two different types of commercially available robotic fingertips.
We conclude that a simple algorithm based on a proportional controller will suffice for many grasping applications.
- Score: 1.713291434132985
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Robots are expected to grasp a wide range of objects varying in shape, weight
or material type. Providing robots with tactile capabilities similar to humans
is thus essential for applications involving human-to-robot or robot-to-robot
interactions, particularly in those situations where a robot is expected to
grasp and manipulate complex objects not previously encountered. A critical
aspect for successful object grasp and manipulation is the use of high-quality
fingertips equipped with multiple high-performance sensors, distributed
appropriately across a specific contact surface.
In this paper, we present a detailed analysis of the use of two different
types of commercially available robotic fingertips (BioTac and WTS-FT), each of
which is equipped with multiple sensors distributed across the fingertips'
contact surface. We further demonstrate that, due to the high performance of
the fingertips, a complex adaptive grasping algorithm is not required for
grasping of everyday objects. We conclude that a simple algorithm based on a
proportional controller will suffice for many grasping applications, provided
the relevant fingertips exhibit high sensitivity. In a quantified assessment,
we also demonstrate that, due in part to its sensor distribution, the
BioTac-based fingertip outperforms the WTS-FT device, enabling the lifting of
loads of up to 850 g, and that the simple proportional controller can adapt
the grasp even when the object is exposed to significant external vibrational
challenges.
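As a concrete illustration of the "simple algorithm" described above, the sketch below shows a proportional grasp controller driven by tactile pressure feedback: the fingers close at a velocity proportional to the error between a target contact pressure and the mean pressure read from the fingertip's sensor array. This is a minimal sketch, not the authors' implementation; the helper functions (read_mean_pressure, set_finger_velocity), the gain, and the set-point values are hypothetical placeholders rather than part of the BioTac or WTS-FT APIs.

```python
# Minimal sketch of a proportional grasp controller driven by tactile
# pressure feedback. NOT the authors' implementation: all identifiers
# (read_mean_pressure, set_finger_velocity, gains, set-points) are
# hypothetical placeholders, not BioTac or WTS-FT driver APIs.

import time

KP = 0.01               # proportional gain, illustrative value only
TARGET_PRESSURE = 50.0  # desired mean contact pressure (sensor-specific units)
MAX_SPEED = 0.05        # clamp on the finger closing speed, illustrative


def read_mean_pressure() -> float:
    """Placeholder: mean pressure over the fingertip's distributed sensors."""
    raise NotImplementedError("bind to the tactile-sensor driver in use")


def set_finger_velocity(speed: float) -> None:
    """Placeholder: command a finger closing (+) or opening (-) velocity."""
    raise NotImplementedError("bind to the gripper driver in use")


def proportional_grasp(rate_hz: float = 100.0) -> None:
    """Run the proportional grasp loop until interrupted.

    While the measured pressure is below the set-point the fingers keep
    closing; if it overshoots they back off; if the object slips and the
    pressure drops, the error grows and the grasp re-tightens.
    """
    dt = 1.0 / rate_hz
    while True:
        error = TARGET_PRESSURE - read_mean_pressure()
        # Proportional law: closing speed proportional to pressure error,
        # clamped to a safe range.
        speed = max(-MAX_SPEED, min(KP * error, MAX_SPEED))
        set_finger_velocity(speed)
        time.sleep(dt)
```

Because the control law depends only on the measured pressure error, a drop in contact pressure caused by slip or external vibration immediately re-tightens the grasp, mirroring the adaptive behaviour reported in the abstract.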
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Capturing complex hand movements and object interactions using machine learning-powered stretchable smart textile gloves [9.838013581109681]
Real-time tracking of dexterous hand movements has numerous applications in human-computer interaction, metaverse, robotics, and tele-health.
Here, we report accurate and dynamic tracking of articulated hand and finger movements using stretchable, washable smart gloves with embedded helical sensor yarns and inertial measurement units.
The sensor yarns have a high dynamic range, responding to strains from as low as 0.005% to as high as 155%, and remain stable during extensive use and washing cycles.
arXiv Detail & Related papers (2024-10-03T05:32:16Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for a lot of dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Bayesian Imitation Learning for End-to-End Mobile Manipulation [80.47771322489422]
Augmenting policies with additional sensor inputs, such as RGB + depth cameras, is a straightforward approach to improving robot perception capabilities.
We show that using the Variational Information Bottleneck to regularize convolutional neural networks improves generalization to held-out domains.
We demonstrate that our method is able to help close the sim-to-real gap and successfully fuse RGB and depth modalities.
arXiv Detail & Related papers (2022-02-15T17:38:30Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- AuraSense: Robot Collision Avoidance by Full Surface Proximity Detection [3.9770080498150224]
AuraSense is the first system to realize no-dead-spot proximity sensing for robot arms.
It requires only a single pair of piezoelectric transducers, and can easily be applied to off-the-shelf robots.
arXiv Detail & Related papers (2021-08-10T18:37:54Z)
- Under Pressure: Learning to Detect Slip with Barometric Tactile Sensors [7.35805050004643]
We present a learning-based method to detect slip using barometric tactile sensors.
We are able to achieve slip detection accuracies of greater than 91%.
We show that barometric tactile sensing technology, combined with data-driven learning, is potentially suitable for many complex manipulation tasks.
arXiv Detail & Related papers (2021-03-24T19:29:03Z)
- SensiX: A Platform for Collaborative Machine Learning on the Edge [69.1412199244903]
We present SensiX, a personal edge platform that stays between sensor data and sensing models.
We demonstrate its efficacy in developing motion and audio-based multi-device sensing systems.
Our evaluation shows that SensiX offers a 7-13% increase in overall accuracy and up to a 30% increase across different environment dynamics, at the expense of a 3 mW power overhead.
arXiv Detail & Related papers (2020-12-04T23:06:56Z)
- Human Haptic Gesture Interpretation for Robotic Systems [3.888848425698769]
Physical human-robot interactions (pHRI) are less efficient and communicative than human-human interactions.
A key reason is a lack of informative sense of touch in robotic systems.
This work presents four proposed touch gesture classes that cover the majority of the gesture characteristics identified in the literature.
arXiv Detail & Related papers (2020-12-03T14:33:57Z)
- DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor with Application to In-Hand Manipulation [16.54834671357377]
General purpose in-hand manipulation remains one of the unsolved challenges of robotics.
We introduce DIGIT, an inexpensive, compact, and high-resolution tactile sensor geared towards in-hand manipulation.
arXiv Detail & Related papers (2020-05-29T17:07:54Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of this information and is not responsible for any consequences arising from its use.