Learning In-Hand Translation Using Tactile Skin With Shear and Normal Force Sensing
- URL: http://arxiv.org/abs/2407.07885v1
- Date: Wed, 10 Jul 2024 17:52:30 GMT
- Title: Learning In-Hand Translation Using Tactile Skin With Shear and Normal Force Sensing
- Authors: Jessica Yin, Haozhi Qi, Jitendra Malik, James Pikul, Mark Yim, Tess Hellebrekers
- Abstract summary: We introduce a sensor model for tactile skin that enables zero-shot sim-to-real transfer of ternary shear and binary normal forces.
We conduct extensive real-world experiments to assess how tactile sensing facilitates policy adaptation to various unseen object properties.
- Score: 43.269672740168396
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent progress in reinforcement learning (RL) and tactile sensing has significantly advanced dexterous manipulation. However, these methods often utilize simplified tactile signals due to the gap between tactile simulation and the real world. We introduce a sensor model for tactile skin that enables zero-shot sim-to-real transfer of ternary shear and binary normal forces. Using this model, we develop an RL policy that leverages sliding contact for dexterous in-hand translation. We conduct extensive real-world experiments to assess how tactile sensing facilitates policy adaptation to various unseen object properties and robot hand orientations. We demonstrate that our 3-axis tactile policies consistently outperform baselines that use only shear forces, only normal forces, or only proprioception. Website: https://jessicayin.github.io/tactile-skin-rl/
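As a rough sketch of the sensor model described in the abstract, the snippet below quantizes a simulated 3-axis contact force into ternary shear and binary normal signals. The force thresholds and per-taxel handling are illustrative assumptions, not the paper's calibrated parameters.

```python
import numpy as np

# Illustrative thresholds (assumptions, not the paper's calibrated values).
SHEAR_THRESHOLD = 0.1    # N, per tangential axis
NORMAL_THRESHOLD = 0.05  # N

def discretize_taxel(force_xyz: np.ndarray) -> np.ndarray:
    """Map a simulated 3-axis contact force to the signal format the paper
    describes: ternary shear in {-1, 0, +1} for each tangential axis (x, y)
    and a binary contact flag for the normal axis (z)."""
    fx, fy, fz = force_xyz
    shear_x = np.sign(fx) if abs(fx) > SHEAR_THRESHOLD else 0.0
    shear_y = np.sign(fy) if abs(fy) > SHEAR_THRESHOLD else 0.0
    normal = 1.0 if fz > NORMAL_THRESHOLD else 0.0
    return np.array([shear_x, shear_y, normal])

# Example: a taxel under light contact while the object slides in -x.
print(discretize_taxel(np.array([-0.3, 0.02, 0.2])))  # -> [-1.  0.  1.]
```

Discretizing the simulated forces this way is what makes zero-shot transfer plausible: the policy only ever sees coarse signals that the real skin can reproduce reliably, sidestepping the gap between tactile simulation and real-world readings.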
Related papers
- AnySkin: Plug-and-play Skin Sensing for Robotic Touch [15.126846563910814]
We address three challenges that impede the use of tactile sensing: versatility, replaceability, and data reusability.
This work makes three key contributions: first, we introduce a streamlined fabrication process for creating an adhesive-free, durable and easily replaceable magnetic tactile sensor; second, we characterize slip detection and policy learning with the AnySkin sensor; and third, we demonstrate zero-shot generalization of models trained on one instance of AnySkin to new instances.
arXiv Detail & Related papers (2024-09-12T17:59:44Z)
- AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch [9.606323817785114]
We present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch.
Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction.
Rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.
arXiv Detail & Related papers (2024-05-12T22:51:35Z)
- VITaL Pretraining: Visuo-Tactile Pretraining for Tactile and Non-Tactile Manipulation Policies [8.187196813233362]
We show how tactile information can be incorporated into imitation learning platforms to improve performance on manipulation tasks.
We find that visuo-tactile pretraining improves imitation learning performance not only for tactile agents but also for non-tactile ones.
arXiv Detail & Related papers (2024-03-18T15:56:44Z)
- Rotating without Seeing: Towards In-hand Dexterity through Touch [43.87509744768282]
We present Touch Dexterity, a new system that performs in-hand object rotation using touch alone, without seeing the object.
Instead of relying on precise tactile sensing in a small region, we introduce a new system design using dense binary force sensors (touch or no touch) covering one side of the whole robot hand.
We train an in-hand rotation policy with reinforcement learning on diverse objects in simulation. Relying on touch-only sensing, we can deploy the policy directly on a real robot hand and rotate novel objects not seen during training.
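A minimal sketch of what such a touch-only policy input might look like, assuming a hypothetical taxel count and contact threshold; the real system's sensor layout and observation space may differ.

```python
import numpy as np

NUM_TAXELS = 64           # hypothetical taxel count for one side of the hand
CONTACT_THRESHOLD = 0.05  # N, an assumed contact cutoff

def touch_observation(contact_forces: np.ndarray, joint_pos: np.ndarray) -> np.ndarray:
    """Build a touch-only observation: a binary (touch / no touch) mask per
    taxel, concatenated with proprioception. No vision is used anywhere."""
    binary_mask = (contact_forces > CONTACT_THRESHOLD).astype(np.float32)
    return np.concatenate([binary_mask, joint_pos])

forces = np.random.uniform(0.0, 0.2, size=NUM_TAXELS)  # stand-in sim readings
joints = np.zeros(16)                                  # e.g., a 16-DoF hand
print(touch_observation(forces, joints).shape)         # (80,)
```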
arXiv Detail & Related papers (2023-03-20T05:38:30Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are now widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
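The name suggests a Bayesian filtering formulation; as a hedged illustration, the sketch below performs a generic belief update over candidate-part hypotheses, with a placeholder likelihood standing in for whatever learned tactile observation model the paper actually uses.

```python
import numpy as np

def update_belief(belief: np.ndarray, likelihoods: np.ndarray) -> np.ndarray:
    """One Bayesian update: reweight part hypotheses by how well each one
    explains the latest tactile observation, then renormalize."""
    posterior = belief * likelihoods
    return posterior / posterior.sum()

# Three candidate parts under a uniform prior; placeholder likelihoods stand
# in for a learned model scoring the current tactile reading.
belief = np.full(3, 1.0 / 3.0)
tactile_likelihoods = np.array([0.7, 0.2, 0.1])
belief = update_belief(belief, tactile_likelihoods)
print(belief)  # the first candidate now dominates, guiding the next touch
```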
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision our findings pushing research toward more realistic physicality in future VR/AR.
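A minimal sketch of the kind of regression mapping such an interface implies: a small MLP from windowed EMG features to per-finger forces. The feature and output dimensions, and the architecture itself, are hypothetical; the paper's actual model and preprocessing are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: 64 features from a windowed multi-channel EMG signal,
# decoded to 5 per-finger force estimates (untrained weights, for illustration).
W1, b1 = 0.1 * rng.normal(size=(64, 32)), np.zeros(32)
W2, b2 = 0.1 * rng.normal(size=(32, 5)), np.zeros(5)

def decode_forces(emg_features: np.ndarray) -> np.ndarray:
    """One forward pass of a small MLP regressor: EMG features -> forces."""
    hidden = np.maximum(emg_features @ W1 + b1, 0.0)  # ReLU hidden layer
    return hidden @ W2 + b2

frame = rng.normal(size=64)   # stand-in for one processed EMG window
print(decode_forces(frame))   # five per-finger force estimates
```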
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, with an elastic property regulating particle deformation during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
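As a toy analogue of the particle-based elasticity idea (not the paper's actual formulation), the sketch below applies a Hookean restoring force to contact-displaced particles; the stiffness, damping, and step size are illustrative assumptions.

```python
import numpy as np

STIFFNESS = 50.0  # spring constant pulling particles back to rest (assumed)
DAMPING = 2.0     # velocity damping (assumed)
DT = 1e-3         # integration step

def elastic_step(pos, vel, rest_pos):
    """Advance particle states one semi-implicit Euler step under a linear
    elastic restoring force, so indentation relaxes after contact ends."""
    force = -STIFFNESS * (pos - rest_pos) - DAMPING * vel
    vel = vel + DT * force
    pos = pos + DT * vel
    return pos, vel

rest = np.zeros((4, 3))                    # four sensor particles at rest
pos = rest + np.array([0.0, 0.0, -0.01])   # indented 1 cm by contact
vel = np.zeros_like(pos)
for _ in range(100):
    pos, vel = elastic_step(pos, vel, rest)
print(pos[:, 2])  # depth relaxing back toward 0 as elasticity restores shape
```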
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- TACTO: A Fast, Flexible and Open-source Simulator for High-Resolution Vision-based Tactile Sensors [8.497185333795477]
TACTO is a fast, flexible and open-source simulator for vision-based tactile sensors.
It can render realistic high-resolution touch readings at hundreds of frames per second.
We demonstrate TACTO on a perceptual task by learning to predict grasp stability from touch readings across 1 million grasps.
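A condensed usage sketch following the pattern in TACTO's public examples; the constructor arguments, URDF paths, and method signatures below are from memory and should be verified against the repository.

```python
# Sketch after the examples in github.com/facebookresearch/tacto;
# verify signatures against the repository before use.
import pybullet as p
import tacto

p.connect(p.DIRECT)

# A DIGIT-style sensor rendered at 120x160 (default sensor config assumed).
digits = tacto.Sensor(width=120, height=160)

# Attach the sensor's virtual camera to a simulated sensor body.
digit_id = p.loadURDF("digit.urdf", basePosition=[0, 0, 0])  # illustrative path
digits.add_camera(digit_id, [-1])

# Register an object so its contacts are rendered as tactile images.
obj_id = p.loadURDF("sphere_small.urdf", basePosition=[0, 0, 0.05])
digits.add_object("sphere_small.urdf", obj_id)

for _ in range(100):
    p.stepSimulation()
    colors, depths = digits.render()  # per-camera high-resolution readings
```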
arXiv Detail & Related papers (2020-12-15T17:54:07Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.