Learning In-Hand Translation Using Tactile Skin With Shear and Normal Force Sensing
- URL: http://arxiv.org/abs/2407.07885v1
- Date: Wed, 10 Jul 2024 17:52:30 GMT
- Title: Learning In-Hand Translation Using Tactile Skin With Shear and Normal Force Sensing
- Authors: Jessica Yin, Haozhi Qi, Jitendra Malik, James Pikul, Mark Yim, Tess Hellebrekers
- Abstract summary: We introduce a sensor model for tactile skin that enables zero-shot sim-to-real transfer of ternary shear and binary normal forces.
We conduct extensive real-world experiments to assess how tactile sensing facilitates policy adaptation to various unseen object properties.
- Score: 43.269672740168396
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent progress in reinforcement learning (RL) and tactile sensing has significantly advanced dexterous manipulation. However, these methods often utilize simplified tactile signals due to the gap between tactile simulation and the real world. We introduce a sensor model for tactile skin that enables zero-shot sim-to-real transfer of ternary shear and binary normal forces. Using this model, we develop an RL policy that leverages sliding contact for dexterous in-hand translation. We conduct extensive real-world experiments to assess how tactile sensing facilitates policy adaptation to various unseen object properties and robot hand orientations. We demonstrate that our 3-axis tactile policies consistently outperform baselines that use only shear forces, only normal forces, or only proprioception. Website: https://jessicayin.github.io/tactile-skin-rl/
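The abstract describes a sensor model that discretizes continuous tactile readings into ternary shear and binary normal signals for sim-to-real transfer. A minimal sketch of such a per-taxel discretization is shown below; the threshold values and function name are illustrative assumptions, not taken from the paper.

```python
def discretize_taxel(force_xyz, shear_thresh=0.05, normal_thresh=0.1):
    """Map a continuous 3-axis force reading to discrete tactile signals:
    ternary shear per tangential axis (-1 / 0 / +1) and a binary
    normal-contact bit (0 / 1).

    Thresholds are hypothetical placeholders; the paper's actual
    calibration values are not given here.
    """
    fx, fy, fz = force_xyz
    # Ternary shear: sign of the tangential force, zeroed inside a deadband.
    shear = tuple(
        (1 if f > shear_thresh else -1) if abs(f) > shear_thresh else 0
        for f in (fx, fy)
    )
    # Binary normal: contact detected if normal force exceeds the threshold.
    normal = 1 if abs(fz) > normal_thresh else 0
    return shear, normal
```

In this framing, an RL policy trained in simulation only ever sees the discrete signals, so the sim-to-real gap reduces to matching the discretization rather than the full continuous force response.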
Related papers
- AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch [9.606323817785114]
We present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch.
We tackle this problem by training a dense tactile policy in simulation and present a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer.
arXiv Detail & Related papers (2024-05-12T22:51:35Z)
- Rotating without Seeing: Towards In-hand Dexterity through Touch [43.87509744768282]
We present Touch Dexterity, a new system that can perform in-hand object rotation using only touching without seeing the object.
Instead of relying on precise tactile sensing in a small region, we introduce a new system design using dense binary force sensors (touch or no touch) overlaying one side of the whole robot hand.
We train an in-hand rotation policy using reinforcement learning on diverse objects in simulation. Relying on touch-only sensing, we can directly deploy the policy on a real robot hand and rotate novel objects that were not seen during training.
arXiv Detail & Related papers (2023-03-20T05:38:30Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision our findings will push research toward more realistic physicality in future VR/AR.
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- Optical Tactile Sim-to-Real Policy Transfer via Real-to-Sim Tactile Image Translation [21.82940445333913]
We present a suite of simulated environments tailored towards tactile robotics and reinforcement learning.
A data-driven approach enables translation of the current state of a real tactile sensor to corresponding simulated depth images.
This policy is implemented within a real-time control loop on a physical robot to demonstrate zero-shot sim-to-real policy transfer.
arXiv Detail & Related papers (2021-06-16T13:58:35Z)
- Zero-shot sim-to-real transfer of tactile control policies for aggressive swing-up manipulation [5.027571997864706]
This paper shows that robots equipped with a vision-based tactile sensor can perform dynamic manipulation tasks without prior knowledge of all the physical attributes of the objects to be manipulated.
A robotic system is presented that is able to swing up poles of different masses, radii, and lengths to an angle of 180 degrees.
This is the first work where a feedback policy from high-dimensional tactile observations is used to control the swing-up manipulation of poles in closed-loop.
arXiv Detail & Related papers (2021-01-07T18:43:18Z)
- TACTO: A Fast, Flexible and Open-source Simulator for High-Resolution Vision-based Tactile Sensors [8.497185333795477]
TACTO is a fast, flexible and open-source simulator for vision-based tactile sensors.
It can render realistic high-resolution touch readings at hundreds of frames per second.
We demonstrate TACTO on a perceptual task by learning to predict grasp stability using touch from 1 million grasps.
arXiv Detail & Related papers (2020-12-15T17:54:07Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented here and is not responsible for any consequences.