AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch
- URL: http://arxiv.org/abs/2405.07391v2
- Date: Wed, 12 Jun 2024 03:25:44 GMT
- Title: AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch
- Authors: Max Yang, Chenghua Lu, Alex Church, Yijiong Lin, Chris Ford, Haoran Li, Efi Psomopoulou, David A. W. Barton, Nathan F. Lepora
- Abstract summary: We present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch.
We tackle this problem by training a dense tactile policy in simulation and present a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer.
- Score: 9.606323817785114
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Human hands are capable of in-hand manipulation in the presence of different hand motions. For a robot hand, harnessing rich tactile information to achieve this level of dexterity still remains a significant challenge. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. We tackle this problem by training a dense tactile policy in simulation and present a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer. Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction. In our experiments, we highlight the benefit of capturing detailed contact information when handling objects with varying properties. Interestingly, despite not having explicit slip detection, we found rich multi-fingered tactile sensing can implicitly detect object movement within grasp and provide a reactive behavior that improves the robustness of the policy. The project website can be found at https://maxyang27896.github.io/anyrotate/.
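To make the idea of a "dense featured" tactile policy input concrete, here is a minimal sketch of how per-fingertip contact features might be concatenated with proprioception into a single observation vector. All names, shapes, and feature choices below are hypothetical illustrations, not details from the paper.

```python
import numpy as np

N_FINGERS = 4          # hypothetical: one tactile sensor per fingertip
FEATURES_PER_TIP = 3   # hypothetical dense features: contact pose (2) + depth (1)
N_JOINTS = 16          # hypothetical joint count for a four-fingered hand

def build_observation(joint_pos, tactile):
    """Concatenate proprioception with flattened dense tactile features."""
    assert joint_pos.shape == (N_JOINTS,)
    assert tactile.shape == (N_FINGERS, FEATURES_PER_TIP)
    return np.concatenate([joint_pos, tactile.ravel()])

rng = np.random.default_rng(0)
obs = build_observation(rng.uniform(-1, 1, N_JOINTS),
                        rng.uniform(0, 1, (N_FINGERS, FEATURES_PER_TIP)))
print(obs.shape)  # (28,)
```

In a sim-to-real setup along these lines, the same observation layout would be produced both by the simulator and by the real tactile sensors, so the policy transfers without retraining.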
Related papers
- Learning In-Hand Translation Using Tactile Skin With Shear and Normal Force Sensing [43.269672740168396]
We introduce a sensor model for tactile skin that enables zero-shot sim-to-real transfer of ternary shear and binary normal forces.
We conduct extensive real-world experiments to assess how tactile sensing facilitates policy adaptation to various unseen object properties.
arXiv Detail & Related papers (2024-07-10T17:52:30Z) - Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
arXiv Detail & Related papers (2023-12-20T22:36:37Z) - General In-Hand Object Rotation with Vision and Touch [46.871539289388615]
We introduce RotateIt, a system that enables fingertip-based object rotation along multiple axes.
We distill it to operate on realistic yet noisy simulated visuotactile and proprioceptive sensory inputs.
arXiv Detail & Related papers (2023-09-18T17:59:25Z) - Rotating without Seeing: Towards In-hand Dexterity through Touch [43.87509744768282]
We present Touch Dexterity, a new system that can perform in-hand object rotation using only touch, without seeing the object.
Instead of relying on precise tactile sensing in a small region, we introduce a new system design using dense binary force sensors (touch or no touch) covering one side of the whole robot hand.
We train an in-hand rotation policy using reinforcement learning on diverse objects in simulation. Relying on touch-only sensing, we can directly deploy the policy on a real robot hand and rotate novel objects not present in training.
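The binary touch/no-touch sensing described above amounts to thresholding raw force readings. A minimal sketch (the threshold value is a hypothetical choice, not from the paper):

```python
import numpy as np

def binarize_touch(forces, threshold=0.1):
    """Map raw per-sensor force readings to binary touch (1) / no touch (0).
    The 0.1 N default threshold is an illustrative assumption."""
    return (np.asarray(forces) >= threshold).astype(int)

print(binarize_touch([0.0, 0.05, 0.3, 1.2]).tolist())  # [0, 0, 1, 1]
```

Binarizing in this way discards force magnitude, but it also sidesteps the sim-to-real gap in calibrating continuous force values, which is one motivation for touch-only designs.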
arXiv Detail & Related papers (2023-03-20T05:38:30Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - On the Feasibility of Learning Finger-gaiting In-hand Manipulation with Intrinsic Sensing [0.7373617024876725]
We use model-free reinforcement learning to learn finger-gaiting only via precision grasps.
To tackle the inherent instability of precision grasping, we propose the use of initial state distributions.
Our method can learn finger-gaiting with significantly better sample complexity than the state of the art.
arXiv Detail & Related papers (2021-09-26T23:22:29Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Physics-Based Dexterous Manipulations with Estimated Hand Poses and Residual Reinforcement Learning [52.37106940303246]
We learn a model that maps noisy input hand poses to target virtual poses.
The agent is trained in a residual setting by using a model-free hybrid RL+IL approach.
We test our framework in two applications that use hand pose estimates for dexterous manipulations: hand-object interactions in VR and hand-object motion reconstruction in-the-wild.
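The residual setting mentioned above composes a fixed base controller with a learned correction. A minimal sketch of that composition (both callables are hypothetical stand-ins, not the paper's actual controllers):

```python
import numpy as np

def residual_action(base_controller, residual_policy, obs):
    """Residual RL action: a base controller's output plus a learned
    correction. Both callables here are illustrative placeholders."""
    return base_controller(obs) + residual_policy(obs)

base = lambda obs: 0.5 * obs       # e.g. a hand-pose tracking controller
residual = lambda obs: 0.25 * obs  # e.g. a small learned RL correction
print(residual_action(base, residual, np.ones(3)).tolist())  # [0.75, 0.75, 0.75]
```

Keeping the residual small relative to the base controller is what makes this setting sample-efficient: the agent only needs to learn corrections, not the full control law.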
arXiv Detail & Related papers (2020-08-07T17:34:28Z) - Design and Control of Roller Grasper V2 for In-Hand Manipulation [6.064252790182275]
We present a novel non-anthropomorphic robot grasper with the ability to manipulate objects by means of active surfaces at the fingertips.
Active surfaces are achieved by spherical rolling fingertips with two degrees of freedom (DoF).
A further DoF is in the base of each finger, allowing the fingers to grasp objects over a range of size and shapes.
arXiv Detail & Related papers (2020-04-18T00:54:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.