Rotating without Seeing: Towards In-hand Dexterity through Touch
- URL: http://arxiv.org/abs/2303.10880v4
- Date: Mon, 27 Mar 2023 11:52:20 GMT
- Title: Rotating without Seeing: Towards In-hand Dexterity through Touch
- Authors: Zhao-Heng Yin, Binghao Huang, Yuzhe Qin, Qifeng Chen, Xiaolong Wang
- Abstract summary: We present Touch Dexterity, a new system that can perform in-hand object rotation using touch alone, without seeing the object.
Instead of relying on precise tactile sensing in a small region, we introduce a new system design using dense binary force sensors (touch or no touch) covering one side of the whole robot hand.
We train an in-hand rotation policy with Reinforcement Learning on diverse objects in simulation. Relying on touch-only sensing, we can deploy the policy directly on a real robot hand and rotate novel objects that were never seen during training.
- Score: 43.87509744768282
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile information plays a critical role in human dexterity. It reveals useful contact information that may not be inferred directly from vision. In fact, humans can even perform in-hand dexterous manipulation without using vision. Can we enable the same ability for a multi-finger robot hand? In this paper, we present Touch Dexterity, a new system that performs in-hand object rotation using touch alone, without seeing the object. Instead of relying on precise tactile sensing in a small region, we introduce a new system design using dense binary force sensors (touch or no touch) covering one side of the whole robot hand (palm, finger links, fingertips). Such a design is low-cost, provides broad coverage of the object, and minimizes the Sim2Real gap at the same time. We train an in-hand rotation policy with Reinforcement Learning on diverse objects in simulation. Relying on touch-only sensing, we can deploy the policy directly on a real robot hand and rotate novel objects that were never seen during training. Extensive ablations are performed on how tactile information helps in-hand manipulation. Our project is available at https://touchdexterity.github.io.
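The observation design described above is simple enough to sketch. Below is a minimal, hypothetical Python illustration of binarizing dense force readings into touch / no-touch signals and stacking them with proprioception as a policy input; the taxel count, threshold, and function names are assumptions for illustration, not details from the paper or its released code.

```python
# Minimal sketch (not the authors' implementation) of a binary touch
# observation: raw force readings are thresholded into touch / no-touch
# and concatenated with joint angles to form the policy input.
import numpy as np

NUM_TAXELS = 96          # assumed count of binary sensors on the hand
FORCE_THRESHOLD = 0.1    # newtons; assumed contact threshold

def touch_observation(raw_forces: np.ndarray, joint_angles: np.ndarray) -> np.ndarray:
    """Build one control step's observation.

    raw_forces:   (NUM_TAXELS,) analog readings from the sensor pads
    joint_angles: (num_joints,) proprioceptive joint positions
    """
    # Binarizing is the key design choice: touch / no-touch is cheap to
    # sense, covers the whole hand, and narrows the Sim2Real gap.
    contacts = (raw_forces > FORCE_THRESHOLD).astype(np.float32)
    return np.concatenate([contacts, joint_angles.astype(np.float32)])

# Example: a 16-joint hand with random readings.
obs = touch_observation(np.random.rand(NUM_TAXELS), np.zeros(16))
assert obs.shape == (NUM_TAXELS + 16,)
```

Because the signal is binary, the identical observation can be reproduced exactly in simulation, which is what makes the direct sim-to-real deployment described in the abstract plausible.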
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Built Different: Tactile Perception to Overcome Cross-Embodiment Capability Differences in Collaborative Manipulation [1.9048510647598207]
Tactile sensing is a powerful means of implicit communication between a human and a robot assistant.
In this paper, we investigate how tactile sensing can transcend cross-embodiment differences across robotic systems.
We show how our method can enable a cooperative task where a robot and human must work together to maneuver objects through space.
arXiv Detail & Related papers (2024-09-23T10:45:41Z)
- Learning In-Hand Translation Using Tactile Skin With Shear and Normal Force Sensing [43.269672740168396]
We introduce a sensor model for tactile skin that enables zero-shot sim-to-real transfer of ternary shear and binary normal forces (a toy sketch follows this entry).
We conduct extensive real-world experiments to assess how tactile sensing facilitates policy adaptation to various unseen object properties.
arXiv Detail & Related papers (2024-07-10T17:52:30Z)
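As a rough illustration of what a ternary-shear / binary-normal sensor model might look like, here is a hypothetical discretization of simulated contact forces; the thresholds, array shapes, and function name are assumptions, not values from the paper.

```python
# Hedged sketch: per-taxel 3D contact forces are discretized into a
# ternary shear signal per tangential axis (-1 / 0 / +1) and a binary
# normal signal, so the same coarse readings can be reproduced on real
# tactile skin.
import numpy as np

SHEAR_THRESHOLD = 0.05   # newtons; assumed deadband for shear
NORMAL_THRESHOLD = 0.1   # newtons; assumed contact threshold

def discretize_contact(force_xyz: np.ndarray) -> np.ndarray:
    """Map per-taxel contact forces (N, 3) to discrete readings (N, 3).

    Columns 0-1: ternary shear along the two tangential axes.
    Column 2:    binary normal contact.
    """
    shear = np.sign(force_xyz[:, :2]) * (np.abs(force_xyz[:, :2]) > SHEAR_THRESHOLD)
    normal = (force_xyz[:, 2:3] > NORMAL_THRESHOLD).astype(np.float32)
    return np.concatenate([shear.astype(np.float32), normal], axis=1)
```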
- AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch [9.606323817785114]
We present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch.
Our formulation allows the training of a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction.
Rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.
arXiv Detail & Related papers (2024-05-12T22:51:35Z)
- DexTouch: Learning to Seek and Manipulate Objects with Tactile Dexterity [11.450027373581019]
We introduce a multi-finger robot system designed to manipulate objects using the sense of touch, without relying on vision.
For tasks that mimic daily life, the robot uses its sense of touch to manipulate randomly placed objects in the dark.
arXiv Detail & Related papers (2024-01-23T05:37:32Z)
- Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem (a toy sketch follows this entry).
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
arXiv Detail & Related papers (2023-12-20T22:36:37Z)
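To make the "neural field plus pose optimization" idea concrete, here is a toy PyTorch sketch: a small signed-distance MLP stands in for the learned geometry, and a single-frame pose is refined so observed surface points land on the zero level set. The network shape, pose parameterization, and optimizer settings are illustrative assumptions; the paper itself optimizes a full pose graph.

```python
# Toy sketch (not the authors' code): geometry as a learned SDF, pose
# refined by minimizing the field's residual at observed surface points.
import torch

class SDFNet(torch.nn.Module):
    """Tiny MLP mapping 3D points to signed distance (illustrative)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(3, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, hidden), torch.nn.ReLU(),
            torch.nn.Linear(hidden, 1),
        )
    def forward(self, xyz):
        return self.net(xyz)

def refine_pose(sdf, points_world, steps=100, lr=1e-2):
    """Optimize a translation + axis-angle rotation so the observed
    points read ~0 signed distance (a single-frame stand-in for the
    paper's pose-graph optimization)."""
    t = torch.zeros(3, requires_grad=True)
    w = torch.zeros(3, requires_grad=True)  # axis-angle rotation
    opt = torch.optim.Adam([t, w], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Rodrigues' formula: rotate points by axis-angle w.
        theta = torch.sqrt((w * w).sum() + 1e-12)
        k = w / theta
        cos, sin = torch.cos(theta), torch.sin(theta)
        p = points_world
        p_rot = (p * cos
                 + torch.cross(k.expand_as(p), p, dim=-1) * sin
                 + k * (p @ k)[:, None] * (1 - cos))
        residual = sdf(p_rot + t)        # signed distance at posed points
        loss = residual.pow(2).mean()    # surface points should read ~0
        loss.backward()
        opt.step()
    return t.detach(), w.detach()

# Example: refine a pose against an (untrained, illustrative) field.
sdf = SDFNet()
pts = torch.randn(64, 3)   # stand-in for observed surface points
t, w = refine_pose(sdf, pts, steps=10)
```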
- Robot Synesthesia: In-Hand Manipulation with Visuotactile Sensing [15.970078821894758]
We introduce a system that leverages visual and tactile sensory inputs to enable dexterous in-hand manipulation.
Robot Synesthesia is a novel point cloud-based tactile representation inspired by human tactile-visual synesthesia (a toy sketch follows this entry).
arXiv Detail & Related papers (2023-12-04T12:35:43Z)
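A hypothetical sketch of the point-cloud tactile representation idea: active taxels are lifted to 3D via forward kinematics and merged with the visual cloud, tagged by modality so a single point-cloud network can consume both. The helper names and the modality channel are assumptions for illustration, not the authors' API.

```python
# Hedged sketch: tactile contacts become 3D points alongside vision.
import numpy as np

def tactile_point_cloud(contacts: np.ndarray,
                        taxel_positions_world: np.ndarray) -> np.ndarray:
    """Return 3D positions of taxels currently in contact.

    contacts:              (N,) binary touch readings
    taxel_positions_world: (N, 3) taxel positions from forward kinematics
    """
    return taxel_positions_world[contacts > 0.5]

def fused_observation(visual_points: np.ndarray,
                      tactile_points: np.ndarray) -> np.ndarray:
    """Concatenate both clouds, tagging each point with its modality
    (0 = vision, 1 = touch) as a fourth channel."""
    vis = np.concatenate([visual_points, np.zeros((len(visual_points), 1))], axis=1)
    tac = np.concatenate([tactile_points, np.ones((len(tactile_points), 1))], axis=1)
    return np.concatenate([vis, tac], axis=0)
```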
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- In-Hand Object Rotation via Rapid Motor Adaptation [59.59946962428837]
We show how to design and learn a simple adaptive controller to achieve in-hand object rotation using only fingertips.
The controller is trained entirely in simulation on only cylindrical objects.
It can be deployed directly on a real robot hand to rotate dozens of objects of diverse sizes, shapes, and weights about the z-axis (a sketch of the adaptation pattern follows this entry).
arXiv Detail & Related papers (2022-10-10T17:58:45Z)
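For intuition, here is a hedged PyTorch sketch of the rapid-motor-adaptation pattern this entry refers to: a base policy conditioned on a latent "extrinsics" vector, plus an adaptation module that regresses that latent from recent proprioceptive history at deployment time. Layer sizes, dimensions, and history length are illustrative assumptions.

```python
# Hedged sketch of the two-network adaptation recipe: the base policy
# is trained in simulation with privileged extrinsics; the adaptation
# module learns to recover them from observation-action history only.
import torch

OBS_DIM, ACT_DIM, LATENT_DIM, HISTORY = 16, 16, 8, 30

base_policy = torch.nn.Sequential(            # pi(obs, z) -> action
    torch.nn.Linear(OBS_DIM + LATENT_DIM, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, ACT_DIM),
)
adaptation_module = torch.nn.Sequential(      # phi(history) -> z_hat
    torch.nn.Linear(HISTORY * (OBS_DIM + ACT_DIM), 128), torch.nn.ReLU(),
    torch.nn.Linear(128, LATENT_DIM),
)

def act(obs: torch.Tensor, history: torch.Tensor) -> torch.Tensor:
    """Deployment step: estimate extrinsics from history, then act.

    obs:     (OBS_DIM,) current proprioception
    history: (HISTORY, OBS_DIM + ACT_DIM) recent obs-action pairs
    """
    z_hat = adaptation_module(history.flatten())
    return base_policy(torch.cat([obs, z_hat]))

# Example step with placeholder tensors.
action = act(torch.zeros(OBS_DIM), torch.zeros(HISTORY, OBS_DIM + ACT_DIM))
```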
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.