Imagine2touch: Predictive Tactile Sensing for Robotic Manipulation using Efficient Low-Dimensional Signals
- URL: http://arxiv.org/abs/2405.01192v1
- Date: Thu, 2 May 2024 11:33:54 GMT
- Title: Imagine2touch: Predictive Tactile Sensing for Robotic Manipulation using Efficient Low-Dimensional Signals
- Authors: Abdallah Ayad, Adrian Röfer, Nick Heppert, Abhinav Valada
- Abstract summary: Imagine2touch aims to predict the expected touch signal based on a visual patch representing the area to be touched.
We use ReSkin, an inexpensive and compact touch sensor, to collect the required dataset.
Imagine2touch achieves an object recognition accuracy of 58% after ten touches per object, surpassing a proprioception baseline.
- Score: 9.202784204187878
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Humans seemingly incorporate potential touch signals in their perception. Our goal is to equip robots with a similar capability, which we term Imagine2touch. Imagine2touch aims to predict the expected touch signal based on a visual patch representing the area to be touched. We use ReSkin, an inexpensive and compact touch sensor, to collect the required dataset through random touching of five basic geometric shapes and one tool. We train Imagine2touch on two of those shapes and validate it on the out-of-distribution tool. We demonstrate the efficacy of Imagine2touch through its application to the downstream task of object recognition. In this task, we evaluate Imagine2touch's performance in two experiments, together comprising five out-of-training-distribution objects. Imagine2touch achieves an object recognition accuracy of 58% after ten touches per object, surpassing a proprioception baseline.
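The core idea of "imagining" touch, predicting a low-dimensional tactile signal from a visual patch of the contact area, can be sketched as a simple regression. The sketch below is a hypothetical stand-in (a ridge-regularized linear map fitted on synthetic data with numpy), not the paper's actual architecture; the patch and signal dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a flattened 16x16 grayscale visual patch and
# a 5-dimensional tactile reading (ReSkin-like magnetometer values).
PATCH_DIM, TOUCH_DIM = 16 * 16, 5

# Synthetic stand-in data for aligned (visual patch, touch signal) pairs.
n = 500
X = rng.normal(size=(n, PATCH_DIM))                       # visual patches
W_true = rng.normal(size=(PATCH_DIM, TOUCH_DIM))
Y = X @ W_true + 0.01 * rng.normal(size=(n, TOUCH_DIM))   # touch signals

# Fit a ridge-regularized linear "imagine touch" map: visual patch -> touch.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(PATCH_DIM), X.T @ Y)

def imagine_touch(patch: np.ndarray) -> np.ndarray:
    """Predict the expected low-dimensional touch signal for a visual patch."""
    return patch @ W

pred = imagine_touch(X[:10])
print(pred.shape)  # (10, 5)
```

In the paper's downstream task, predictions like these are compared against measured signals over repeated touches to recognize objects; here the linear map merely illustrates the input/output contract.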
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z) - PseudoTouch: Efficiently Imaging the Surface Feel of Objects for Robotic Manipulation [8.997347199266592]
Our goal is to equip robots with a similar capability, which we term PseudoTouch.
We frame this problem as the task of learning a low-dimensional visual-tactile embedding.
Using ReSkin, we collect and train PseudoTouch on a dataset comprising aligned tactile and visual data pairs.
We demonstrate the efficacy of PseudoTouch through its application to two downstream tasks: object recognition and grasp stability prediction.
arXiv Detail & Related papers (2024-03-22T10:51:31Z) - Binding Touch to Everything: Learning Unified Multimodal Tactile Representations [29.76008953177392]
We introduce UniTouch, a unified model for vision-based touch sensors connected to multiple modalities.
We achieve this by aligning our UniTouch embeddings to pretrained image embeddings already associated with a variety of other modalities.
We further propose learnable sensor-specific tokens, allowing the model to learn from a set of heterogeneous tactile sensors.
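The alignment idea, projecting tactile features into a frozen, pretrained image-embedding space so touch inherits that space's links to other modalities, can be sketched as below. This is a toy stand-in using a least-squares linear projection on synthetic pairs; the dimensions and the closed-form fit are assumptions, not the actual UniTouch training setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions: raw tactile features and a frozen, pretrained
# image-embedding space (e.g. a CLIP-like space).
TACTILE_DIM, EMBED_DIM = 64, 32

# Synthetic aligned pairs: tactile features and the image embeddings of
# the co-captured visual frames.
n = 300
tactile = rng.normal(size=(n, TACTILE_DIM))
A_true = rng.normal(size=(TACTILE_DIM, EMBED_DIM))
image_emb = tactile @ A_true

# Align by fitting a linear projection into the frozen image space
# (least squares stands in for the paper's learned alignment objective).
P, *_ = np.linalg.lstsq(tactile, image_emb, rcond=None)

def touch_embedding(feat: np.ndarray) -> np.ndarray:
    """Project tactile features into the pretrained image-embedding space."""
    z = feat @ P
    return z / np.linalg.norm(z, axis=-1, keepdims=True)  # unit-normalize

z = touch_embedding(tactile[:4])
print(z.shape)  # (4, 32)
```

Because the target space is already tied to other modalities, a projection like this is what lets a touch embedding be compared against, say, text or audio embeddings without paired touch data for those modalities.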
arXiv Detail & Related papers (2024-01-31T18:59:57Z) - Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
arXiv Detail & Related papers (2023-12-20T22:36:37Z) - Rotating without Seeing: Towards In-hand Dexterity through Touch [43.87509744768282]
We present Touch Dexterity, a new system that can perform in-hand object rotation using only touching without seeing the object.
Instead of relying on precise tactile sensing in a small region, we introduce a new system design using dense binary force sensors (touch or no touch) overlaying one side of the whole robot hand.
We train an in-hand rotation policy using reinforcement learning on diverse objects in simulation. Relying on touch-only sensing, we can directly deploy the policy on a real robot hand and rotate novel objects that are not present in training.
arXiv Detail & Related papers (2023-03-20T05:38:30Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Touch and Go: Learning from Human-Collected Vision and Touch [16.139106833276]
We propose a dataset with paired visual and tactile data called Touch and Go.
Human data collectors probe objects in natural environments using tactile sensors.
Our dataset spans a large number of "in the wild" objects and scenes.
arXiv Detail & Related papers (2022-11-22T18:59:32Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
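The particle-based idea can be sketched with a toy model: particles representing the sensor surface are pushed by a contact force while an elastic term restores them toward their rest positions. The sketch below assumes a 1-D chain with a Hookean restoring force and illustrative constants; it is a stand-in for EIP's elastic regulation of deformation, not the paper's actual simulator.

```python
import numpy as np

# Toy elastic particle chain: particles are pushed by a localized contact
# force and pulled back toward rest positions by a linear elastic term.
# All constants are illustrative assumptions, not values from the paper.
N_PARTICLES = 32
STIFFNESS = 5.0     # elastic restoring coefficient
DT = 0.01           # integration step

rest = np.linspace(0.0, 1.0, N_PARTICLES)   # rest positions
pos = rest.copy()
vel = np.zeros(N_PARTICLES)

def contact_force(x: np.ndarray) -> np.ndarray:
    """Localized 'indenter' pressing on the middle of the sensor surface."""
    return 2.0 * np.exp(-((x - 0.5) ** 2) / 0.01)

# Damped semi-implicit Euler integration until the deformation settles.
for _ in range(2000):
    force = contact_force(rest) - STIFFNESS * (pos - rest)
    vel = 0.9 * (vel + DT * force)
    pos = pos + DT * vel

deformation = pos - rest
print(round(float(deformation.max()), 3))
```

At equilibrium the deformation profile mirrors the contact-force profile scaled by the stiffness, which is the sense in which the elastic property "regulates" deformation during contact.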
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - PyTouch: A Machine Learning Library for Touch Processing [68.32055581488557]
We present PyTouch, the first machine learning library dedicated to the processing of touch sensing signals.
PyTouch is designed to be modular and easy to use, and provides state-of-the-art touch-processing capabilities as a service.
We evaluate PyTouch on real-world data from several tactile sensors on touch-processing tasks such as touch detection, slip detection, and object pose estimation.
arXiv Detail & Related papers (2021-05-26T18:55:18Z) - OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have a small sensitive field, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.