Transferring Tactile Data Across Sensors
- URL: http://arxiv.org/abs/2410.14310v1
- Date: Fri, 18 Oct 2024 09:15:47 GMT
- Title: Transferring Tactile Data Across Sensors
- Authors: Wadhah Zai El Amri, Malte Kuhlmann, Nicolás Navarro-Guerrero
- Abstract summary: This article introduces a novel method for translating data between tactile sensors.
We demonstrate the approach by translating BioTac signals into the DIGIT sensor.
Our framework consists of three steps: first, converting signal data into corresponding 3D deformation meshes; second, translating these 3D deformation meshes from one sensor to another; and third, generating output images.
- Score: 1.5566524830295307
- Abstract: Tactile perception is essential for human interaction with the environment and is becoming increasingly crucial in robotics. Tactile sensors like the BioTac mimic human fingertips and provide detailed interaction data. Despite its utility in applications like slip detection and object identification, this sensor is now deprecated, making many existing datasets obsolete. This article introduces a novel method for translating data between tactile sensors by exploiting sensor deformation information rather than output signals. We demonstrate the approach by translating BioTac signals into the DIGIT sensor. Our framework consists of three steps: first, converting signal data into corresponding 3D deformation meshes; second, translating these 3D deformation meshes from one sensor to another; and third, generating output images using the converted meshes. Our approach enables the continued use of valuable datasets.
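As a rough illustration of how such a three-step pipeline could be wired together, the sketch below chains hypothetical signal-to-mesh, mesh-to-mesh, and mesh-to-image modules. The class names, dimensions, and plain MLP architectures are assumptions made for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of the three-step signal-to-image translation pipeline.
# Module names, dimensions, and MLP architectures are illustrative only.
import torch
import torch.nn as nn

BIOTAC_DIM = 19          # number of input signal channels (illustrative)
MESH_VERTS = 1000        # vertices in the deformation mesh (illustrative)
DIGIT_IMG = (3, 64, 64)  # DIGIT-style RGB output resolution (illustrative)

class SignalToMesh(nn.Module):
    """Step 1: map raw sensor signals to 3D vertex displacements."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(BIOTAC_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, MESH_VERTS * 3))
    def forward(self, signal):
        return self.net(signal).view(-1, MESH_VERTS, 3)

class MeshToMesh(nn.Module):
    """Step 2: translate a source-sensor deformation mesh into the target-sensor mesh."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(MESH_VERTS * 3, 512), nn.ReLU(),
                                 nn.Linear(512, MESH_VERTS * 3))
    def forward(self, mesh):
        flat = mesh.view(mesh.size(0), -1)
        return self.net(flat).view(-1, MESH_VERTS, 3)

class MeshToImage(nn.Module):
    """Step 3: render the target-sensor mesh as a tactile image."""
    def __init__(self):
        super().__init__()
        c, h, w = DIGIT_IMG
        self.net = nn.Sequential(nn.Linear(MESH_VERTS * 3, 1024), nn.ReLU(),
                                 nn.Linear(1024, c * h * w), nn.Sigmoid())
    def forward(self, mesh):
        c, h, w = DIGIT_IMG
        return self.net(mesh.view(mesh.size(0), -1)).view(-1, c, h, w)

signal = torch.randn(1, BIOTAC_DIM)  # one simulated BioTac reading
image = MeshToImage()(MeshToMesh()(SignalToMesh()(signal)))
print(image.shape)  # torch.Size([1, 3, 64, 64])
```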
Related papers
- ACROSS: A Deformation-Based Cross-Modal Representation for Robotic Tactile Perception [1.5566524830295307]
ACROSS is a framework for translating data between tactile sensors by exploiting sensor deformation information.
We demonstrate our approach on the most challenging problem of going from a low-dimensional tactile representation to a high-dimensional one.
arXiv Detail & Related papers (2024-11-13T11:29:14Z)
- Transferable Tactile Transformers for Representation Learning Across Diverse Sensors and Tasks [6.742250322226066]
T3 is a framework for tactile representation learning that scales across multiple sensors and tasks.
T3 pre-trained with FoTa achieves zero-shot transferability in certain sensor-task pairings.
T3 is also effective as a tactile encoder for long-horizon, contact-rich manipulation.
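As a rough illustration of what a multi-sensor, multi-task tactile encoder implies architecturally, the sketch below routes each sensor through its own small encoder into one shared trunk with per-task heads. All names, layer sizes, and the toy pooling are assumptions for illustration, not the T3 codebase.

```python
# Illustrative shared-trunk design for multi-sensor, multi-task tactile
# representation learning; names and sizes are assumptions, not the T3 code.
import torch
import torch.nn as nn

EMBED = 128

class TactileTrunk(nn.Module):
    def __init__(self, sensor_dims, task_dims):
        super().__init__()
        # One small encoder per sensor projects its reading into a shared space.
        self.encoders = nn.ModuleDict(
            {name: nn.Linear(dim, EMBED) for name, dim in sensor_dims.items()})
        # The trunk is shared by every sensor and every task.
        layer = nn.TransformerEncoderLayer(d_model=EMBED, nhead=4, batch_first=True)
        self.trunk = nn.TransformerEncoder(layer, num_layers=2)
        # One lightweight head per downstream task.
        self.heads = nn.ModuleDict(
            {name: nn.Linear(EMBED, dim) for name, dim in task_dims.items()})

    def forward(self, x, sensor, task):
        tokens = self.encoders[sensor](x).unsqueeze(1)  # (B, 1, EMBED)
        feat = self.trunk(tokens).mean(dim=1)           # pooled representation
        return self.heads[task](feat)

model = TactileTrunk(sensor_dims={"digit": 3 * 64 * 64, "biotac": 19},
                     task_dims={"slip": 2, "pose": 6})
out = model(torch.randn(4, 19), sensor="biotac", task="slip")
print(out.shape)  # torch.Size([4, 2])
```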
arXiv Detail & Related papers (2024-06-19T15:39:27Z)
- UniTR: A Unified and Efficient Multi-Modal Transformer for Bird's-Eye-View Representation [113.35352122662752]
We present an efficient multi-modal backbone for outdoor 3D perception named UniTR.
UniTR processes a variety of modalities with unified modeling and shared parameters.
UniTR is also a fundamentally task-agnostic backbone that naturally supports different 3D perception tasks.
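A minimal sketch, under assumed token dimensions, of what unified modeling with shared parameters can look like: tokens from different modalities are projected into a common space and processed by a single shared transformer. This illustrates the general idea only, not the UniTR architecture.

```python
# Illustrative modality-agnostic backbone with shared transformer weights;
# dimensions and names are assumptions, not the UniTR implementation.
import torch
import torch.nn as nn

EMBED = 96

class UnifiedBackbone(nn.Module):
    def __init__(self, cam_dim, lidar_dim):
        super().__init__()
        self.cam_proj = nn.Linear(cam_dim, EMBED)      # image patch tokens
        self.lidar_proj = nn.Linear(lidar_dim, EMBED)  # voxel/pillar tokens
        layer = nn.TransformerEncoderLayer(d_model=EMBED, nhead=4, batch_first=True)
        self.shared = nn.TransformerEncoder(layer, num_layers=2)  # shared weights

    def forward(self, cam_tokens, lidar_tokens):
        # Concatenate tokens from both modalities and process them jointly,
        # so the same parameters serve every modality and downstream task.
        tokens = torch.cat([self.cam_proj(cam_tokens),
                            self.lidar_proj(lidar_tokens)], dim=1)
        return self.shared(tokens)

backbone = UnifiedBackbone(cam_dim=192, lidar_dim=64)
fused = backbone(torch.randn(2, 100, 192), torch.randn(2, 300, 64))
print(fused.shape)  # torch.Size([2, 400, 96])
```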
arXiv Detail & Related papers (2023-08-15T12:13:44Z)
- On the Importance of Accurate Geometry Data for Dense 3D Vision Tasks [61.74608497496841]
Training on inaccurate or corrupt data induces model bias and hampers generalisation capabilities.
This paper investigates the effect of sensor errors for the dense 3D vision tasks of depth estimation and reconstruction.
arXiv Detail & Related papers (2023-03-26T22:32:44Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Transformer-Based Sensor Fusion for Autonomous Driving: A Survey [0.0]
A Transformer-based detection head combined with a CNN-based feature encoder for extracting features from raw sensor data has emerged as one of the best-performing sensor-fusion 3D detection frameworks.
We briefly review the basics of Vision Transformers (ViT) so that readers can easily follow the paper.
We conclude with a summary of sensor-fusion trends to follow and directions to stimulate future research.
arXiv Detail & Related papers (2023-02-22T16:28:20Z)
- Learning Online Multi-Sensor Depth Fusion [100.84519175539378]
SenFuNet is a depth fusion approach that learns sensor-specific noise and outlier statistics.
We conduct experiments with various sensor combinations on the real-world CoRBS and Scene3D datasets.
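A minimal sketch of confidence-weighted depth fusion in the spirit of learning per-sensor noise behaviour: each sensor's depth map gets a learned per-pixel confidence, and a softmax over the confidences yields the fusion weights. The layer sizes and names are assumptions, not the SenFuNet implementation.

```python
# Illustrative confidence-weighted fusion of two depth maps; not SenFuNet code.
import torch
import torch.nn as nn

class DepthFusion(nn.Module):
    def __init__(self):
        super().__init__()
        # One tiny conv net per sensor predicts a per-pixel confidence map.
        self.conf_a = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                                    nn.Conv2d(8, 1, 3, padding=1))
        self.conf_b = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                                    nn.Conv2d(8, 1, 3, padding=1))

    def forward(self, depth_a, depth_b):
        # Softmax over the two confidence maps gives per-pixel fusion weights.
        logits = torch.cat([self.conf_a(depth_a), self.conf_b(depth_b)], dim=1)
        w = torch.softmax(logits, dim=1)
        return w[:, :1] * depth_a + w[:, 1:] * depth_b

fuser = DepthFusion()
fused = fuser(torch.rand(1, 1, 48, 64), torch.rand(1, 1, 48, 64))
print(fused.shape)  # torch.Size([1, 1, 48, 64])
```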
arXiv Detail & Related papers (2022-04-07T10:45:32Z)
- Bayesian Imitation Learning for End-to-End Mobile Manipulation [80.47771322489422]
Augmenting policies with additional sensor inputs, such as RGB + depth cameras, is a straightforward approach to improving robot perception capabilities.
We show that using the Variational Information Bottleneck to regularize convolutional neural networks improves generalization to held-out domains.
We demonstrate that our method is able to help close the sim-to-real gap and successfully fuse RGB and depth modalities.
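For readers unfamiliar with the technique, the sketch below shows a generic Variational Information Bottleneck head: features are compressed into a stochastic latent whose KL divergence from a standard normal is added to the task loss. Layer sizes and names are hypothetical; this is not the paper's network.

```python
# Generic Variational Information Bottleneck head; illustrative sizes only.
import torch
import torch.nn as nn

class VIBHead(nn.Module):
    def __init__(self, in_dim, bottleneck, out_dim):
        super().__init__()
        self.mu = nn.Linear(in_dim, bottleneck)
        self.logvar = nn.Linear(in_dim, bottleneck)
        self.policy = nn.Linear(bottleneck, out_dim)

    def forward(self, features):
        mu, logvar = self.mu(features), self.logvar(features)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        # KL(q(z|x) || N(0, I)) penalizes information flowing through z.
        kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(dim=1).mean()
        return self.policy(z), kl

head = VIBHead(in_dim=256, bottleneck=32, out_dim=7)  # e.g. a 7-DoF action
action, kl = head(torch.randn(8, 256))
# The training objective would be task_loss + beta * kl, with beta trading off
# compression of the representation against task performance.
print(action.shape, float(kl))
```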
arXiv Detail & Related papers (2022-02-15T17:38:30Z)
- WaveGlove: Transformer-based hand gesture recognition using multiple inertial sensors [0.0]
Interest in Hand Gesture Recognition (HGR) based on inertial data has grown considerably in recent years.
In this work we explore the benefits of using multiple inertial sensors.
arXiv Detail & Related papers (2021-05-04T20:50:53Z)
- Proximity Sensing: Modeling and Understanding Noisy RSSI-BLE Signals and Other Mobile Sensor Data for Digital Contact Tracing [12.070047847431884]
Social distancing via efficient contact tracing has emerged as the primary health strategy to dampen the spread of COVID-19.
We present a novel system to estimate pair-wise individual proximity, via a joint model of Bluetooth Low Energy (BLE) signals with other on-device sensors.
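For intuition, a common starting point for mapping BLE RSSI to distance is the log-distance path-loss model sketched below. The constants are illustrative, and the paper's estimator jointly models noisy RSSI with other on-device sensors rather than relying on this closed form.

```python
# Toy log-distance path-loss model; constants are illustrative assumptions.
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Invert RSSI = tx_power - 10 * n * log10(d) for the distance d in metres."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

for rssi in (-59, -70, -80):
    print(rssi, "dBm ->", round(rssi_to_distance(rssi), 2), "m")
```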
arXiv Detail & Related papers (2020-09-04T03:01:52Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.