Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation
- URL: http://arxiv.org/abs/2312.01236v1
- Date: Sat, 2 Dec 2023 22:01:49 GMT
- Title: Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation
- Authors: Niklas Funk, Erik Helmut, Georgia Chalvatzaki, Roberto Calandra, Jan
Peters
- Abstract summary: Evetac is an event-based optical tactile sensor.
We develop touch processing algorithms to process its measurements online at 1000 Hz.
Evetac's output and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models.
- Score: 21.94875601256614
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Optical tactile sensors have recently become popular. They provide
high spatial resolution but struggle to offer fine temporal resolution. To
overcome this shortcoming, we study the idea of replacing the RGB camera with
an event-based camera and introduce a new event-based optical tactile sensor
called Evetac. Along with hardware design, we develop touch processing
algorithms to process its measurements online at 1000 Hz. We devise an
efficient algorithm to track the elastomer's deformation through the imprinted
markers despite the sensor's sparse output. Benchmarking experiments
demonstrate Evetac's capabilities of sensing vibrations up to 498 Hz,
reconstructing shear forces, and significantly reducing data rates compared to
RGB optical tactile sensors. Moreover, Evetac's output and the marker tracking
provide meaningful features for learning data-driven slip detection and
prediction models. The learned models form the basis for a robust and adaptive
closed-loop grasp controller capable of handling a wide range of objects. We
believe that fast and efficient event-based tactile sensors like Evetac will be
essential for bringing human-like manipulation capabilities to robotics. The
sensor design is open-sourced at https://sites.google.com/view/evetac .
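The marker-tracking step described in the abstract — following the elastomer's imprinted markers from a sparse event stream — can be illustrated with a simple nearest-neighbour update over one batch of events. This is a minimal sketch under assumed data layouts, not Evetac's actual algorithm; the function name and parameters are hypothetical:

```python
import math

def update_markers(markers, events, radius=5.0, alpha=0.1):
    """Toy nearest-neighbour marker tracking over one event batch.

    markers: list of (x, y) current marker centroid estimates.
    events:  list of (x, y) event pixel coordinates from one time window.
    Each event is assigned to the closest marker within `radius`; each
    matched marker then moves toward the mean of its assigned events.
    """
    new = [list(m) for m in markers]
    assigned = [[] for _ in markers]
    # Assign every event to its nearest marker, if one is close enough.
    for ex, ey in events:
        best, best_d = None, radius
        for i, (mx, my) in enumerate(markers):
            d = math.hypot(ex - mx, ey - my)
            if d <= best_d:
                best, best_d = i, d
        if best is not None:
            assigned[best].append((ex, ey))
    # Smoothly move each matched marker toward its events' centroid.
    for i, evs in enumerate(assigned):
        if evs:
            cx = sum(e[0] for e in evs) / len(evs)
            cy = sum(e[1] for e in evs) / len(evs)
            new[i][0] += alpha * (cx - new[i][0])
            new[i][1] += alpha * (cy - new[i][1])
    return new
```

Because only pixels that change fire events, each batch is small, which is what makes per-millisecond updates like this cheap compared to processing full RGB frames.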
Related papers
- FeelAnyForce: Estimating Contact Force Feedback from Tactile Sensation for Vision-Based Tactile Sensors [18.88211706267447]
We tackle the problem of estimating 3D contact forces using vision-based tactile sensors.
Our goal is to estimate contact forces over a large range (up to 15 N) on any objects while generalizing across different vision-based tactile sensors.
arXiv Detail & Related papers (2024-10-02T21:28:19Z)
- Multi-Modal Neural Radiance Field for Monocular Dense SLAM with a Light-Weight ToF Sensor [58.305341034419136]
We present the first dense SLAM system with a monocular camera and a light-weight ToF sensor.
We propose a multi-modal implicit scene representation that supports rendering both the signals from the RGB camera and light-weight ToF sensor.
Experiments demonstrate that our system well exploits the signals of light-weight ToF sensors and achieves competitive results.
arXiv Detail & Related papers (2023-08-28T07:56:13Z)
- PixelRNN: In-pixel Recurrent Neural Networks for End-to-end-optimized Perception with Neural Sensors [42.18718773182277]
Conventional image sensors digitize high-resolution images at fast frame rates, producing a large amount of data that needs to be transmitted off the sensor for further processing.
We develop an efficient recurrent neural network architecture, PixelRNN, that encodes spatio-temporal features on the sensor using purely binary operations.
PixelRNN reduces the amount of data to be transmitted off the sensor by a factor of 64x compared to conventional systems, while offering competitive accuracy for hand gesture recognition and lip reading tasks.
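The PixelRNN summary hinges on a recurrence built from purely binary operations; a common way to realise a binary "dot product" is XNOR followed by a popcount. The following toy step illustrates that idea only — it is not the paper's architecture, bit-vectors are assumed packed into Python ints, and all names are hypothetical:

```python
def bin_dot(a, b, n):
    """XNOR-popcount similarity of two n-bit vectors stored as ints:
    counts positions where a and b agree (both 0 or both 1)."""
    xnor = ~(a ^ b) & ((1 << n) - 1)
    return bin(xnor).count("1")

def binary_rnn_step(h, x, w_h, w_x, n, thresh):
    """One recurrent step using only bitwise ops and popcounts.
    Hidden bit i fires when its agreement with the state and the
    input (weight rows w_h[i], w_x[i]) reaches `thresh`."""
    new_h = 0
    for i in range(len(w_h)):
        score = bin_dot(h, w_h[i], n) + bin_dot(x, w_x[i], n)
        if score >= thresh:
            new_h |= 1 << i
    return new_h
```

Keeping state and weights binary is what allows such an update to run inside the pixel array with minimal hardware, so only the compact hidden state ever leaves the sensor.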
arXiv Detail & Related papers (2023-04-11T18:16:47Z)
- Object Motion Sensitivity: A Bio-inspired Solution to the Ego-motion Problem for Event-based Cameras [0.0]
We highlight the capability of the second generation of neuromorphic image sensors, Integrated Retinal Functionality in CMOS Image Sensors (IRIS).
IRIS aims to mimic full retinal computations from photoreceptors to output of the retina for targeted feature-extraction.
Our results show that OMS can accomplish standard computer vision tasks with similar efficiency to conventional RGB and DVS solutions but offers drastic bandwidth reduction.
arXiv Detail & Related papers (2023-03-24T16:22:06Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for a lot of dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Bayesian Imitation Learning for End-to-End Mobile Manipulation [80.47771322489422]
Augmenting policies with additional sensor inputs, such as RGB + depth cameras, is a straightforward approach to improving robot perception capabilities.
We show that using the Variational Information Bottleneck to regularize convolutional neural networks improves generalization to held-out domains.
We demonstrate that our method is able to help close the sim-to-real gap and successfully fuse RGB and depth modalities.
arXiv Detail & Related papers (2022-02-15T17:38:30Z)
- Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline to identify whether a recalibration of the camera's intrinsic parameters is required or not.
arXiv Detail & Related papers (2020-05-24T10:32:49Z)
- Deep Soft Procrustes for Markerless Volumetric Sensor Alignment [81.13055566952221]
In this work, we improve markerless data-driven correspondence estimation to achieve more robust multi-sensor spatial alignment.
We incorporate geometric constraints in an end-to-end manner into a typical segmentation based model and bridge the intermediate dense classification task with the targeted pose estimation one.
Our model is experimentally shown to achieve similar results with marker-based methods and outperform the markerless ones, while also being robust to the pose variations of the calibration structure.
arXiv Detail & Related papers (2020-03-23T10:51:32Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.