Tactile Image-to-Image Disentanglement of Contact Geometry from
Motion-Induced Shear
- URL: http://arxiv.org/abs/2109.03615v1
- Date: Wed, 8 Sep 2021 13:03:08 GMT
- Title: Tactile Image-to-Image Disentanglement of Contact Geometry from
Motion-Induced Shear
- Authors: Anupam K. Gupta, Laurence Aitchison, Nathan F. Lepora
- Abstract summary: Robotic touch, particularly when using soft optical tactile sensors, suffers from distortion caused by motion-dependent shear.
We propose a supervised convolutional deep neural network model that learns to disentangle, in the latent space, the components of sensor deformations caused by contact geometry from those due to sliding-induced shear.
- Score: 30.404840177562754
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Robotic touch, particularly when using soft optical tactile sensors, suffers
from distortion caused by motion-dependent shear. The manner in which the
sensor contacts a stimulus is entangled with the tactile information about the
geometry of the stimulus. In this work, we propose a supervised convolutional
deep neural network model that learns to disentangle, in the latent space, the
components of sensor deformations caused by contact geometry from those due to
sliding-induced shear. The approach is validated by reconstructing unsheared
tactile images from sheared images and showing they match unsheared tactile
images collected with no sliding motion. In addition, the unsheared tactile
images give a faithful reconstruction of the contact geometry that is not
possible from the sheared data, and robust estimation of the contact pose that
can be used for servo-controlled sliding around various 2D shapes. Finally, the
contact geometry reconstruction, in conjunction with servo-controlled sliding,
was used for faithful full-object reconstruction of various 2D shapes. The methods
have broad applicability to deep learning models for robots with a
shear-sensitive sense of touch.
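The core idea of the abstract can be illustrated with a minimal sketch: a latent code split into a geometry part and a shear part, where only the geometry part is used to reconstruct the unsheared tactile image. This is a hedged, simplified illustration, not the authors' model: a linear encoder/decoder stands in for their convolutional network, and all shapes, weights, and names are assumptions.

```python
# Minimal sketch of supervised latent disentanglement (assumed, simplified):
# the latent z is partitioned into z_geom (contact geometry) and z_shear
# (motion-induced shear); only z_geom feeds the decoder that reconstructs
# the unsheared image. A linear map replaces the paper's CNN.
import numpy as np

rng = np.random.default_rng(0)
D, K = 64, 8                                   # flattened image size, latent size per factor

W_enc = rng.normal(scale=0.1, size=(2 * K, D))  # encoder weights (assumed)
W_dec = rng.normal(scale=0.1, size=(D, K))      # decoder reads z_geom only

def encode(x):
    z = W_enc @ x
    return z[:K], z[K:]                        # (z_geom, z_shear)

def decode_unsheared(z_geom):
    return W_dec @ z_geom                      # contact-only reconstruction

# Supervised pairing: a sheared input and its unsheared target image.
x_sheared = rng.normal(size=D)
x_unsheared_target = rng.normal(size=D)

z_geom, z_shear = encode(x_sheared)
recon = decode_unsheared(z_geom)
loss = float(np.mean((recon - x_unsheared_target) ** 2))  # reconstruction loss
```

Training such a model on (sheared, unsheared) pairs pressures the encoder to route geometry information into `z_geom`, since the shear half of the latent never reaches the decoder.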
Related papers
- TouchSDF: A DeepSDF Approach for 3D Shape Reconstruction using
Vision-Based Tactile Sensing [29.691786688595762]
Humans rely on their visual and tactile senses to develop a comprehensive 3D understanding of their physical environment.
We propose TouchSDF, a Deep Learning approach for tactile 3D shape reconstruction.
Our technique consists of two components: (1) a Convolutional Neural Network that maps tactile images into local meshes representing the surface at the touch location, and (2) an implicit neural function that predicts a signed distance function to extract the desired 3D shape.
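The second component rests on the signed-distance idea: an implicit function maps 3D points to signed distance, and the shape is its zero level set. The following is a loose sketch of that idea only, not TouchSDF's learned network; a hand-written sphere SDF stands in for the learned function, and all names and thresholds are assumptions.

```python
# Signed distance function (SDF) sketch: negative inside the shape, zero on
# the surface, positive outside. The surface is recovered as the set of
# sample points near the zero level set.
import numpy as np

def sdf_sphere(points, center=np.zeros(3), radius=0.5):
    """Analytic sphere SDF standing in for a learned implicit function."""
    return np.linalg.norm(points - center, axis=-1) - radius

# Evaluate on a coarse grid and keep points close to the zero level set.
axis = np.linspace(-1.0, 1.0, 21)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1).reshape(-1, 3)
d = sdf_sphere(grid)
surface_pts = grid[np.abs(d) < 0.05]
```

In practice the analytic function would be replaced by the trained network, and a proper mesh would be extracted from the level set (e.g. by marching cubes) rather than by thresholding grid samples.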
arXiv Detail & Related papers (2023-11-21T13:43:06Z)
- Decaf: Monocular Deformation Capture for Face and Hand Interactions [77.75726740605748]
This paper introduces the first method that allows tracking human hands interacting with human faces in 3D from single monocular RGB videos.
We model hands as articulated objects inducing non-rigid face deformations during an active interaction.
Our method relies on a new hand-face motion and interaction capture dataset with realistic face deformations acquired with a markerless multi-view camera system.
arXiv Detail & Related papers (2023-09-28T17:59:51Z)
- Learning Explicit Contact for Implicit Reconstruction of Hand-held Objects from Monocular Images [59.49985837246644]
We show how to model contacts in an explicit way to benefit the implicit reconstruction of hand-held objects.
In the first part, we propose a new subtask of directly estimating 3D hand-object contacts from a single image.
In the second part, we introduce a novel method to diffuse estimated contact states from the hand mesh surface to nearby 3D space.
arXiv Detail & Related papers (2023-05-31T17:59:26Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Semi-Supervised Disentanglement of Tactile Contact Geometry from Sliding-Induced Shear [12.004939546183355]
The sense of touch is fundamental to human dexterity.
When mimicked in robotic touch, particularly by use of soft optical tactile sensors, it suffers from distortion due to motion-dependent shear.
In this work, we pursue a semi-supervised approach to remove shear while preserving contact-only information.
arXiv Detail & Related papers (2022-08-26T08:30:19Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
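The particle description above can be sketched in miniature; this is a loose, assumed illustration of the general idea (particles with neighbour coupling and an elastic restoring force), not the EIP formulation, and every constant and name here is an assumption.

```python
# Assumed sketch: a 1-D row of surface particles. Neighbour coupling spreads
# a contact indentation; an elastic restoring force regulates deformation by
# pulling particles back toward their rest heights.
import numpy as np

n = 11
rest = np.zeros(n)                 # rest heights of the particle row
pos = rest.copy()
pos[5] = -0.2                      # a contact indents the middle particle

k_spring, k_rest, dt = 5.0, 5.0, 0.02
for _ in range(500):
    lap = np.zeros(n)
    lap[1:-1] = pos[:-2] - 2.0 * pos[1:-1] + pos[2:]  # neighbour coupling
    force = k_spring * lap - k_rest * (pos - rest)    # elastic regulation
    force[5] = 0.0                 # pin the indented particle (sustained contact)
    pos += dt * force              # overdamped update

deformation = rest - pos           # positive where the surface is pushed in
```

At equilibrium the indentation spreads smoothly to neighbouring particles while decaying with distance, which is the qualitative behaviour a tactile simulator needs from its elastic model.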
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- Active 3D Shape Reconstruction from Vision and Touch [66.08432412497443]
Humans build 3D understandings of the world through active object exploration, using jointly their senses of vision and touch.
In 3D shape reconstruction, most recent progress has relied on static datasets of limited sensory data such as RGB images, depth maps or haptic readings.
We introduce a system composed of: 1) a haptic simulator leveraging high spatial resolution vision-based tactile sensors for active touching of 3D objects; 2) a mesh-based 3D shape reconstruction model that relies on tactile or visuotactile priors to guide the shape exploration; and 3) a set of data-driven solutions with either tactile or visuotactile priors.
arXiv Detail & Related papers (2021-07-20T15:56:52Z)
- GelSight Wedge: Measuring High-Resolution 3D Contact Geometry with a Compact Robot Finger [8.047951969722794]
GelSight Wedge sensor is optimized to have a compact shape for robot fingers, while achieving high-resolution 3D reconstruction.
We show the effectiveness and potential of the reconstructed 3D geometry for pose tracking in the 3D space.
arXiv Detail & Related papers (2021-06-16T15:15:29Z)
- Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering [19.69677059281393]
We present an approach to tactile pose estimation from the first touch for known objects.
We create an object-agnostic map from real tactile observations to contact shapes.
For a new object with known geometry, we learn a tailored perception model completely in simulation.
arXiv Detail & Related papers (2020-12-09T18:00:35Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.