Elastic Tactile Simulation Towards Tactile-Visual Perception
- URL: http://arxiv.org/abs/2108.05013v2
- Date: Thu, 12 Aug 2021 09:54:27 GMT
- Title: Elastic Tactile Simulation Towards Tactile-Visual Perception
- Authors: Yikai Wang, Wenbing Huang, Bin Fang, Fuchun Sun, Chang Li
- Abstract summary: We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
- Score: 58.44106915440858
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile sensing plays an important role in robotic perception and
manipulation tasks. To overcome the real-world limitations of data collection,
simulating tactile response in a virtual environment is a desirable direction
for robotic research. In this paper, we propose Elastic Interaction of
Particles (EIP) for tactile simulation. Most existing works model the tactile
sensor as a rigid multi-body, which can neither reflect the elastic property
of the tactile sensor nor characterize the fine-grained physical interaction
between the sensor and the contacted object. By contrast, EIP models the
tactile sensor as a group of coordinated particles, and the elastic property is
applied to regulate the deformation of the particles during contact. Building
on the tactile simulation provided by EIP, we further propose a tactile-visual
perception network that enables information fusion between tactile data and
visual images. The perception network is based on a global-to-local fusion
mechanism in which multi-scale tactile features are aggregated into the
corresponding local region of the visual modality, guided by the tactile
positions and directions. The fusion method achieves superior performance on
the 3D geometric reconstruction task.
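To make the simulation idea concrete, here is a minimal sketch of an EIP-style contact step, not the authors' implementation: the sensor is a set of particles, penetrating particles are pushed back toward the object surface, and a spring-like elastic term regulates the deformation. The signed-distance function, the stiffness and gain constants, and all names are illustrative assumptions.

```python
import numpy as np

def numerical_gradient(sdf, pos, eps=1e-4):
    """Finite-difference gradient of a vectorized signed-distance function."""
    g = np.zeros_like(pos)
    for k in range(3):
        d = np.zeros(3)
        d[k] = eps
        g[:, k] = (sdf(pos + d) - sdf(pos - d)) / (2 * eps)
    return g

def simulate_contact(particles, sdf, stiffness=10.0, step=0.01, iters=200):
    """Toy EIP-style contact: relax sensor particles against a rigid object.

    particles: (N, 3) rest positions of the sensor particles.
    sdf: vectorized signed distance to the object's surface (negative inside).
    Returns the per-particle deformation field.
    """
    rest = particles.copy()
    pos = particles.copy()
    for _ in range(iters):
        depth = -np.minimum(sdf(pos), 0.0)        # penetration depth, >= 0
        normal = numerical_gradient(sdf, pos)     # points out of the object
        contact = depth[:, None] * normal         # push penetrators outward
        elastic = -stiffness * (pos - rest)       # spring toward the rest shape
        pos += step * (50.0 * contact + elastic)  # simple relaxation step
    return pos - rest

# Example: press a flat 5x5 particle patch slightly into a unit sphere.
xs = np.linspace(-0.2, 0.2, 5)
patch = np.array([[x, y, 0.95] for x in xs for y in xs])
sphere_sdf = lambda p: np.linalg.norm(p, axis=-1) - 1.0
deformation = simulate_contact(patch, sphere_sdf)
```

The global-to-local fusion can be pictured similarly: multi-scale tactile features are aggregated and injected into a local window of the visual feature map around the projected contact location. The mean aggregation, the fixed window, and the shapes below are simplifying assumptions, not the paper's architecture.

```python
import torch

def fuse_tactile_into_visual(vis_feat, tac_feats, uv, radius=2):
    """Toy global-to-local fusion: inject tactile features locally.

    vis_feat:  (C, H, W) visual feature map.
    tac_feats: list of (C,) multi-scale tactile feature vectors.
    uv:        (row, col) contact location in the feature map, standing in
               for the tactile position/direction guidance in the paper.
    """
    C, H, W = vis_feat.shape
    tac = torch.stack(tac_feats).mean(dim=0)      # aggregate the scales
    r, c = uv
    rows = slice(max(r - radius, 0), min(r + radius + 1, H))
    cols = slice(max(c - radius, 0), min(c + radius + 1, W))
    fused = vis_feat.clone()
    fused[:, rows, cols] += tac[:, None, None]    # touch only the local region
    return fused
```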
Related papers
- Dynamic Reconstruction of Hand-Object Interaction with Distributed Force-aware Contact Representation [52.36691633451968]
ViTaM-D is a visual-tactile framework for dynamic hand-object interaction reconstruction.
DF-Field is a distributed force-aware contact representation model.
Our results highlight the superior performance of ViTaM-D in both rigid and deformable object reconstruction.
arXiv Detail & Related papers (2024-11-14T16:29:45Z)
- Controllable Visual-Tactile Synthesis [28.03469909285511]
We develop a conditional generative model that synthesizes both visual and tactile outputs from a single sketch.
We then introduce a pipeline to render high-quality visual and tactile outputs on an electroadhesion-based haptic device.
arXiv Detail & Related papers (2023-05-04T17:59:51Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Learning to Synthesize Volumetric Meshes from Vision-based Tactile Imprints [26.118805500471066]
Vision-based tactile sensors typically utilize a deformable elastomer and a camera mounted above to provide high-resolution image observations of contacts.
This paper focuses on learning to synthesize the mesh of the elastomer based on the image imprints acquired from vision-based tactile sensors.
A graph neural network (GNN) is introduced to learn the image-to-mesh mappings with supervised learning.
arXiv Detail & Related papers (2022-03-29T00:24:10Z)
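As a rough illustration of the image-to-mesh idea in the paper above (not that paper's model), a small graph network can regress per-vertex displacements of the elastomer mesh from a tactile-image code; the layer sizes, the mean-neighbor message passing, and all names are assumptions.

```python
import torch
import torch.nn as nn

class ImprintToMesh(nn.Module):
    """Toy imprint-to-mesh regressor, illustrative only.

    Maps a tactile-image code plus current vertex positions to per-vertex
    displacements via a few rounds of mean-neighbor message passing.
    """
    def __init__(self, img_dim=128, hid=64):
        super().__init__()
        self.encode = nn.Linear(3 + img_dim, hid)  # vertex xyz + image code
        self.message = nn.Linear(hid, hid)
        self.decode = nn.Linear(hid, 3)            # displacement per vertex

    def forward(self, verts, adj, img_code):
        # verts: (V, 3); adj: (V, V) row-normalized mesh adjacency;
        # img_code: (img_dim,) features of the tactile imprint image.
        code = img_code.expand(verts.shape[0], -1)
        x = torch.relu(self.encode(torch.cat([verts, code], dim=-1)))
        for _ in range(3):                         # message-passing rounds
            x = torch.relu(x + adj @ self.message(x))
        return self.decode(x)                      # train with mesh supervision
```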
- Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step on dynamics modeling in hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z)
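The cross-modal supervision described in the paper above can be sketched as follows; the visual pipeline, the MSE loss, and every name here are assumptions, not the paper's training recipe.

```python
import torch
import torch.nn.functional as F

def train_step(tactile_net, optimizer, tactile_seq, video_frames, visual_pipeline):
    """One cross-modal training step (illustrative; every name here is assumed).

    The visual pipeline acts as an automatic annotator: its output on the
    synchronized video (e.g. a tracked object pose) becomes the label that
    supervises the tactile-only model, so no manual annotation is needed.
    """
    with torch.no_grad():
        labels = visual_pipeline(video_frames)   # pseudo-labels from vision
    pred = tactile_net(tactile_seq)              # predict from touch alone
    loss = F.mse_loss(pred, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```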
- Learning Intuitive Physics with Multimodal Generative Models [24.342994226226786]
This paper presents a perception framework that fuses visual and tactile feedback to make predictions about the expected motion of objects in dynamic scenes.
We use a novel See-Through-your-Skin (STS) sensor that provides high-resolution multimodal sensing of contact surfaces.
We validate through simulated and real-world experiments in which the resting state of an object is predicted from given initial conditions.
arXiv Detail & Related papers (2021-01-12T12:55:53Z)
- Sim-to-real for high-resolution optical tactile sensing: From images to 3D contact force distributions [5.939410304994348]
This article proposes a strategy to generate tactile images in simulation for a vision-based tactile sensor based on an internal camera.
The deformation of the material is simulated in a finite element environment under a diverse set of contact conditions, and spherical particles are projected to a simulated image.
Features extracted from the images are mapped to the 3D contact force distribution, with the ground truth also obtained via finite-element simulations.
arXiv Detail & Related papers (2020-12-21T12:43:33Z)
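A loose illustration of the particle-projection step from the article above: the real pipeline uses finite-element deformation and a calibrated internal camera, whereas the pinhole model and all constants below are made-up assumptions.

```python
import numpy as np

def render_particles(points, f=320.0, cx=160.0, cy=120.0, h=240, w=320):
    """Toy stand-in for 'spherical particles projected to a simulated image'.

    points: (N, 3) particle positions in the camera frame (z > 0).
    The pinhole intrinsics here are made-up values, not a real calibration.
    """
    img = np.zeros((h, w), dtype=np.float32)
    u = (f * points[:, 0] / points[:, 2] + cx).astype(int)   # pixel column
    v = (f * points[:, 1] / points[:, 2] + cy).astype(int)   # pixel row
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)             # in-frame only
    img[v[ok], u[ok]] = 1.0                                  # mark particles
    return img
```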
- Elastic Interaction of Particles for Robotic Tactile Simulation [43.933808122317274]
We propose Elastic Interaction of Particles (EIP), a novel framework for tactile emulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic theory is applied to regulate the deformation of particles during the contact process.
Experiments to verify the effectiveness of our method have been carried out on two applications.
arXiv Detail & Related papers (2020-11-23T16:37:00Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)