Elastic Interaction of Particles for Robotic Tactile Simulation
- URL: http://arxiv.org/abs/2011.11528v1
- Date: Mon, 23 Nov 2020 16:37:00 GMT
- Title: Elastic Interaction of Particles for Robotic Tactile Simulation
- Authors: Yikai Wang, Wenbing Huang, Bin Fang, Fuchun Sun
- Abstract summary: We propose Elastic Interaction of Particles (EIP), a novel framework for tactile emulation.
EIP models the tactile sensor as a group of coordinated particles, and elastic theory is applied to regulate the deformation of the particles during the contact process.
Experiments to verify the effectiveness of our method have been carried out on two applications.
- Score: 43.933808122317274
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile sensing plays an important role in robotic perception and
manipulation. To overcome the real-world limitations of data collection,
simulating the tactile response in a virtual environment has become a desirable
direction for robotic research. Most existing works model the tactile sensor as
a rigid multi-body, which is incapable of reflecting the elastic property of the
tactile sensor or of characterizing the fine-grained physical interaction
between two objects. In this paper, we propose Elastic Interaction of Particles
(EIP), a novel framework for tactile emulation. At its core, EIP models the
tactile sensor as a group of coordinated particles, and elastic theory is
applied to regulate the deformation of particles during the contact process.
EIP is implemented from scratch, without resorting to any
existing physics engine. Experiments to verify the effectiveness of our method
have been carried out on two applications: robotic perception with tactile data
and 3D geometric reconstruction by tactile-visual fusion. This framework has
the potential to open up a new avenue for robotic tactile simulation and to
contribute to various downstream robotic tasks.
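
To make the core idea concrete, below is a minimal, self-contained sketch of a particle-based elastic contact model in the spirit of the abstract. Everything here is an assumption for illustration: the grid resolution, the spherical indenter, and the Laplacian spring-like relaxation are crude stand-ins for the elasticity-theory formulation the paper derives, and none of the names come from the authors' code.

```python
# Hypothetical sketch: a membrane of particles deformed by a rigid indenter,
# with a Laplacian "elastic" term regularizing the deformation. Illustrative
# only; not the paper's actual formulation or implementation.
import numpy as np

GRID = 32        # particles per side of the square sensor membrane (assumed)
K = 50.0         # hypothetical elastic stiffness
LR = 1e-3        # relaxation step size
STEPS = 200      # relaxation iterations

# Rest configuration: a flat membrane of particles in the z = 0 plane.
xs, ys = np.meshgrid(np.linspace(0.0, 1.0, GRID), np.linspace(0.0, 1.0, GRID))
rest = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)   # (GRID, GRID, 3)
pos = rest.copy()

# Hypothetical spherical indenter pressing into the membrane from below.
center = np.array([0.5, 0.5, -0.08])
radius = 0.15

def elastic_force(u):
    """Discrete Laplacian of the displacement field: each particle is pulled
    toward the mean displacement of its 4-connected neighbors, a crude
    stand-in for an elasticity-theory regularizer."""
    return K * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)

for _ in range(STEPS):
    u = pos - rest
    u += LR * elastic_force(u)                       # elastic relaxation step
    u[0, :] = u[-1, :] = u[:, 0] = u[:, -1] = 0.0    # pin the membrane border
    pos = rest + u
    # Contact constraint: project penetrating particles onto the indenter.
    d = pos - center
    dist = np.linalg.norm(d, axis=-1, keepdims=True)
    inside = dist[..., 0] < radius
    pos[inside] = center + d[inside] / dist[inside] * radius

# The out-of-plane displacement map serves as a crude tactile "image".
tactile = (pos - rest)[..., 2]
print("max indentation:", tactile.max())
```

The point the sketch illustrates is the division of labor in this style of simulation: a hard contact constraint fixes particles on the object surface, while the elastic term propagates that deformation smoothly through the rest of the membrane, producing the fine-grained, deformable response that rigid multi-body models cannot.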
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z) - Combining Vision and Tactile Sensation for Video Prediction [0.0]
We investigate the impact of integrating tactile feedback into video prediction models for physical robot interactions.
We introduce two new datasets of robot pushing that use a magnetic-based tactile sensor for unsupervised learning.
Our results demonstrate that incorporating tactile feedback into video prediction models improves scene prediction accuracy and enhances the agent's perception of physical interactions.
arXiv Detail & Related papers (2023-04-21T18:02:15Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - RoboCraft: Learning to See, Simulate, and Shape Elasto-Plastic Objects with Graph Networks [32.00371492516123]
We present a model-based planning framework for modeling and manipulating elasto-plastic objects.
Our system, RoboCraft, learns a particle-based dynamics model using graph neural networks (GNNs) to capture the structure of the underlying system.
We show through experiments that with just 10 minutes of real-world robotic interaction data, our robot can learn a dynamics model that can be used to synthesize control signals to deform elasto-plastic objects into various target shapes.
arXiv Detail & Related papers (2022-05-05T20:28:15Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diverse set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and elastic theory is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images (a hypothetical fusion sketch follows this list).
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Learning Intuitive Physics with Multimodal Generative Models [24.342994226226786]
This paper presents a perception framework that fuses visual and tactile feedback to make predictions about the expected motion of objects in dynamic scenes.
We use a novel See-Through-your-Skin (STS) sensor that provides high resolution multimodal sensing of contact surfaces.
We validate the framework through simulated and real-world experiments in which the resting state of an object is predicted from given initial conditions.
arXiv Detail & Related papers (2021-01-12T12:55:53Z) - ThreeDWorld: A Platform for Interactive Multi-Modal Physical Simulation [75.0278287071591]
ThreeDWorld (TDW) is a platform for interactive multi-modal physical simulation.
TDW enables simulation of high-fidelity sensory data and physical interactions between mobile agents and objects in rich 3D environments.
We present initial experiments enabled by TDW in emerging research directions in computer vision, machine learning, and cognitive science.
arXiv Detail & Related papers (2020-07-09T17:33:27Z)
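
As referenced in the "Elastic Tactile Simulation Towards Tactile-Visual Perception" entry above, a tactile-visual perception network fuses the two modalities. The sketch below shows one plausible late-fusion design (two small CNN encoders whose features are concatenated before a classifier head); the entry does not specify the paper's actual architecture, so every layer choice and dimension here is an assumption.

```python
# Hypothetical late-fusion tactile-visual network. Illustrative only; the
# layer structure and dimensions are assumptions, not the paper's design.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Tiny CNN encoder, shared in structure by both modalities."""
    def __init__(self, in_ch: int, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

class TactileVisualFusion(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.visual = Encoder(in_ch=3)    # RGB image branch
        self.tactile = Encoder(in_ch=1)   # single-channel tactile-map branch
        self.head = nn.Linear(256, num_classes)

    def forward(self, img, tac):
        # Late fusion: concatenate per-modality features, then classify.
        fused = torch.cat([self.visual(img), self.tactile(tac)], dim=-1)
        return self.head(fused)

model = TactileVisualFusion()
img = torch.randn(2, 3, 64, 64)   # dummy RGB batch
tac = torch.randn(2, 1, 32, 32)   # dummy tactile batch
print(model(img, tac).shape)      # torch.Size([2, 10])
```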
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.