Tactile-Filter: Interactive Tactile Perception for Part Mating
- URL: http://arxiv.org/abs/2303.06034v2
- Date: Mon, 5 Jun 2023 13:44:02 GMT
- Title: Tactile-Filter: Interactive Tactile Perception for Part Mating
- Authors: Kei Ota, Devesh K. Jha, Hsiao-Yu Tung, Joshua B. Tenenbaum
- Abstract summary: Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are being widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
- Score: 54.46221808805662
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Humans rely on touch and tactile sensing for many dexterous manipulation
tasks. Tactile sensing provides rich information about contact formations as well as
about the geometry of objects during any interaction. With this motivation,
vision-based tactile sensors are being widely used for various robotic perception and
control tasks. In this paper, we present a method for interactive perception using
vision-based tactile sensors for a part mating task, where a robot uses tactile sensing
and a particle-filter-based feedback mechanism to incrementally improve its estimate of
which objects (pegs and holes) fit together. To do this, we first train a deep neural
network that uses tactile images to predict the probabilistic correspondence between
arbitrarily shaped objects that fit together. The trained model is used to design a
particle filter that serves two purposes. First, given one partial (or non-unique)
observation of the hole, it incrementally improves the estimate of the correct peg by
sampling more tactile observations. Second, it selects the robot's next action, i.e.,
the next touch (and thus tactile image) to sample, so as to maximize uncertainty
reduction and thereby minimize the number of interactions needed during the perception
task. We evaluate our method on several part-mating tasks with novel objects using a
robot equipped with a vision-based tactile sensor, and we show the efficiency of the
proposed action-selection method against a naive baseline. See the supplementary video
at https://www.youtube.com/watch?v=jMVBg_e3gLw .
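The two roles of the particle filter described in the abstract (a Bayesian belief update over candidate pegs from tactile observations, and selection of the next touch by expected uncertainty reduction) can be illustrated with a minimal sketch. This is not the authors' implementation: the binary match/no-match observation, the `p_match` lookup table standing in for the paper's trained correspondence network, and all variable names are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PEGS = 5      # candidate pegs that might mate with the observed hole
N_ACTIONS = 8   # discrete touch locations the robot can probe next
TRUE_PEG = 2    # hidden ground truth, used only to simulate tactile readings

# Stand-in for the trained correspondence network: probability that the tactile
# image taken at touch location a "matches" peg k. (Assumed random table; the
# paper predicts such correspondences from pairs of tactile images.)
p_match = rng.uniform(0.1, 0.9, size=(N_ACTIONS, N_PEGS))

def entropy(p):
    """Shannon entropy of a discrete belief."""
    p = np.clip(p, 1e-12, 1.0)
    return float(-np.sum(p * np.log(p)))

def posterior(belief, action, observed_match):
    """Bayes update of the belief over pegs from one binary tactile outcome."""
    like = p_match[action] if observed_match else 1.0 - p_match[action]
    post = belief * like
    return post / post.sum()

def expected_info_gain(belief, action):
    """Expected entropy reduction from probing touch location `action`."""
    h0 = entropy(belief)
    p_obs_match = float(np.dot(belief, p_match[action]))   # P(match | action)
    gain = 0.0
    for obs, p_obs in ((True, p_obs_match), (False, 1.0 - p_obs_match)):
        if p_obs > 0.0:
            gain += p_obs * (h0 - entropy(posterior(belief, action, obs)))
    return gain

belief = np.full(N_PEGS, 1.0 / N_PEGS)     # uniform prior over candidate pegs
for step in range(6):
    # Choose the touch that is expected to shrink the uncertainty the most.
    a = int(np.argmax([expected_info_gain(belief, act) for act in range(N_ACTIONS)]))
    obs = bool(rng.random() < p_match[a, TRUE_PEG])   # simulated tactile reading
    belief = posterior(belief, a, obs)
    print(f"step {step}: touch {a}, match={obs}, belief={np.round(belief, 2)}")

print("most likely peg:", int(np.argmax(belief)))
```

In the paper the observation likelihood comes from a deep network applied to tactile images of arbitrarily shaped pegs and holes; the sketch only shows how a particle-filter-style belief update and an information-gain action selector fit together.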
Related papers
- Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
arXiv Detail & Related papers (2023-12-20T22:36:37Z)
- A model-free approach to fingertip slip and disturbance detection for grasp stability inference [0.0]
We propose a method for assessing grasp stability using tactile sensing.
We use highly sensitive uSkin tactile sensors mounted on an Allegro hand to test and validate our method.
arXiv Detail & Related papers (2023-11-22T09:04:26Z)
- Combining Vision and Tactile Sensation for Video Prediction [0.0]
We investigate the impact of integrating tactile feedback into video prediction models for physical robot interactions.
We introduce two new datasets of robot pushing that use a magnetic-based tactile sensor for unsupervised learning.
Our results demonstrate that incorporating tactile feedback into video prediction models improves scene prediction accuracy and enhances the agent's perception of physical interactions.
arXiv Detail & Related papers (2023-04-21T18:02:15Z) - See, Hear, and Feel: Smart Sensory Fusion for Robotic Manipulation [49.925499720323806]
We study how visual, auditory, and tactile perception can jointly help robots to solve complex manipulation tasks.
We build a robot system that can see with a camera, hear with a contact microphone, and feel with a vision-based tactile sensor.
arXiv Detail & Related papers (2022-12-07T18:55:53Z) - Visual-Tactile Multimodality for Following Deformable Linear Objects
Using Reinforcement Learning [15.758583731036007]
We study the problem of using vision and tactile inputs together to complete the task of following deformable linear objects.
We create a Reinforcement Learning agent using different sensing modalities and investigate how its behaviour can be boosted.
Our experiments show that the use of both vision and tactile inputs, together with proprioception, allows the agent to complete the task in up to 92% of cases.
arXiv Detail & Related papers (2022-03-31T21:59:08Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step on dynamics modeling in hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile
Object Recognition [17.37142241982902]
New advances in flexible, event-driven, electronic skins may soon endow robots with touch perception capabilities similar to those of humans.
These unique features may render current deep learning approaches such as convolutional feature extractors unsuitable for tactile learning.
We propose a novel spiking graph neural network for event-based tactile object recognition.
arXiv Detail & Related papers (2020-08-01T03:35:15Z) - OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields, or provide only low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.