Tactile Object Pose Estimation from the First Touch with Geometric
Contact Rendering
- URL: http://arxiv.org/abs/2012.05205v1
- Date: Wed, 9 Dec 2020 18:00:35 GMT
- Title: Tactile Object Pose Estimation from the First Touch with Geometric
Contact Rendering
- Authors: Maria Bauza, Eric Valls, Bryan Lim, Theo Sechopoulos, Alberto
Rodriguez
- Abstract summary: We present an approach to tactile pose estimation from the first touch for known objects.
We create an object-agnostic map from real tactile observations to contact shapes.
For a new object with known geometry, we learn a tailored perception model completely in simulation.
- Score: 19.69677059281393
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In this paper, we present an approach to tactile pose estimation from the
first touch for known objects. First, we create an object-agnostic map from
real tactile observations to contact shapes. Next, for a new object with known
geometry, we learn a tailored perception model completely in simulation. To do
so, we simulate the contact shapes that a dense set of object poses would
produce on the sensor. Then, given a new contact shape obtained from the sensor
output, we match it against the pre-computed set using an object-specific
embedding learned purely in simulation with contrastive learning.
This results in a perception model that can localize objects from a single
tactile observation. It also allows reasoning over pose distributions and
incorporating additional pose constraints from other perception systems or
from multiple contacts. We provide quantitative results for four objects. Our
approach provides high-accuracy pose estimates from distinctive tactile
observations while regressing pose distributions to account for contact
shapes that could result from several different object poses. We further extend and
test our approach in multi-contact scenarios where several tactile sensors are
simultaneously in contact with the object. Website:
http://mcube.mit.edu/research/tactile_loc_first_touch.html
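As a concrete illustration of the matching step described in the abstract, the sketch below embeds a new contact shape and compares it against embeddings pre-computed for a dense grid of simulated object poses, returning a weighted set of candidate poses. The embedding network `embed` and the array names are hypothetical placeholders, not the authors' released code.

```python
# Minimal sketch of matching a new contact shape against contact shapes
# pre-computed for a dense set of simulated object poses. `embed`,
# `pose_grid`, and `pose_embeddings` are hypothetical placeholders.
import numpy as np

def localize(contact_shape, embed, pose_grid, pose_embeddings, top_k=10):
    """Return the top-k candidate poses and their softmax weights.

    contact_shape   : (H, W) binary contact mask from the tactile sensor
    embed           : callable mapping a contact mask to a unit-norm (D,) vector
    pose_grid       : (N, 6) dense set of object poses simulated offline
    pose_embeddings : (N, D) unit-norm embeddings of their simulated contacts
    """
    q = embed(contact_shape)                       # (D,) query embedding
    sims = pose_embeddings @ q                     # cosine similarities
    idx = np.argsort(-sims)[:top_k]                # best-matching simulated contacts
    weights = np.exp(sims[idx])
    weights /= weights.sum()                       # normalized pose distribution
    return pose_grid[idx], weights
```

Keeping a weighted candidate set rather than a single best match is what makes it possible to reason over pose distributions and to fold in constraints from other perception systems or from multiple contacts.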
Related papers
- PseudoTouch: Efficiently Imaging the Surface Feel of Objects for Robotic Manipulation [8.997347199266592]
Our goal is to equip robots with a similar capability, which we term PseudoTouch.
We frame this problem as the task of learning a low-dimensional visual-tactile embedding.
Using ReSkin, we collect and train PseudoTouch on a dataset comprising aligned tactile and visual data pairs.
We demonstrate the efficacy of PseudoTouch through its application to two downstream tasks: object recognition and grasp stability prediction.
arXiv Detail & Related papers (2024-03-22T10:51:31Z) - Neural feels with neural fields: Visuo-tactile perception for in-hand
manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at minimum, refines and, at best, disambiguates visual estimates during in-hand manipulation.
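The interplay of a learned neural field and pose optimization can be pictured roughly as follows; this is a deliberately simplified, translation-only toy (not the NeuralFeels implementation) in which tactile contact points are pulled onto the zero level set of a signed-distance field while the pose stays near a coarse visual estimate.

```python
# Toy, translation-only illustration (not the NeuralFeels code) of combining
# vision and touch: contact points should lie on the zero level set of a
# learned signed-distance field while the pose stays near the visual estimate.
import numpy as np
from scipy.optimize import least_squares

def refine_pose(t_vision, contact_pts, sdf, w_touch=1.0, w_vision=0.1):
    """t_vision: (3,) visual translation; contact_pts: (M, 3) tactile contacts;
    sdf: callable mapping (M, 3) object-frame points to (M,) signed distances."""
    def residuals(t):
        touch_res = w_touch * sdf(contact_pts - t)    # contacts pulled onto the surface
        vision_res = w_vision * (t - t_vision)        # prior from the visual estimate
        return np.concatenate([touch_res, vision_res])

    return least_squares(residuals, x0=np.asarray(t_vision, dtype=float)).x
```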
arXiv Detail & Related papers (2023-12-20T22:36:37Z) - Learning Explicit Contact for Implicit Reconstruction of Hand-held
Objects from Monocular Images [59.49985837246644]
We show how to model contacts in an explicit way to benefit the implicit reconstruction of hand-held objects.
In the first part, we propose a new subtask of directly estimating 3D hand-object contacts from a single image.
In the second part, we introduce a novel method to diffuse estimated contact states from the hand mesh surface to nearby 3D space.
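A rough stand-in for that diffusion step is sketched below: per-vertex contact probabilities on the hand mesh are propagated to nearby 3D query points with Gaussian distance weighting. The function and parameter names are illustrative assumptions, not the paper's method.

```python
# Illustrative stand-in for diffusing contact from the hand mesh surface into
# nearby 3D space: per-vertex contact probabilities are propagated to query
# points with Gaussian distance weights. Names and sigma are assumptions.
import numpy as np

def diffuse_contact(query_pts, mesh_vertices, vertex_contact_prob, sigma=0.01):
    """query_pts: (Q, 3); mesh_vertices: (V, 3); vertex_contact_prob: (V,)."""
    d = np.linalg.norm(query_pts[:, None, :] - mesh_vertices[None, :, :], axis=-1)
    w = np.exp(-d**2 / (2 * sigma**2))                    # (Q, V) distance weights
    return (w * vertex_contact_prob).sum(axis=1) / (w.sum(axis=1) + 1e-9)
```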
arXiv Detail & Related papers (2023-05-31T17:59:26Z) - Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are now widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z) - Collision-aware In-hand 6D Object Pose Estimation using Multiple
Vision-based Tactile Sensors [4.886250215151643]
We reason about the possible spatial configurations of the sensors along the object surface.
We use selected sensor configurations to optimize over the space of 6D poses.
We rank the obtained poses by penalizing those that are in collision with the sensors.
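The ranking idea can be written down compactly; in the hedged sketch below, `fit_error` and `in_collision` are hypothetical callables standing in for the paper's tactile-fit and collision checks.

```python
# Hedged sketch of collision-aware pose ranking: candidate 6D poses are scored
# by how well they explain the tactile data, and poses that place the object in
# collision with any sensor body receive a large penalty. `fit_error` and
# `in_collision` are hypothetical callables, not the paper's API.
def rank_poses(candidate_poses, sensors, fit_error, in_collision, penalty=1e3):
    scored = []
    for pose in candidate_poses:
        cost = fit_error(pose)                            # agreement with tactile measurements
        if any(in_collision(pose, s) for s in sensors):   # object intersects a sensor
            cost += penalty
        scored.append((cost, pose))
    scored.sort(key=lambda item: item[0])                 # lowest cost first
    return [pose for _, pose in scored]
```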
arXiv Detail & Related papers (2023-01-31T14:35:26Z) - Tac2Pose: Tactile Object Pose Estimation from the First Touch [6.321662423735226]
We present Tac2Pose, an object-specific approach to tactile pose estimation from the first touch for known objects.
We simulate the contact shapes that a dense set of object poses would produce on the sensor.
We obtain contact shapes from the sensor with an object-agnostic calibration step that maps RGB tactile observations to binary contact shapes.
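That calibration step maps raw RGB tactile images to binary contact masks. Tac2Pose learns this mapping; the snippet below is only a crude heuristic stand-in (background subtraction plus a threshold) that makes the input/output interface concrete.

```python
# Crude heuristic stand-in for the learned RGB-to-contact-shape calibration:
# background subtraction against a no-contact reference image plus a threshold.
# The threshold value is an arbitrary assumption.
import numpy as np

def rgb_to_contact_mask(rgb, rgb_no_contact, threshold=25.0):
    """rgb, rgb_no_contact: (H, W, 3) uint8 tactile images; returns (H, W) bool."""
    diff = np.abs(rgb.astype(np.float32) - rgb_no_contact.astype(np.float32))
    return diff.mean(axis=-1) > threshold   # True where the gel is deformed by contact
```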
arXiv Detail & Related papers (2022-04-25T14:43:48Z) - Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z) - Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles and applies an elastic constraint to regulate their deformation during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z) - Learning Intuitive Physics with Multimodal Generative Models [24.342994226226786]
This paper presents a perception framework that fuses visual and tactile feedback to make predictions about the expected motion of objects in dynamic scenes.
We use a novel See-Through-your-Skin (STS) sensor that provides high resolution multimodal sensing of contact surfaces.
We validate through simulated and real-world experiments in which the resting state of an object is predicted from given initial conditions.
arXiv Detail & Related papers (2021-01-12T12:55:53Z) - Learning Tactile Models for Factor Graph-based Estimation [24.958055047646628]
Vision-based tactile sensors provide rich, local image measurements at the point of contact.
A single measurement contains limited information, so multiple measurements are needed to infer the latent object state.
We propose a two-stage approach: first we learn local tactile observation models supervised with ground truth data, and then integrate these models along with physics and geometric factors within a factor graph.
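A compact way to picture the second stage: learned tactile observation models supply per-step pose measurements, and simple motion (physics) factors link consecutive steps; solving the resulting least-squares problem stands in for full factor-graph inference (e.g. with GTSAM). The planar pose parameterization and weights below are illustrative assumptions.

```python
# Simplified least-squares stand-in for factor-graph inference: learned tactile
# observation factors anchor each step's planar pose (x, y, theta), and simple
# motion-smoothness factors link consecutive steps. Weights are illustrative.
import numpy as np
from scipy.optimize import least_squares

def estimate_trajectory(tactile_meas, w_obs=1.0, w_motion=5.0):
    """tactile_meas: (T, 3) per-step poses predicted by the learned tactile model."""
    z = np.asarray(tactile_meas, dtype=float)
    T = len(z)

    def residuals(flat):
        poses = flat.reshape(T, 3)
        obs = w_obs * (poses - z).ravel()                     # tactile observation factors
        motion = w_motion * (poses[1:] - poses[:-1]).ravel()  # smoothness / motion factors
        return np.concatenate([obs, motion])

    return least_squares(residuals, x0=z.ravel()).x.reshape(T, 3)
```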
arXiv Detail & Related papers (2020-12-07T15:09:31Z) - Continuous Surface Embeddings [76.86259029442624]
We focus on the task of learning and representing dense correspondences in deformable object categories.
We propose a new, learnable image-based representation of dense correspondences.
We demonstrate that the proposed approach performs on par or better than the state-of-the-art methods for dense pose estimation for humans.
arXiv Detail & Related papers (2020-11-24T22:52:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.