TactoFind: A Tactile Only System for Object Retrieval
- URL: http://arxiv.org/abs/2303.13482v1
- Date: Thu, 23 Mar 2023 17:50:09 GMT
- Title: TactoFind: A Tactile Only System for Object Retrieval
- Authors: Sameer Pai, Tao Chen, Megha Tippur, Edward Adelson, Abhishek Gupta,
Pulkit Agrawal
- Abstract summary: We study the problem of object retrieval in scenarios where visual sensing is absent.
Unlike vision, where cameras can observe the entire scene, touch sensors are local and only observe parts of the scene that are in contact with the manipulator.
We present a system capable of using sparse tactile feedback from fingertip touch sensors on a dexterous hand to localize, identify and grasp novel objects without any visual feedback.
- Score: 14.732140705441992
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the problem of object retrieval in scenarios where visual sensing is
absent, object shapes are unknown beforehand and objects can move freely, like
grabbing objects out of a drawer. Successful solutions require localizing free
objects, identifying specific object instances, and then grasping the
identified objects, only using touch feedback. Unlike vision, where cameras can
observe the entire scene, touch sensors are local and only observe parts of the
scene that are in contact with the manipulator. Moreover, information gathering
via touch sensors necessitates applying forces on the touched surface which may
disturb the scene itself. Reasoning with touch, therefore, requires careful
exploration and integration of information over time -- a challenge we tackle.
We present a system capable of using sparse tactile feedback from fingertip
touch sensors on a dexterous hand to localize, identify and grasp novel objects
without any visual feedback. Videos are available at
https://taochenshh.github.io/projects/tactofind.
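The abstract's central difficulty, integrating sparse, local touch observations over time, is at heart a belief-update problem. As a minimal sketch (not the paper's actual pipeline), the snippet below maintains a log-odds occupancy grid over the workspace, updates it from binary fingertip contacts, and picks the most uncertain cell as the next place to probe; all names and constants are illustrative.

```python
import numpy as np

# Hypothetical sketch (not the paper's actual pipeline): maintain a grid
# of log-odds that each workspace cell is occupied by an object, and
# update it from sparse binary fingertip contacts as the hand explores.

LOG_ODDS_HIT = np.log(0.9 / 0.1)    # a contact strongly suggests occupancy
LOG_ODDS_MISS = np.log(0.3 / 0.7)   # a free sweep weakly suggests empty space

def update_belief(log_odds, cell, contact):
    """Bayesian log-odds update for one touched or swept grid cell."""
    log_odds[cell] += LOG_ODDS_HIT if contact else LOG_ODDS_MISS
    return log_odds

def most_promising_cell(log_odds):
    """Next place to probe: the cell the belief is most uncertain about."""
    probs = 1.0 / (1.0 + np.exp(-log_odds))
    entropy = -(probs * np.log(probs + 1e-9)
                + (1 - probs) * np.log(1 - probs + 1e-9))
    return np.unravel_index(np.argmax(entropy), log_odds.shape)

belief = np.zeros((20, 20))                  # flat prior over a 20x20 grid
belief = update_belief(belief, (5, 7), contact=True)
print(most_promising_cell(belief))           # an as-yet-unexplored cell
```

Active probing of this kind is one way to respect the constraint the abstract highlights: every information-gathering touch also risks disturbing the scene, so probes should be spent where they reduce uncertainty most.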
Related papers
- Tactile-based Object Retrieval From Granular Media [17.340244278653785]
We introduce GEOTACT, a robotic manipulation method capable of retrieving objects buried in granular media.
We show that our problem formulation leads to the natural emergence of learned pushing behaviors that the manipulator uses to reduce uncertainty.
We also introduce a training curriculum that enables learning these behaviors in simulation, followed by zero-shot transfer to real hardware.
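The summary attributes the pushing behaviors to the problem formulation; one common way such behavior emerges in training is an information-gain reward, sketched below under that assumption. This is illustrative only, not GEOTACT's published objective.

```python
import numpy as np

# Hedged sketch: reward the agent for reducing the entropy of its
# object-pose belief after a push. Details are illustrative, not GEOTACT's
# actual reward or architecture.

def entropy(p):
    p = np.clip(p, 1e-9, 1.0)
    return -np.sum(p * np.log(p))

def info_gain_reward(belief_before, belief_after):
    """Pay out the drop in belief entropy caused by a pushing action."""
    return entropy(belief_before) - entropy(belief_after)

before = np.array([0.25, 0.25, 0.25, 0.25])   # uniform over 4 pose bins
after = np.array([0.7, 0.1, 0.1, 0.1])        # the push disambiguated the pose
print(info_gain_reward(before, after))        # positive: uncertainty fell
```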
arXiv Detail & Related papers (2024-02-07T02:50:56Z)
- Attention for Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control [12.302685367517718]
High-resolution tactile sensing can provide accurate information about local contact in contact-rich robotic tasks.
We study a new concept: tactile saliency for robot touch, inspired by the human touch attention mechanism from neuroscience.
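As a toy illustration of how a predicted saliency map could be consumed downstream (the paper's actual architecture is not described in this summary), the sketch below simply reweights a tactile frame by a hypothetical per-pixel saliency mask before a controller would consume it.

```python
import numpy as np

# Illustrative only: "saliency" here is a hypothetical per-pixel weight map
# that suppresses distractor contacts in a tactile image.

def apply_saliency(tactile_img, saliency):
    """Downweight non-salient contact signal; saliency values in [0, 1]."""
    return tactile_img * saliency

tactile = np.random.rand(64, 64)      # stand-in for one sensor frame
saliency = np.zeros((64, 64))
saliency[24:40, 24:40] = 1.0          # predicted task-relevant region
filtered = apply_saliency(tactile, saliency)
print(filtered.sum() <= tactile.sum())   # True: distractors suppressed
```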
arXiv Detail & Related papers (2023-07-26T21:19:45Z)
- Learning Explicit Contact for Implicit Reconstruction of Hand-held Objects from Monocular Images [59.49985837246644]
We show how to model contacts in an explicit way to benefit the implicit reconstruction of hand-held objects.
In the first part, we propose a new subtask of directly estimating 3D hand-object contacts from a single image.
In the second part, we introduce a novel method to diffuse estimated contact states from the hand mesh surface to nearby 3D space.
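A minimal sketch of the "diffusion" idea, assuming a Gaussian distance kernel (the paper's actual kernel and network are not given in this summary): each 3D query point aggregates contact scores from nearby hand-mesh vertices.

```python
import numpy as np

# Hedged sketch of diffusing contact states into nearby 3D space: each
# query point inherits a contact score from hand-mesh vertices, weighted
# by a Gaussian falloff on distance. The kernel choice is illustrative.

def diffuse_contact(query_pts, hand_verts, vert_contact, sigma=0.01):
    """query_pts: (Q,3); hand_verts: (V,3); vert_contact: (V,) in [0,1]."""
    d = np.linalg.norm(query_pts[:, None, :] - hand_verts[None, :, :], axis=-1)
    w = np.exp(-(d ** 2) / (2 * sigma ** 2))           # (Q, V) falloff weights
    return (w * vert_contact).sum(axis=1) / (w.sum(axis=1) + 1e-9)

verts = np.random.rand(100, 3) * 0.1                   # toy hand vertices (m)
contact = (np.random.rand(100) > 0.8).astype(float)    # sparse contact labels
queries = np.random.rand(50, 3) * 0.1                  # points near the hand
print(diffuse_contact(queries, verts, contact).shape)  # (50,)
```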
arXiv Detail & Related papers (2023-05-31T17:59:26Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are now widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
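The summary does not spell out the filter itself, so the sketch below shows only the generic Bayesian structure such interactive perception typically relies on: each new touch reweights a belief over candidate parts.

```python
import numpy as np

# Generic interactive-perception sketch (not the paper's specific filter):
# every touch reweights candidate parts by how well they explain the contact.

def bayes_update(prior, likelihoods):
    """prior: (K,) over K candidate parts; likelihoods: p(touch | part k)."""
    posterior = prior * likelihoods
    return posterior / posterior.sum()

belief = np.full(3, 1 / 3)                    # three candidate mating parts
touch_likelihood = np.array([0.8, 0.3, 0.1])  # from some contact-shape model
belief = bayes_update(belief, touch_likelihood)
print(belief)   # mass concentrates on the part consistent with the touch
```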
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Tac2Pose: Tactile Object Pose Estimation from the First Touch [6.321662423735226]
We present Tac2Pose, an object-specific approach to tactile pose estimation from the first touch for known objects.
We simulate the contact shapes that a dense set of object poses would produce on the sensor.
We obtain contact shapes from the sensor with an object-agnostic calibration step that maps RGB tactile observations to binary contact shapes.
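The matching step this describes can be sketched as scoring each pre-rendered candidate pose by the overlap between its simulated contact mask and the observed one; IoU is one plausible score (the paper's exact likelihood is not given here).

```python
import numpy as np

# Sketch of the described matching step: compare the observed binary
# contact mask against masks pre-rendered for a dense set of candidate
# poses, and rank poses by mask overlap (IoU is one possible score).

def iou(a, b):
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

def rank_poses(observed_mask, rendered_masks):
    """rendered_masks: dict pose_id -> binary mask simulated offline."""
    scores = {p: iou(observed_mask, m) for p, m in rendered_masks.items()}
    return sorted(scores, key=scores.get, reverse=True)

obs = np.zeros((32, 32), bool)
obs[10:20, 10:20] = True                        # observed contact patch
library = {"pose_A": obs.copy(),                # matches the observation
           "pose_B": np.zeros((32, 32), bool)}  # contact elsewhere
library["pose_B"][0:5, 0:5] = True
print(rank_poses(obs, library)[0])              # -> "pose_A"
```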
arXiv Detail & Related papers (2022-04-25T14:43:48Z)
- HyperDet3D: Learning a Scene-conditioned 3D Object Detector [154.84798451437032]
We propose HyperDet3D to explore scene-conditioned prior knowledge for 3D object detection.
Our HyperDet3D achieves state-of-the-art results on the 3D object detection benchmark of the ScanNet and SUN RGB-D datasets.
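"Scene-conditioned prior knowledge" suggests a hypernetwork-style design, where a scene embedding generates detector parameters. A minimal sketch under that reading; the shapes and the linear weight generator are illustrative, not the paper's architecture.

```python
import numpy as np

# Minimal hypernetwork sketch: a scene embedding generates the weights of
# a small detector layer, so the detector adapts to the current scene.

rng = np.random.default_rng(0)
D_SCENE, D_IN, D_OUT = 16, 32, 8
G = rng.normal(size=(D_SCENE, D_IN * D_OUT)) * 0.01   # weight generator

def scene_conditioned_layer(x, scene_embedding):
    """x: (D_IN,) detector feature; scene_embedding: (D_SCENE,)."""
    W = (scene_embedding @ G).reshape(D_IN, D_OUT)    # generated weights
    return x @ W

feat = rng.normal(size=D_IN)
scene = rng.normal(size=D_SCENE)                      # e.g. a pooled scene feature
print(scene_conditioned_layer(feat, scene).shape)     # (8,)
```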
arXiv Detail & Related papers (2022-04-12T07:57:58Z)
- Discovering Objects that Can Move [55.743225595012966]
We study the problem of object discovery -- separating objects from the background without manual labels.
Existing approaches utilize appearance cues, such as color, texture, and location, to group pixels into object-like regions.
We choose to focus on dynamic objects -- entities that can move independently in the world.
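The underlying motion cue can be illustrated with optical flow: pixels whose flow departs from the dominant background motion are candidate movable objects. The sketch below shows the intuition only; the paper learns this grouping rather than thresholding flow.

```python
import numpy as np

# Illustrative motion cue: pixels whose optical flow disagrees with the
# dominant (background) motion are grouped as candidate movable objects.

def moving_mask(flow, tol=0.5):
    """flow: (H, W, 2). Background motion = per-frame median flow."""
    background = np.median(flow.reshape(-1, 2), axis=0)
    residual = np.linalg.norm(flow - background, axis=-1)
    return residual > tol

flow = np.zeros((48, 48, 2))
flow[...] = [1.0, 0.0]            # camera pans right: uniform background flow
flow[20:30, 20:30] = [3.0, 1.0]   # an independently moving region
print(moving_mask(flow).sum())    # 100 pixels flagged as a moving object
```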
arXiv Detail & Related papers (2022-03-18T21:13:56Z)
- EagerMOT: 3D Multi-Object Tracking via Sensor Fusion [68.8204255655161]
Multi-object tracking (MOT) enables mobile robots to perform well-informed motion planning and navigation by localizing surrounding objects in 3D space and time.
Existing methods rely on depth sensors (e.g., LiDAR) to detect and track targets in 3D space, but only up to a limited sensing range due to the sparsity of the signal.
We propose EagerMOT, a simple tracking formulation that integrates all available object observations from both sensor modalities to obtain a well-informed interpretation of the scene dynamics.
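A minimal sketch of fusion-style data association (EagerMOT's actual two-stage matching is more elaborate): greedily pair tracks with detections under a distance gate, so that observations from either modality can keep a track alive.

```python
import numpy as np

# Fusion-style association sketch: greedily match detections to tracks by
# 3D centroid distance. The gate value and greedy matcher are illustrative.

def greedy_match(tracks, dets, gate=2.0):
    """tracks: (N,3), dets: (M,3) centroids. Returns list of (ti, di) pairs."""
    pairs, used = [], set()
    cost = np.linalg.norm(tracks[:, None] - dets[None, :], axis=-1)
    for ti, di in sorted(np.ndindex(cost.shape), key=lambda p: cost[p]):
        matched_dets = {d for _, d in pairs}
        if cost[ti, di] < gate and ti not in used and di not in matched_dets:
            pairs.append((ti, di))
            used.add(ti)
    return pairs

tracks = np.array([[0., 0., 0.], [5., 0., 0.]])
dets = np.array([[0.3, 0.1, 0.], [9., 9., 9.]])   # second det is a new object
print(greedy_match(tracks, dets))                  # [(0, 0)]
```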
arXiv Detail & Related papers (2021-04-29T22:30:29Z)
- Visiting the Invisible: Layer-by-Layer Completed Scene Decomposition [57.088328223220934]
Existing scene understanding systems mainly focus on recognizing the visible parts of a scene, ignoring the intact appearance of physical objects in the real world.
In this work, we propose a higher-level scene understanding system to tackle both visible and invisible parts of objects and backgrounds in a given scene.
arXiv Detail & Related papers (2021-04-12T11:37:23Z)
- Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering [19.69677059281393]
We present an approach to tactile pose estimation from the first touch for known objects.
We create an object-agnostic map from real tactile observations to contact shapes.
For a new object with known geometry, we learn a tailored perception model completely in simulation.
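A toy version of geometric contact rendering, assuming the sensor reports contact wherever the object surface lies within a small penetration depth of the sensor plane; the threshold and geometry below are illustrative, not the paper's renderer.

```python
import numpy as np

# Toy "geometric contact rendering": given the object's depth map in the
# sensor frame, the simulated contact shape is the set of surface points
# within the sensor's penetration depth. Values are illustrative.

def render_contact(depth_map, press_depth=0.002):
    """depth_map: (H, W) distances from the sensor plane, in meters."""
    nearest = depth_map.min()
    return depth_map <= nearest + press_depth    # binary contact mask

# A sphere-like bump touching the sensor: the center is closest, so only
# the central cap registers as contact.
yy, xx = np.mgrid[-1:1:64j, -1:1:64j]
depth = 0.001 + 0.01 * (xx**2 + yy**2)
print(render_contact(depth).sum())   # a small central disk of contact pixels
```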
arXiv Detail & Related papers (2020-12-09T18:00:35Z)