Tactile Estimation of Extrinsic Contact Patch for Stable Placement
- URL: http://arxiv.org/abs/2309.14552v2
- Date: Sat, 23 Mar 2024 14:15:02 GMT
- Title: Tactile Estimation of Extrinsic Contact Patch for Stable Placement
- Authors: Kei Ota, Devesh K. Jha, Krishna Murthy Jatavallabhula, Asako Kanezaki, Joshua B. Tenenbaum,
- Abstract summary: We present the design of feedback skills for robots that must learn to stack complex-shaped objects on top of each other.
We estimate the contact patch between a grasped object and its environment using force and tactile observations.
- Score: 64.06243248525823
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Precise perception of contact interactions is essential for the fine-grained manipulation skills of robots. In this paper, we present the design of feedback skills for robots that must learn to stack complex-shaped objects on top of each other (see Fig. 1). To design such a system, a robot should be able to reason about the stability of placement from very gentle contact interactions. Our results demonstrate that it is possible to infer the stability of object placement from tactile readings during contact formation between the object and its environment. In particular, we estimate the contact patch between a grasped object and its environment using force and tactile observations during contact formation; this contact patch can then be used to predict whether the object will remain stable once the grasp is released. The proposed method is demonstrated on various pairs of objects that are used in a very popular board game.
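As a rough illustration of the idea in the abstract (and not the authors' implementation), the sketch below thresholds a vision-based tactile image into a contact patch and uses the patch's area and spread as a crude proxy for placement stability; the thresholds, pixel scale, and stability rule are all assumptions.
```python
import numpy as np

# Minimal sketch (not the authors' implementation): extract a contact patch
# from a vision-based tactile image and use its size and spread as a crude
# stability proxy. All thresholds and the pixel scale are assumptions.

PRESSURE_THRESHOLD = 0.2   # assumed normalized reading above which a pixel is in contact
PIXEL_AREA_MM2 = 1.0       # assumed physical area covered by one tactile pixel
MIN_PATCH_AREA_MM2 = 40.0  # assumed minimum patch area for a stable rest


def estimate_contact_patch(tactile_img: np.ndarray) -> np.ndarray:
    """Return (N, 2) coordinates of pixels whose reading exceeds the threshold."""
    ys, xs = np.nonzero(tactile_img > PRESSURE_THRESHOLD)
    return np.stack([xs, ys], axis=1).astype(float)


def patch_statistics(patch_px: np.ndarray) -> dict:
    """Area and principal spread of the contact patch (PCA on pixel coordinates)."""
    area = patch_px.shape[0] * PIXEL_AREA_MM2
    centered = patch_px - patch_px.mean(axis=0)
    eigvals = np.clip(np.linalg.eigvalsh(np.cov(centered.T)), 0.0, None)
    major, minor = np.sqrt(eigvals[::-1])  # eigvalsh returns ascending order
    return {"area_mm2": area, "major_px": major, "minor_px": minor}


def is_placement_stable(tactile_img: np.ndarray) -> bool:
    """Heuristic: a large, genuinely two-dimensional patch suggests a stable rest."""
    patch = estimate_contact_patch(tactile_img)
    if patch.shape[0] < 3:
        return False  # point or line contact: likely to tip once released
    stats = patch_statistics(patch)
    return stats["area_mm2"] >= MIN_PATCH_AREA_MM2 and stats["minor_px"] > 1.0


if __name__ == "__main__":
    img = np.zeros((32, 32))
    img[10:20, 12:22] = 0.8  # synthetic flat-on-flat contact
    print(is_placement_stable(img))  # True for this synthetic patch
```
In the paper the patch is estimated from force and tactile observations during contact formation; the thresholding and area rule above merely stand in for that estimate so the example stays self-contained.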
Related papers
- OOD-HOI: Text-Driven 3D Whole-Body Human-Object Interactions Generation Beyond Training Domains [66.62502882481373]
Current methods tend to focus either on the body or the hands, which limits their ability to produce cohesive and realistic interactions.
We propose OOD-HOI, a text-driven framework for generating whole-body human-object interactions that generalize well to new objects and actions.
Our approach integrates a dual-branch reciprocal diffusion model to synthesize initial interaction poses, a contact-guided interaction refiner to improve physical accuracy based on predicted contact areas, and a dynamic adaptation mechanism which includes semantic adjustment and geometry deformation to improve robustness.
arXiv Detail & Related papers (2024-11-27T10:13:35Z)
- Learning Object Properties Using Robot Proprioception via Differentiable Robot-Object Interaction [52.12746368727368]
Differentiable simulation has become a powerful tool for system identification.
Our approach calibrates object properties by using information from the robot, without relying on data from the object itself.
We demonstrate the effectiveness of our method on a low-cost robotic platform.
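As a generic illustration of this kind of differentiable system identification (not this paper's code), the sketch below recovers an object's mass and friction coefficient by backpropagating a trajectory loss through a toy, differentiable pushing model in PyTorch; the dynamics, constants, and parameter names are assumptions.
```python
import torch

# Toy differentiable system identification (not the paper's code): fit an
# object's mass and sliding-friction coefficient so that a differentiable
# pushing model reproduces "measured" velocities. The dynamics are assumptions.

g, dt = 9.81, 0.01

def simulate_push(force, v0, mass, mu, steps=50):
    """Point mass pushed along a surface; tanh gives a smooth, differentiable sign(v)."""
    v, traj = v0, []
    for _ in range(steps):
        friction = mu * mass * g * torch.tanh(10.0 * v)
        v = v + (force - friction) / mass * dt
        traj.append(v)
    return torch.stack(traj)

# Synthetic "measurements" from hidden ground-truth parameters, for two pushes of
# different magnitude (two pushes make mass and friction separately identifiable).
true_mass, true_mu = torch.tensor(0.30), torch.tensor(0.40)
forces, v0 = torch.tensor([1.5, 3.0]), torch.zeros(2)
with torch.no_grad():
    observed = simulate_push(forces, v0, true_mass, true_mu)

# Unknown parameters, optimized by differentiating through the simulator.
log_mass = torch.zeros((), requires_grad=True)   # mass = exp(log_mass) stays positive
mu_hat = torch.tensor(0.1, requires_grad=True)
optimizer = torch.optim.Adam([log_mass, mu_hat], lr=0.05)

for _ in range(300):
    optimizer.zero_grad()
    predicted = simulate_push(forces, v0, log_mass.exp(), mu_hat)
    loss = torch.mean((predicted - observed) ** 2)
    loss.backward()
    optimizer.step()

print(f"estimated mass ~ {log_mass.exp().item():.2f} kg, friction ~ {mu_hat.item():.2f}")
```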
arXiv Detail & Related papers (2024-10-04T20:48:38Z)
- Tactile-Filter: Interactive Tactile Perception for Part Mating [54.46221808805662]
Humans rely on touch and tactile sensing for many dexterous manipulation tasks.
Vision-based tactile sensors are now widely used for various robotic perception and control tasks.
We present a method for interactive perception using vision-based tactile sensors for a part mating task.
arXiv Detail & Related papers (2023-03-10T16:27:37Z)
- Planning Visual-Tactile Precision Grasps via Complementary Use of Vision and Touch [9.31776719215139]
We propose an approach to grasp planning that explicitly reasons about where the fingertips should contact the estimated object surface.
Key to our method's success is the use of visual surface estimation for initial planning to encode the contact constraint.
We show that our method successfully synthesises and executes precision grasps for previously unseen objects using surface estimates from a single camera view.
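As a generic sketch of reasoning about fingertip contact locations on an estimated surface (not the planner proposed in that paper), the snippet below scores a candidate fingertip pair on a point cloud with normals using an antipodal, friction-cone test; the toy points, normals, and friction coefficient are assumptions.
```python
import numpy as np

# Generic fingertip-contact check (not the paper's planner): a candidate pair of
# contact points is acceptable if the line between them lies inside the friction
# cone of both estimated surface normals. Points, normals, and mu are assumptions.

def antipodal_ok(p1, n1, p2, n2, mu=0.5):
    """True if fingertips at p1/p2 (outward normals n1/n2) can squeeze the object."""
    axis = (p2 - p1) / (np.linalg.norm(p2 - p1) + 1e-9)
    cone = np.cos(np.arctan(mu))                 # cosine of the friction-cone half-angle
    return float(np.dot(-n1, axis)) >= cone and float(np.dot(n2, axis)) >= cone

# Toy "surface estimate": three points on a 4 cm box with outward normals.
left,  n_left  = np.array([-0.02, 0.0, 0.0]),  np.array([-1.0, 0.0, 0.0])
right, n_right = np.array([ 0.02, 0.0, 0.0]),  np.array([ 1.0, 0.0, 0.0])
top,   n_top   = np.array([ 0.0,  0.0, 0.02]), np.array([ 0.0, 0.0, 1.0])

print(antipodal_ok(left, n_left, right, n_right))  # True: opposing faces
print(antipodal_ok(left, n_left, top, n_top))      # False: normals not opposed
```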
arXiv Detail & Related papers (2022-12-16T17:32:56Z)
- Robust Contact State Estimation in Humanoid Walking Gaits [3.1866319932300953]
We propose a deep learning framework that provides a unified approach to the problem of leg contact detection in humanoid robot walking gaits.
Our formulation accurately and robustly estimates the contact state probability for each leg.
Our implementation is offered as an open-source ROS/Python package, coined Legged Contact Detection (LCD).
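As a rough sketch of this kind of learned contact estimator (and not the LCD package itself), the snippet below defines a small network that maps per-leg force/torque and IMU features to a contact probability; the feature layout, network size, and placeholder training batch are assumptions.
```python
import torch
import torch.nn as nn

# Rough sketch of a learned leg-contact estimator (not the LCD package itself):
# map per-leg force/torque and IMU features to a contact probability.
# The feature layout, network size, and placeholder data are assumptions.

class ContactEstimator(nn.Module):
    def __init__(self, n_features: int = 9):  # e.g. 3-axis force, torque, base acceleration
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.net(x)).squeeze(-1)  # contact probability per sample


model = ContactEstimator()
criterion = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder batch; in practice features and labels come from annotated walking logs.
features = torch.randn(32, 9)
labels = (torch.rand(32) > 0.5).float()

for _ in range(10):                # toy training loop
    optimizer.zero_grad()
    loss = criterion(model(features), labels)
    loss.backward()
    optimizer.step()

print(model(features[:4]))         # estimated contact probabilities for four samples
```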
arXiv Detail & Related papers (2022-07-30T17:19:47Z)
- COUCH: Towards Controllable Human-Chair Interactions [44.66450508317131]
We study the problem of synthesizing scene interactions conditioned on different contact positions on the object.
We propose COUCH, a novel synthesis framework that plans the motion ahead of time by predicting contact-aware control signals of the hands.
Our method shows significant quantitative and qualitative improvements over existing methods for human-object interactions.
arXiv Detail & Related papers (2022-05-01T19:14:22Z)
- Dynamic Modeling of Hand-Object Interactions via Tactile Sensing [133.52375730875696]
In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diversified set of objects.
We build our model on a cross-modal learning framework and generate the labels using a visual processing pipeline to supervise the tactile model.
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing.
arXiv Detail & Related papers (2021-09-09T16:04:14Z)
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and an elastic property is applied to regulate the deformation of the particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
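As a loose, mass-spring analogue of particle-based tactile simulation (not the EIP formulation itself), the sketch below relaxes a grid of membrane particles under an imposed indentation, with elastic coupling between neighboring particles regulating the deformation; the grid size, stiffness values, boundary handling, and indenter are assumptions.
```python
import numpy as np

# Loose mass-spring analogue of particle-based tactile simulation (not EIP itself):
# membrane particles are elastically coupled to their neighbors and to their rest
# height, and an indenter imposes a displacement. All constants are assumptions.

H, W = 20, 20
k_neighbor, k_rest, iters = 0.25, 0.05, 200

z = np.zeros((H, W))                              # particle heights (rest = 0)
yy, xx = np.mgrid[0:H, 0:W]
indenter = (yy - H / 2) ** 2 + (xx - W / 2) ** 2 < 9
z[indenter] = -1.0                                # pressed inward by the contact

for _ in range(iters):
    # Average of the four neighbors (np.roll gives periodic borders, fine for a sketch).
    neighbor_avg = (np.roll(z, 1, 0) + np.roll(z, -1, 0)
                    + np.roll(z, 1, 1) + np.roll(z, -1, 1)) / 4.0
    z = z + k_neighbor * (neighbor_avg - z) - k_rest * z   # elastic coupling + restoring force
    z[indenter] = -1.0                            # contact constraint re-imposed each step

# The relaxed height field approximates the smooth dimple a vision-based
# tactile sensor would observe around the contact region.
print(round(float(z.min()), 3), round(float(z[0, 0]), 3))
```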
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
- OmniHang: Learning to Hang Arbitrary Objects using Contact Point Correspondences and Neural Collision Estimation [14.989379991558046]
We propose a system that takes partial point clouds of an object and a supporting item as input and learns to decide where and how to hang the object stably.
Our system learns to estimate the contact point correspondences between the object and supporting item to get an estimated stable pose.
Then, the robot needs to find a collision-free path to move the object from its initial pose to the stable hanging pose.
arXiv Detail & Related papers (2021-03-26T06:11:05Z)
- Object Detection and Pose Estimation from RGB and Depth Data for Real-time, Adaptive Robotic Grasping [0.0]
We propose a system that performs real-time object detection and pose estimation, for the purpose of dynamic robot grasping.
The proposed approach allows the robot to detect the object's identity and its actual pose, and then adapt a canonical grasp so that it can be used with the new pose.
For training, the system defines a canonical grasp by capturing the relative pose of an object with respect to the gripper attached to the robot's wrist.
During testing, once a new pose is detected, a canonical grasp for the object is identified and then dynamically adapted by adjusting the robot arm's joint angles.
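The adaptation step described here amounts to composing the stored object-relative grasp with the newly detected object pose; the sketch below shows that composition with 4x4 homogeneous transforms, using made-up poses rather than the authors' data or code.
```python
import numpy as np

# Sketch of the grasp-adaptation step (not the authors' code): store the gripper
# pose relative to the object at teach time, then compose it with the newly
# detected object pose. All poses below are made up for illustration.

def make_pose(yaw_deg: float, translation) -> np.ndarray:
    """4x4 homogeneous transform: rotation about z plus a translation."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    T = np.eye(4)
    T[:3, :3] = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    T[:3, 3] = translation
    return T

# Teach time: object and gripper poses in the world frame (assumed values).
T_world_obj_canon = make_pose(0.0, [0.50, 0.00, 0.05])
T_world_grip_canon = make_pose(0.0, [0.50, 0.00, 0.20])

# Canonical grasp = gripper pose expressed in the object frame.
T_obj_grip = np.linalg.inv(T_world_obj_canon) @ T_world_grip_canon

# Test time: the detector reports the object at a new pose; adapt the grasp.
T_world_obj_new = make_pose(30.0, [0.62, -0.10, 0.05])
T_world_grip_new = T_world_obj_new @ T_obj_grip

print(T_world_grip_new[:3, 3])   # target gripper position for the new object pose
# Joint angles for this target would come from the arm's inverse kinematics (not shown).
```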
arXiv Detail & Related papers (2021-01-18T22:22:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.