Neural Contact Fields: Tracking Extrinsic Contact with Tactile Sensing
- URL: http://arxiv.org/abs/2210.09297v1
- Date: Mon, 17 Oct 2022 17:52:43 GMT
- Title: Neural Contact Fields: Tracking Extrinsic Contact with Tactile Sensing
- Authors: Carolina Higuera, Siyuan Dong, Byron Boots, and Mustafa Mukadam
- Abstract summary: We present Neural Contact Fields, a method that brings together neural fields and tactile sensing to address the problem of tracking extrinsic contact between an object and its environment.
Knowing where the external contact occurs is a first step towards methods that can actively control it to facilitate downstream manipulation tasks.
- Score: 36.609644278386135
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present Neural Contact Fields, a method that brings together neural fields
and tactile sensing to address the problem of tracking extrinsic contact
between an object and its environment. Knowing where the external contact
occurs is a first step towards methods that can actively control it to
facilitate downstream manipulation tasks. Prior work on localizing
environmental contacts typically assumes a contact type (e.g., point or line),
does not capture contact/no-contact transitions, and only works with objects
of basic geometric shape. Neural Contact Fields is the first method that can track arbitrary
multi-modal extrinsic contacts without making any assumptions about the contact
type. Our key insight is to estimate the probability of contact for any 3D
point in the latent space of object shapes, given vision-based tactile inputs
that sense the local motion resulting from the external contact. In
experiments, we find that Neural Contact Fields are able to localize multiple
contact patches without making any assumptions about the geometry of the
contact, and capture contact/no-contact transitions for known categories of
objects with unseen shapes in unseen environment configurations. In addition to
Neural Contact Fields, we also release our YCB-Extrinsic-Contact dataset of
simulated extrinsic contact interactions to enable further research in this
area. Project repository: https://github.com/carolinahiguera/NCF
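The key insight above lends itself to a compact formulation: a network that maps any 3D query point, conditioned on a latent object-shape code and an embedding of the vision-based tactile input, to a probability of extrinsic contact. Below is a minimal, hypothetical PyTorch sketch of such a field; the module layout, dimensions, and simple MLP decoder are illustrative assumptions, not the released architecture (see the project repository for the actual implementation).

```python
import torch
import torch.nn as nn

class NeuralContactField(nn.Module):
    """Hypothetical sketch: map a 3D query point, a latent shape code,
    and tactile features to a probability of extrinsic contact."""

    def __init__(self, shape_dim=64, tactile_dim=32, hidden=128):
        super().__init__()
        self.decoder = nn.Sequential(
            nn.Linear(3 + shape_dim + tactile_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # logit of contact probability
        )

    def forward(self, points, shape_code, tactile_code):
        # points: (B, N, 3) query points on/near the object surface
        # shape_code: (B, shape_dim) latent code for the object's shape
        # tactile_code: (B, tactile_dim) embedding of vision-based tactile input
        B, N, _ = points.shape
        cond = torch.cat([shape_code, tactile_code], dim=-1)      # (B, D)
        cond = cond.unsqueeze(1).expand(B, N, cond.shape[-1])     # (B, N, D)
        logits = self.decoder(torch.cat([points, cond], dim=-1))  # (B, N, 1)
        return torch.sigmoid(logits).squeeze(-1)                  # (B, N)

# Usage: query contact probability at 1024 points for a batch of 2 objects.
ncf = NeuralContactField()
p = ncf(torch.randn(2, 1024, 3), torch.randn(2, 64), torch.randn(2, 32))
print(p.shape)  # torch.Size([2, 1024])
```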
Related papers
- Contact-aware Human Motion Generation from Textual Descriptions [57.871692507044344]
This paper addresses the problem of generating 3D interactive human motion from text.
We create a novel dataset named RICH-CAT, representing "Contact-Aware Texts".
We propose a novel approach named CATMO for text-driven interactive human motion synthesis.
arXiv Detail & Related papers (2024-03-23T04:08:39Z)
- ContactGen: Generative Contact Modeling for Grasp Generation [37.56729700157981]
This paper presents a novel object-centric contact representation ContactGen for hand-object interaction.
We propose a conditional generative model to predict ContactGen and adopt model-based optimization to predict diverse and geometrically feasible grasps.
arXiv Detail & Related papers (2023-10-05T17:59:45Z)
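The entry above couples a generative contact model with model-based optimization. As a rough illustration of that second step, the hypothetical sketch below fits toy grasp parameters so that hand points are drawn toward object points with high predicted contact probability; the linear "hand model" and loss are stand-ins for a real differentiable hand mesh, not ContactGen's actual pipeline.

```python
import torch

# Hypothetical sketch of a "model-based optimization" step: fit grasp
# parameters so that hand points land on object points predicted to be
# in contact. Everything here is illustrative.
torch.manual_seed(0)
obj_pts = torch.randn(512, 3)            # object surface samples
contact_prob = torch.rand(512)           # per-point contact map from a generative model

pose = torch.zeros(9, requires_grad=True)           # toy grasp parameters
basis = torch.randn(9, 16 * 3)                      # toy linear "hand model"
opt = torch.optim.Adam([pose], lr=1e-2)

for _ in range(200):
    hand_pts = (pose @ basis).view(16, 3)           # 16 toy "fingertip" points
    d = torch.cdist(hand_pts, obj_pts)              # (16, 512) pairwise distances
    dmin, idx = d.min(dim=1)                        # nearest object point per hand point
    # pull hand points toward object points, weighted by predicted contact
    loss = (contact_prob[idx] * dmin).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final contact-fitting loss: {loss.item():.4f}")
```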
- DECO: Dense Estimation of 3D Human-Scene Contact In The Wild [54.44345845842109]
We train a novel 3D contact detector that uses both body-part-driven and scene-context-driven attention to estimate contact on the SMPL body.
We significantly outperform existing SOTA methods across all benchmarks.
We also show qualitatively that DECO generalizes well to diverse and challenging real-world human interactions in natural images.
arXiv Detail & Related papers (2023-09-26T21:21:07Z)
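DECO's summary mentions two attention streams, body-part-driven and scene-context-driven, fused to predict per-vertex contact on the SMPL body (6890 vertices). The sketch below is a hypothetical rendering of that dual-attention idea using standard PyTorch attention modules; the dimensions and fusion scheme are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class DualAttentionContactHead(nn.Module):
    """Hypothetical sketch: per-vertex queries attend to body-part features
    and scene-context features; the fused result is decoded into per-vertex
    contact probabilities."""

    def __init__(self, n_verts=6890, dim=128):
        super().__init__()
        self.vert_query = nn.Parameter(torch.randn(n_verts, dim) * 0.02)
        self.part_attn = nn.MultiheadAttention(dim, 4, batch_first=True)
        self.scene_attn = nn.MultiheadAttention(dim, 4, batch_first=True)
        self.head = nn.Linear(2 * dim, 1)

    def forward(self, part_feats, scene_feats):
        # part_feats, scene_feats: (B, T, dim) feature tokens from two branches
        B = part_feats.shape[0]
        q = self.vert_query.unsqueeze(0).expand(B, -1, -1)
        part_ctx, _ = self.part_attn(q, part_feats, part_feats)
        scene_ctx, _ = self.scene_attn(q, scene_feats, scene_feats)
        logits = self.head(torch.cat([part_ctx, scene_ctx], dim=-1))
        return torch.sigmoid(logits).squeeze(-1)  # (B, n_verts)

head = DualAttentionContactHead()
p = head(torch.randn(2, 49, 128), torch.randn(2, 49, 128))
print(p.shape)  # torch.Size([2, 6890])
```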
- Nonrigid Object Contact Estimation With Regional Unwrapping Transformer [16.988812837693203]
Acquiring contact patterns between hands and nonrigid objects is a common concern in the vision and robotics communities.
Existing learning-based methods focus more on contact with rigid objects from monocular images.
We propose a novel hand-object contact representation called RUPs, which unwraps the roughly estimated hand-object surfaces as multiple high-resolution 2D regional profiles.
arXiv Detail & Related papers (2023-08-27T11:37:26Z)
- Attention for Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control [12.302685367517718]
High-resolution tactile sensing can provide accurate information about local contact in contact-rich robotic tasks.
We study a new concept: *tactile saliency* for robot touch, inspired by the human touch attention mechanism from neuroscience.
arXiv Detail & Related papers (2023-07-26T21:19:45Z)
- Learning Explicit Contact for Implicit Reconstruction of Hand-held Objects from Monocular Images [59.49985837246644]
We show how to model contacts in an explicit way to benefit the implicit reconstruction of hand-held objects.
In the first part, we propose a new subtask of directly estimating 3D hand-object contacts from a single image.
In the second part, we introduce a novel method to diffuse estimated contact states from the hand mesh surface to nearby 3D space.
arXiv Detail & Related papers (2023-05-31T17:59:26Z)
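The "Learning Explicit Contact" entry above describes diffusing contact states from the hand mesh surface into nearby 3D space. One simple way to realize such a propagation, shown below as a hypothetical sketch, is distance-weighted interpolation from the k nearest mesh vertices (778 vertices, as in a MANO hand); the Gaussian kernel and choice of k are illustrative, not the paper's actual diffusion scheme.

```python
import torch

def diffuse_contact(query_pts, hand_verts, vert_contact, k=4, sigma=0.05):
    """Hypothetical sketch: propagate per-vertex contact estimates from a
    hand mesh into nearby 3D space by distance-weighted interpolation over
    the k nearest vertices.
    query_pts: (N, 3), hand_verts: (V, 3), vert_contact: (V,) in [0, 1]."""
    d = torch.cdist(query_pts, hand_verts)          # (N, V) point-vertex distances
    dk, idx = d.topk(k, dim=1, largest=False)       # k nearest vertices per query
    w = torch.exp(-dk ** 2 / (2 * sigma ** 2))      # Gaussian falloff with distance
    w = w / w.sum(dim=1, keepdim=True).clamp_min(1e-8)
    return (w * vert_contact[idx]).sum(dim=1)       # (N,) contact field in space

pts = torch.rand(1000, 3)
verts = torch.rand(778, 3)                          # 778 = MANO hand vertex count
contact = torch.rand(778)
field = diffuse_contact(pts, verts, contact)
print(field.shape)  # torch.Size([1000])
```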
- Integrated Object Deformation and Contact Patch Estimation from Visuo-Tactile Feedback [8.420670642409219]
We propose a representation that jointly models object deformations and contact patches from visuo-tactile feedback.
We propose a neural network architecture to learn an NDCF and train it using simulated data.
We demonstrate that the learned NDCF transfers directly to the real-world without the need for fine-tuning.
arXiv Detail & Related papers (2023-05-23T18:53:24Z)
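The NDCF entry above jointly models deformation and contact from visuo-tactile feedback. A minimal hypothetical reading is a shared neural field with two heads, one predicting the signed distance of the deformed surface and one predicting a contact probability, so that deformation and contact patch share a representation. The sketch below is illustrative; layer sizes and the two-head layout are assumptions, not the paper's network.

```python
import torch
import torch.nn as nn

class DeformationContactField(nn.Module):
    """Hypothetical sketch: one trunk maps a 3D query point plus a
    visuo-tactile embedding to (a) the signed distance of the deformed
    object surface and (b) a contact probability."""

    def __init__(self, feat_dim=64, hidden=128):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(3 + feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.sdf_head = nn.Linear(hidden, 1)      # signed distance to deformed surface
        self.contact_head = nn.Linear(hidden, 1)  # contact logit

    def forward(self, points, feat):
        # points: (B, N, 3); feat: (B, feat_dim) visuo-tactile embedding
        f = feat.unsqueeze(1).expand(-1, points.shape[1], -1)
        h = self.trunk(torch.cat([points, f], dim=-1))
        return (self.sdf_head(h).squeeze(-1),
                torch.sigmoid(self.contact_head(h)).squeeze(-1))

ndcf = DeformationContactField()
sdf, contact = ndcf(torch.randn(2, 256, 3), torch.randn(2, 64))
print(sdf.shape, contact.shape)  # torch.Size([2, 256]) torch.Size([2, 256])
```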
- Contact-Aware Retargeting of Skinned Motion [49.71236739408685]
This paper introduces a motion estimation method that preserves self-contacts and prevents interpenetration.
The method identifies self-contacts and ground contacts in the input motion and optimizes the motion to apply to the output skeleton.
In experiments, our results quantitatively outperform previous methods, and in a user study our retargeted motions are rated as higher quality than those produced by recent works.
arXiv Detail & Related papers (2021-09-15T17:05:02Z)
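The retargeting entry above optimizes a motion while preserving identified contacts. The hypothetical sketch below casts this as gradient descent on a toy energy with three terms: stay near the naive retarget, keep self-contact joint pairs together, and pin ground-contact joints to the floor. The point-based skeleton, the detected contact sets, and the weights are all illustrative assumptions, not the paper's energy.

```python
import torch

# Hypothetical sketch of contact-aware retargeting as optimization.
torch.manual_seed(0)
T, J = 30, 24                                    # frames, joints
naive = torch.randn(T, J, 3)                     # naive retargeted motion
motion = naive.clone().requires_grad_(True)

self_pairs = [(5, 19), (6, 20)]                  # joint pairs detected in self-contact
ground_joints = [10, 11]                         # feet joints
ground_frames = range(10, 20)                    # frames where feet touch the floor

opt = torch.optim.Adam([motion], lr=1e-2)
for _ in range(300):
    e_near = ((motion - naive) ** 2).mean()      # stay near the naive retarget
    e_self = sum(((motion[:, a] - motion[:, b]) ** 2).sum(-1).mean()
                 for a, b in self_pairs)         # preserve self-contacts
    e_ground = sum(motion[t, j, 1] ** 2          # y = 0 at ground contacts
                   for t in ground_frames for j in ground_joints)
    loss = e_near + 0.5 * e_self + 0.5 * e_ground
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final energy: {loss.item():.4f}")
```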
- Elastic Tactile Simulation Towards Tactile-Visual Perception [58.44106915440858]
We propose Elastic Interaction of Particles (EIP) for tactile simulation.
EIP models the tactile sensor as a group of coordinated particles, and the elastic property is applied to regulate the deformation of particles during contact.
We further propose a tactile-visual perception network that enables information fusion between tactile data and visual images.
arXiv Detail & Related papers (2021-08-11T03:49:59Z)
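EIP, summarized above, treats the tactile sensor as coordinated particles whose deformation is regulated by elasticity. The sketch below is a toy particle-spring version of that idea: a grid of particles is indented by a sphere, springs pull each particle toward its rest position, and the per-particle displacement forms a tactile "image". The constants and explicit-Euler integration are illustrative assumptions, not EIP's formulation.

```python
import torch

# Toy particle-based elastic tactile membrane pressed by a spherical indenter.
n = 16
xs, ys = torch.meshgrid(torch.linspace(0, 1, n), torch.linspace(0, 1, n), indexing="ij")
rest = torch.stack([xs, ys, torch.zeros_like(xs)], dim=-1).reshape(-1, 3)
pos, vel = rest.clone(), torch.zeros_like(rest)

k_spring, damping, dt = 50.0, 0.9, 1e-2
sphere_c, sphere_r = torch.tensor([0.5, 0.5, -0.05]), 0.2   # spherical indenter

for _ in range(100):
    force = k_spring * (rest - pos)              # elastic pull toward rest shape
    vel = damping * (vel + dt * force)
    pos = pos + dt * vel
    # project particles out of the indenter (hard contact constraint)
    d = pos - sphere_c
    dist = d.norm(dim=-1, keepdim=True)
    inside = (dist < sphere_r).squeeze(-1)
    pos[inside] = sphere_c + d[inside] / dist[inside] * sphere_r

displacement = (pos[:, 2] - rest[:, 2]).reshape(n, n)  # per-taxel deformation "image"
print(displacement.max())  # peak deformation under the indenter
```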
- Tactile Object Pose Estimation from the First Touch with Geometric Contact Rendering [19.69677059281393]
We present an approach to tactile pose estimation from the first touch for known objects.
We create an object-agnostic map from real tactile observations to contact shapes.
For a new object with known geometry, we learn a tailored perception model completely in simulation.
arXiv Detail & Related papers (2020-12-09T18:00:35Z)
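The last entry estimates object pose from a first touch by relating tactile observations to contact shapes. The hypothetical sketch below illustrates one matching scheme consistent with that description: precompute simulated contact shapes for a set of candidate poses, then rank candidates by similarity to the observed contact shape. The random "renders" are placeholders for real geometric contact rendering, and the matching rule is an illustrative choice.

```python
import torch

# Hypothetical first-touch pose estimation by template matching.
torch.manual_seed(0)
n_poses, H, W = 500, 32, 32
cand_poses = torch.rand(n_poses, 6)              # candidate (x, y, z, roll, pitch, yaw)
templates = torch.rand(n_poses, H, W)            # simulated contact shape per pose

def estimate_pose(observed, k=5):
    """Return the k candidate poses whose simulated contact shape best
    matches the observed one (lowest mean squared difference)."""
    err = ((templates - observed.unsqueeze(0)) ** 2).mean(dim=(1, 2))  # (n_poses,)
    scores, idx = err.topk(k, largest=False)
    return cand_poses[idx], scores

obs = templates[123] + 0.05 * torch.randn(H, W)  # noisy observation of pose 123
poses, errs = estimate_pose(obs)
print(poses[0], errs[0])                          # best match should be near pose 123
```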