EgoPressure: A Dataset for Hand Pressure and Pose Estimation in Egocentric Vision
- URL: http://arxiv.org/abs/2409.02224v2
- Date: Wed, 04 Dec 2024 10:24:43 GMT
- Title: EgoPressure: A Dataset for Hand Pressure and Pose Estimation in Egocentric Vision
- Authors: Yiming Zhao, Taein Kwon, Paul Streli, Marc Pollefeys, Christian Holz
- Abstract summary: EgoPressure is a novel egocentric dataset that captures detailed touch contact and pressure interactions.
Our dataset comprises 5 hours of recorded interactions from 21 participants captured simultaneously by one head-mounted and seven stationary Kinect cameras.
- Score: 69.1005706608681
- Abstract: Touch contact and pressure are essential for understanding how humans interact with and manipulate objects, insights which can significantly benefit applications in mixed reality and robotics. However, estimating these interactions from an egocentric camera perspective is challenging, largely due to the lack of comprehensive datasets that provide both accurate hand poses on contacting surfaces and detailed annotations of pressure information. In this paper, we introduce EgoPressure, a novel egocentric dataset that captures detailed touch contact and pressure interactions. EgoPressure provides high-resolution pressure intensity annotations for each contact point and includes accurate hand pose meshes obtained through our proposed multi-view, sequence-based optimization method processing data from an 8-camera capture rig. Our dataset comprises 5 hours of recorded interactions from 21 participants captured simultaneously by one head-mounted and seven stationary Kinect cameras, which acquire RGB images and depth maps at 30 Hz. To support future research and benchmarking, we present several baseline models for estimating applied pressure on external surfaces from RGB images, with and without hand pose information. We further explore the joint estimation of the hand mesh and applied pressure. Our experiments demonstrate that pressure and hand pose are complementary, supporting a deeper understanding of hand-object interactions in AR/VR and robotics research. Project page: https://yiming-zhao.github.io/EgoPressure/
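As a rough illustration of the RGB-only baseline task the abstract describes, here is a minimal sketch in PyTorch of a fully convolutional model that regresses a dense pressure map from a single egocentric frame. The `PressureNet` name and all layer sizes are assumptions for illustration, not taken from the paper; the authors' actual baselines and the variant that also consumes hand pose will differ in detail.

```python
# Hypothetical sketch (not the authors' code): a fully convolutional
# encoder-decoder that regresses a per-pixel pressure map from one RGB frame.
import torch
import torch.nn as nn

class PressureNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
            nn.ReLU(),  # pressure intensities are non-negative
        )

    def forward(self, rgb):  # rgb: (B, 3, H, W)
        return self.decoder(self.encoder(rgb))  # (B, 1, H, W) pressure map

frame = torch.rand(1, 3, 256, 256)   # one egocentric RGB frame
pressure = PressureNet()(frame)      # dense pressure intensity estimate
```

Such a model would be supervised per pixel against the dataset's pressure annotations; the paper's joint hand-mesh-and-pressure models add pose estimation on top of this basic setup.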
Related papers
- Posture-Informed Muscular Force Learning for Robust Hand Pressure Estimation [6.912016522494431]
We present PiMForce, a novel framework that enhances hand pressure estimation.
Our approach utilizes detailed spatial information from 3D hand poses in conjunction with dynamic muscle activity from sEMG.
Our framework enables precise hand pressure estimation in complex and natural interaction scenarios.
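A minimal sketch of the pose-plus-sEMG idea, assuming a simple late-fusion design; the `PoseEMGFusion` name, feature sizes, and region-wise output are illustrative, not PiMForce's actual architecture:

```python
# Hypothetical sketch (sizes assumed): concatenate 3D hand-pose features
# with an sEMG embedding and regress per-region hand pressure.
import torch
import torch.nn as nn

class PoseEMGFusion(nn.Module):
    def __init__(self, n_joints=21, emg_channels=8, n_regions=5):
        super().__init__()
        self.pose_mlp = nn.Sequential(nn.Linear(n_joints * 3, 64), nn.ReLU())
        self.emg_mlp = nn.Sequential(nn.Linear(emg_channels, 64), nn.ReLU())
        self.head = nn.Linear(128, n_regions)  # pressure per hand region

    def forward(self, pose, emg):
        # pose: (B, n_joints, 3) joint positions; emg: (B, emg_channels)
        z = torch.cat([self.pose_mlp(pose.flatten(1)), self.emg_mlp(emg)], dim=1)
        return self.head(z).relu()  # non-negative pressure estimates
```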
arXiv Detail & Related papers (2024-10-31T04:42:43Z)
- AugInsert: Learning Robust Visual-Force Policies via Data Augmentation for Object Assembly Tasks [7.631503105866245]
This paper primarily focuses on learning robust visual-force policies in the context of high-precision object assembly tasks.
We aim to learn contact-rich manipulation policies with multisensory inputs on limited expert data by expanding human demonstrations via online data augmentation.
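A toy sketch of the online-augmentation idea, under the assumption that each expert transition carries an RGB image and a force/torque reading; the perturbation types and magnitudes here are illustrative only:

```python
# Hypothetical sketch: perturb the visual and force channels of an expert
# transition each time it is sampled, so limited demonstrations cover a
# wider state distribution.
import numpy as np

def augment_transition(rgb, force, rng):
    rgb_aug = np.clip(rgb * rng.uniform(0.8, 1.2), 0.0, 1.0)       # brightness jitter
    force_aug = force + rng.normal(0.0, 0.05, size=force.shape)    # sensor noise
    return rgb_aug, force_aug

rng = np.random.default_rng(0)
rgb = np.random.rand(64, 64, 3).astype(np.float32)   # camera observation
force = np.zeros(6, dtype=np.float32)                # 6-axis force/torque reading
rgb_aug, force_aug = augment_transition(rgb, force, rng)
```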
arXiv Detail & Related papers (2024-10-19T04:19:52Z)
- Benchmarks and Challenges in Pose Estimation for Egocentric Hand Interactions with Objects [89.95728475983263]
Holistic 3D understanding of such interactions from egocentric views is important for tasks in robotics, AR/VR, action recognition, and motion generation.
We design the HANDS23 challenge based on the AssemblyHands and ARCTIC datasets with carefully designed training and testing splits.
Based on the results of the top submitted methods and more recent baselines on the leaderboards, we perform a thorough analysis of 3D hand(-object) reconstruction tasks.
arXiv Detail & Related papers (2024-03-25T05:12:21Z)
- PressureVision++: Estimating Fingertip Pressure from Diverse RGB Images [23.877554759345607]
Deep models can estimate hand pressure based on a single RGB image.
We present a novel approach that enables diverse data to be captured with only an RGB camera and a cooperative participant.
We also demonstrate an application of PressureVision++ to mixed reality where pressure estimation allows everyday surfaces to be used as arbitrary touch-sensitive interfaces.
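A minimal sketch of how such a mixed-reality interface could consume a predicted pressure map; the threshold value and event format are assumptions, not from the paper:

```python
# Hypothetical sketch: turn a predicted pressure map into discrete touch
# events so an ordinary surface behaves like a touch-sensitive interface.
import numpy as np

def detect_touches(pressure_map, threshold=0.3):
    """Return pixel coordinates where estimated pressure exceeds the threshold."""
    ys, xs = np.nonzero(pressure_map > threshold)
    return list(zip(xs.tolist(), ys.tolist()))

pressure_map = np.zeros((240, 320), dtype=np.float32)
pressure_map[100:104, 150:154] = 0.8      # simulated fingertip press
touches = detect_touches(pressure_map)    # pixels registered as touch events
```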
arXiv Detail & Related papers (2023-01-05T21:48:33Z)
- PressureVision: Estimating Hand Pressure from a Single RGB Image [27.449311565446443]
We explore the possibility of using a conventional RGB camera to infer hand pressure.
We collected videos of 36 participants with diverse skin tones applying pressure to an instrumented planar surface.
We trained a deep model (PressureVisionNet) to infer a pressure image from a single RGB image.
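A hypothetical training step for this setup, assuming per-pixel MSE against the pressure-sensor ground truth; the paper's actual loss and architecture may differ, and a single convolution stands in for PressureVisionNet here:

```python
# Hypothetical sketch: supervise a pressure-from-RGB model with a
# per-pixel regression loss against aligned sensor ground truth.
import torch
import torch.nn as nn

model = nn.Conv2d(3, 1, 3, padding=1)        # stand-in for PressureVisionNet
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

rgb = torch.rand(4, 3, 256, 256)             # batch of RGB frames
gt_pressure = torch.rand(4, 1, 256, 256)     # aligned pressure-sensor maps

pred = model(rgb)                            # predicted pressure images
loss = nn.functional.mse_loss(pred, gt_pressure)
loss.backward()
optimizer.step()
optimizer.zero_grad()
```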
arXiv Detail & Related papers (2022-03-19T19:54:56Z)
- RGB2Hands: Real-Time Tracking of 3D Hand Interactions from Monocular RGB Video [76.86512780916827]
We present the first real-time method for motion capture of skeletal pose and 3D surface geometry of hands from a single RGB camera.
In order to address the inherent depth ambiguities in RGB data, we propose a novel multi-task CNN.
We experimentally verify the individual components of our RGB two-hand tracking and 3D reconstruction pipeline.
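A minimal sketch of the multi-task idea, assuming a shared encoder with separate 2D-keypoint and per-joint depth heads; the names and sizes are illustrative, not the RGB2Hands architecture:

```python
# Hypothetical sketch: a shared encoder with separate heads whose
# complementary outputs help disambiguate depth from monocular RGB.
import torch
import torch.nn as nn

class MultiTaskHandNet(nn.Module):
    def __init__(self, n_joints=21):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.keypoints_2d = nn.Linear(32, n_joints * 2)   # image-plane joints
        self.depth_per_joint = nn.Linear(32, n_joints)    # relative joint depths

    def forward(self, rgb):
        z = self.backbone(rgb)
        return self.keypoints_2d(z), self.depth_per_joint(z)

kp2d, depth = MultiTaskHandNet()(torch.rand(1, 3, 128, 128))
```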
arXiv Detail & Related papers (2021-06-22T12:53:56Z)
- TRiPOD: Human Trajectory and Pose Dynamics Forecasting in the Wild [77.59069361196404]
TRiPOD is a novel method for predicting body dynamics based on graph attentional networks.
To incorporate a real-world challenge, we learn an indicator representing whether an estimated body joint is visible/invisible at each frame.
Our evaluation shows that TRiPOD outperforms all prior work, including state-of-the-art methods designed specifically for either trajectory or pose forecasting.
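A simplified sketch of attention over body joints with a visibility indicator; TRiPOD's actual graph-attentional model is considerably richer, and the masking scheme here is an assumption:

```python
# Hypothetical sketch: invisible joints are masked out of the attention
# so they do not contribute to the aggregated features.
import torch
import torch.nn.functional as F

def masked_joint_attention(feats, visible):
    # feats: (n_joints, d) per-joint features; visible: (n_joints,) bool
    scores = feats @ feats.T / feats.shape[-1] ** 0.5     # pairwise affinities
    scores = scores.masked_fill(~visible[None, :], float("-inf"))
    weights = F.softmax(scores, dim=-1)                   # attend to visible joints
    return weights @ feats                                # aggregated features

feats = torch.randn(14, 16)
visible = torch.tensor([True] * 12 + [False] * 2)         # two occluded joints
out = masked_joint_attention(feats, visible)
```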
arXiv Detail & Related papers (2021-04-08T20:01:00Z)
- Physics-Based Dexterous Manipulations with Estimated Hand Poses and Residual Reinforcement Learning [52.37106940303246]
We learn a model that maps noisy input hand poses to target virtual poses.
The agent is trained in a residual setting by using a model-free hybrid RL+IL approach.
We test our framework in two applications that use hand pose estimates for dexterous manipulations: hand-object interactions in VR and hand-object motion reconstruction in-the-wild.
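A minimal sketch of the residual idea, assuming the policy outputs an additive correction to the noisy estimated pose; the `ResidualPolicy` name and dimensions are illustrative:

```python
# Hypothetical sketch: the agent's output is a learned residual added to
# the noisy estimated hand pose, so the policy only needs to correct the
# estimator's errors rather than produce a pose from scratch.
import torch
import torch.nn as nn

class ResidualPolicy(nn.Module):
    def __init__(self, pose_dim=63):  # e.g. 21 joints x 3 coordinates
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(pose_dim, 128), nn.ReLU(), nn.Linear(128, pose_dim),
        )

    def forward(self, noisy_pose):
        return noisy_pose + self.net(noisy_pose)  # target = input + residual

noisy = torch.randn(1, 63)         # noisy estimated hand pose
target = ResidualPolicy()(noisy)   # corrected target virtual pose
```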
arXiv Detail & Related papers (2020-08-07T17:34:28Z)
- Measuring Generalisation to Unseen Viewpoints, Articulations, Shapes and Objects for 3D Hand Pose Estimation under Hand-Object Interaction [137.28465645405655]
HANDS'19 is a challenge to evaluate the abilities of current 3D hand pose estimators (HPEs) to interpolate and extrapolate the poses of a training set.
We show that the accuracy of state-of-the-art methods can drop, and that they fail mostly on poses absent from the training set.
arXiv Detail & Related papers (2020-03-30T19:28:13Z)