PressureVision: Estimating Hand Pressure from a Single RGB Image
- URL: http://arxiv.org/abs/2203.10385v1
- Date: Sat, 19 Mar 2022 19:54:56 GMT
- Title: PressureVision: Estimating Hand Pressure from a Single RGB Image
- Authors: Patrick Grady, Chengcheng Tang, Samarth Brahmbhatt, Christopher D.
Twigg, Chengde Wan, James Hays, Charles C. Kemp
- Abstract summary: We explore the possibility of using a conventional RGB camera to infer hand pressure.
We collected videos of 36 participants with diverse skin tone applying pressure to an instrumented planar surface.
We trained a deep model (PressureVisionNet) to infer a pressure image from a single RGB image.
- Score: 27.449311565446443
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: People often interact with their surroundings by applying pressure with their
hands. Machine perception of hand pressure has been limited by the challenges
of placing sensors between the hand and the contact surface. We explore the
possibility of using a conventional RGB camera to infer hand pressure. The
central insight is that the application of pressure by a hand results in
informative appearance changes. Hands share biomechanical properties that
result in similar observable phenomena, such as soft-tissue deformation, blood
distribution, hand pose, and cast shadows. We collected videos of 36
participants with diverse skin tone applying pressure to an instrumented planar
surface. We then trained a deep model (PressureVisionNet) to infer a pressure
image from a single RGB image. Our model infers pressure for participants
outside of the training data and outperforms baselines. We also show that the
output of our model depends on the appearance of the hand and cast shadows near
contact regions. Overall, our results suggest the appearance of a previously
unobserved human hand can be used to accurately infer applied pressure.
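As a rough illustration of the image-to-image formulation described in the abstract (an RGB frame in, a pressure image out), the sketch below uses a small encoder-decoder in PyTorch. This is not the authors' PressureVisionNet; the architecture, layer sizes, and the TinyPressureNet name are illustrative assumptions only.

```python
# Minimal sketch of an RGB-to-pressure-image model (assumed architecture,
# not the paper's PressureVisionNet).
import torch
import torch.nn as nn

class TinyPressureNet(nn.Module):
    """Maps an RGB image (B, 3, H, W) to a single-channel pressure image (B, 1, H, W)."""
    def __init__(self):
        super().__init__()
        # Downsampling encoder: 2 strided convolutions (H/4, W/4 at the bottleneck).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Upsampling decoder back to the input resolution, one output channel.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    model = TinyPressureNet()
    rgb = torch.rand(1, 3, 256, 256)   # a single RGB frame
    pressure = model(rgb)              # predicted pressure image
    print(pressure.shape)              # torch.Size([1, 1, 256, 256])
```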
Related papers
- EgoPressure: A Dataset for Hand Pressure and Pose Estimation in Egocentric Vision [69.1005706608681]
We introduce EgoPressure, a novel dataset of touch contact and pressure interaction from an egocentric perspective.
EgoPressure comprises 5.0 hours of touch contact and pressure interaction from 21 participants captured by a moving egocentric camera and 7 stationary Kinect cameras.
arXiv Detail & Related papers (2024-09-03T18:53:32Z)
- HandNeRF: Neural Radiance Fields for Animatable Interacting Hands [122.32855646927013]
We propose a novel framework to reconstruct accurate appearance and geometry with neural radiance fields (NeRF) for interacting hands.
We conduct extensive experiments to verify the merits of our proposed HandNeRF and report a series of state-of-the-art results.
arXiv Detail & Related papers (2023-03-24T06:19:19Z)
- Deformer: Dynamic Fusion Transformer for Robust Hand Pose Estimation [59.3035531612715]
Existing methods often struggle to generate plausible hand poses when the hand is heavily occluded or blurred.
In videos, the movements of the hand allow us to observe various parts of the hand that may be occluded or blurred in a single frame.
We propose the Deformer: a framework that implicitly reasons about the relationship between hand parts within the same image.
arXiv Detail & Related papers (2023-03-09T02:24:30Z)
- Robustness Evaluation in Hand Pose Estimation Models using Metamorphic Testing [2.535271349350579]
Hand pose estimation (HPE) is a task that predicts and describes the hand poses from images or video frames.
In this work, we adopt metamorphic testing to evaluate the robustness of HPE models.
arXiv Detail & Related papers (2023-03-08T13:23:53Z)
- PressureVision++: Estimating Fingertip Pressure from Diverse RGB Images [23.877554759345607]
Deep models can estimate hand pressure based on a single RGB image.
We present a novel approach that enables diverse data to be captured with only an RGB camera and a cooperative participant.
We also demonstrate an application of PressureVision++ to mixed reality where pressure estimation allows everyday surfaces to be used as arbitrary touch-sensitive interfaces.
arXiv Detail & Related papers (2023-01-05T21:48:33Z)
- Pressure Eye: In-bed Contact Pressure Estimation via Contact-less Imaging [18.35652911833834]
We present our pressure eye (PEye) approach to estimate contact pressure between a human body and the surface she is lying on.
PEye could ultimately enable the prediction and early detection of pressure ulcers in bed-bound patients.
arXiv Detail & Related papers (2022-01-27T22:22:17Z)
- Learning to Disambiguate Strongly Interacting Hands via Probabilistic Per-pixel Part Segmentation [84.28064034301445]
Self-similarity, and the resulting ambiguities in assigning pixel observations to the respective hands, is a major cause of the final 3D pose error.
We propose DIGIT, a novel method for estimating the 3D poses of two interacting hands from a single monocular image.
We experimentally show that the proposed approach achieves new state-of-the-art performance on the InterHand2.6M dataset.
arXiv Detail & Related papers (2021-07-01T13:28:02Z)
- RGB2Hands: Real-Time Tracking of 3D Hand Interactions from Monocular RGB Video [76.86512780916827]
We present the first real-time method for motion capture of skeletal pose and 3D surface geometry of hands from a single RGB camera.
In order to address the inherent depth ambiguities in RGB data, we propose a novel multi-task CNN.
We experimentally verify the individual components of our RGB two-hand tracking and 3D reconstruction pipeline.
arXiv Detail & Related papers (2021-06-22T12:53:56Z)
- Measuring Generalisation to Unseen Viewpoints, Articulations, Shapes and Objects for 3D Hand Pose Estimation under Hand-Object Interaction [137.28465645405655]
HANDS'19 is a challenge to evaluate the abilities of current 3D hand pose estimators (HPEs) to interpolate and extrapolate the poses of a training set.
We show that the accuracy of state-of-the-art methods can drop, and that they fail mostly on poses absent from the training set.
arXiv Detail & Related papers (2020-03-30T19:28:13Z)