Hand tracking for clinical applications: validation of the Google
MediaPipe Hand (GMH) and the depth-enhanced GMH-D frameworks
- URL: http://arxiv.org/abs/2308.01088v1
- Date: Wed, 2 Aug 2023 11:44:49 GMT
- Title: Hand tracking for clinical applications: validation of the Google
MediaPipe Hand (GMH) and the depth-enhanced GMH-D frameworks
- Authors: Gianluca Amprimo, Giulia Masi, Giuseppe Pettiti, Gabriella Olmo,
Lorenzo Priano and Claudia Ferraris
- Abstract summary: The aim is to validate the hand-tracking framework implemented by Google MediaPipe Hand (GMH) and an innovative enhanced version, GMH-D.
Three dynamic exercises commonly administered by clinicians to assess hand dysfunctions are considered.
Results demonstrate high temporal and spectral consistency of both frameworks with the gold standard.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate 3D tracking of hand and finger movements poses significant
challenges in computer vision. The potential applications span across multiple
domains, including human-computer interaction, virtual reality, industry, and
medicine. While gesture recognition has achieved remarkable accuracy,
quantifying fine movements remains a hurdle, particularly in clinical
applications where the assessment of hand dysfunctions and rehabilitation
training outcomes necessitates precise measurements. Several novel and
lightweight frameworks based on Deep Learning have emerged to address this
issue; however, their performance in accurately and reliably measuring finger
movements requires validation against well-established gold standard systems.
In this paper, the aim is to validate the hand-tracking framework implemented by
Google MediaPipe Hand (GMH) and an innovative enhanced version, GMH-D, that
exploits the depth estimation of an RGB-Depth camera to achieve more accurate
tracking of 3D movements. Three dynamic exercises commonly administered by
clinicians to assess hand dysfunctions, namely Hand Opening-Closing, Single
Finger Tapping, and Multiple Finger Tapping, are considered. Results demonstrate
high temporal and spectral consistency of both frameworks with the gold
standard. However, the enhanced GMH-D framework exhibits superior accuracy in
spatial measurements compared to the baseline GMH, for both slow and fast
movements. Overall, our study contributes to the advancement of hand-tracking
technology, establishes a validation procedure as a good practice for proving
the efficacy of deep-learning-based hand tracking, and demonstrates the
effectiveness of GMH-D as a reliable framework for assessing 3D hand movements
in clinical applications.
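To make the depth-enhancement idea concrete, the sketch below shows one way to
combine MediaPipe Hand landmarks with an aligned depth map and to extract the
dominant tapping frequency for a spectral-consistency check. It is a minimal
illustration under stated assumptions, not the authors' GMH-D implementation:
the aligned depth map depth_m (in metres) and the pinhole intrinsics fx, fy,
cx, cy are assumptions introduced here, and only the MediaPipe Hands landmark
output is taken from the actual library.

```python
# Sketch: depth-enhanced hand tracking in the spirit of GMH-D, plus a simple
# spectral analysis of a finger-tapping signal. Assumptions (not from the
# paper): each RGB frame comes with a per-pixel depth map in metres (depth_m)
# aligned to it, and pinhole intrinsics fx, fy, cx, cy are known.
import numpy as np
import mediapipe as mp

mp_hands = mp.solutions.hands
THUMB = mp_hands.HandLandmark.THUMB_TIP
INDEX = mp_hands.HandLandmark.INDEX_FINGER_TIP


def landmark_to_3d(landmark, depth_m, fx, fy, cx, cy):
    """Back-project one normalized MediaPipe landmark to metric 3D coordinates
    by sampling the aligned depth map instead of using MediaPipe's relative z."""
    h, w = depth_m.shape
    u = int(np.clip(landmark.x * w, 0, w - 1))
    v = int(np.clip(landmark.y * h, 0, h - 1))
    z = depth_m[v, u]                      # metric depth at the landmark pixel
    x = (u - cx) * z / fx                  # pinhole back-projection
    y = (v - cy) * z / fy
    return np.array([x, y, z])


def tapping_distance_series(frames, depths, fx, fy, cx, cy):
    """Return the thumb-tip / index-tip distance (metres) for each frame."""
    dists = []
    with mp_hands.Hands(static_image_mode=False, max_num_hands=1) as hands:
        for rgb, depth_m in zip(frames, depths):
            res = hands.process(rgb)       # rgb: HxWx3 uint8, RGB order
            if not res.multi_hand_landmarks:
                dists.append(np.nan)       # hand not detected in this frame
                continue
            lm = res.multi_hand_landmarks[0].landmark
            p_thumb = landmark_to_3d(lm[THUMB], depth_m, fx, fy, cx, cy)
            p_index = landmark_to_3d(lm[INDEX], depth_m, fx, fy, cx, cy)
            dists.append(np.linalg.norm(p_thumb - p_index))
    return np.asarray(dists)


def dominant_tapping_frequency(dists, fps):
    """Dominant frequency (Hz) of the tapping signal via a real FFT."""
    sig = np.nan_to_num(dists - np.nanmean(dists))   # remove mean, fill gaps
    spectrum = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    return freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
```

In this sketch the relative z reported by MediaPipe is simply discarded in
favour of the sampled metric depth; a faithful reproduction of GMH-D would
follow the fusion strategy described in the paper itself.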
Related papers
- SLAM assisted 3D tracking system for laparoscopic surgery [22.36252790404779]
This work proposes a real-time monocular 3D tracking algorithm for post-registration tasks.
Experiments from in-vivo and ex-vivo tests demonstrate that the proposed 3D tracking system provides robust 3D tracking.
arXiv Detail & Related papers (2024-09-18T04:00:54Z) - Deep Reinforcement Learning Empowered Activity-Aware Dynamic Health
Monitoring Systems [69.41229290253605]
Existing monitoring approaches were designed on the premise that medical devices track several health metrics concurrently.
As a result, they report all relevant health values within that scope, which can lead to excessive resource use and the collection of extraneous data.
We propose Dynamic Activity-Aware Health Monitoring strategy (DActAHM) for striking a balance between optimal monitoring performance and cost efficiency.
arXiv Detail & Related papers (2024-01-19T16:26:35Z) - HMP: Hand Motion Priors for Pose and Shape Estimation from Video [52.39020275278984]
We develop a generative motion prior specific for hands, trained on the AMASS dataset which features diverse and high-quality hand motions.
Our integration of a robust motion prior significantly enhances performance, especially in occluded scenarios.
We demonstrate our method's efficacy via qualitative and quantitative evaluations on the HO3D and DexYCB datasets.
arXiv Detail & Related papers (2023-12-27T22:35:33Z) - A webcam-based machine learning approach for three-dimensional range of
motion evaluation [5.520419627866446]
Joint range of motion (ROM) is an important quantitative measure for physical therapy.
The current study presents and evaluates an alternative machine learning-based ROM evaluation method that could be remotely accessed via a webcam.
arXiv Detail & Related papers (2023-10-11T09:12:42Z) - ShaRPy: Shape Reconstruction and Hand Pose Estimation from RGB-D with
Uncertainty [6.559796851992517]
We propose ShaRPy, the first RGB-D Shape Reconstruction and hand Pose tracking system.
ShaRPy approximates a personalized hand shape, promoting a more realistic and intuitive understanding of its digital twin.
We evaluate ShaRPy on a keypoint detection benchmark and show qualitative results of hand function assessments for activity monitoring of musculoskeletal diseases.
arXiv Detail & Related papers (2023-03-17T15:12:25Z) - Simultaneous Estimation of Hand Configurations and Finger Joint Angles
using Forearm Ultrasound [8.753262480814493]
Forearm ultrasound images provide a musculoskeletal visualization that can be used to understand hand motion.
We propose a CNN based deep learning pipeline for predicting the MCP joint angles.
A low latency pipeline has been proposed for estimating both MCP joint angles and hand configuration aimed at real-time control of human-machine interfaces.
arXiv Detail & Related papers (2022-11-29T02:06:19Z) - Monocular 3D Reconstruction of Interacting Hands via Collision-Aware
Factorized Refinements [96.40125818594952]
We make the first attempt to reconstruct 3D interacting hands from single monocular RGB images.
Our method can generate 3D hand meshes with both precise 3D poses and minimal collisions.
arXiv Detail & Related papers (2021-11-01T08:24:10Z) - A Skeleton-Driven Neural Occupancy Representation for Articulated Hands [49.956892429789775]
Hand ArticuLated Occupancy (HALO) is a novel representation of articulated hands that bridges the advantages of 3D keypoints and neural implicit surfaces.
We demonstrate the applicability of HALO to the task of conditional generation of hands that grasp 3D objects.
arXiv Detail & Related papers (2021-09-23T14:35:19Z) - Temporally Guided Articulated Hand Pose Tracking in Surgical Videos [22.752654546694334]
Articulated hand pose tracking is an under-explored problem that carries the potential for use in an extensive number of applications.
We propose a novel hand pose estimation model, CondPose, which improves detection and tracking accuracy by incorporating a pose prior into its prediction.
arXiv Detail & Related papers (2021-01-12T03:44:04Z) - Physics-Based Dexterous Manipulations with Estimated Hand Poses and
Residual Reinforcement Learning [52.37106940303246]
We learn a model that maps noisy input hand poses to target virtual poses.
The agent is trained in a residual setting by using a model-free hybrid RL+IL approach.
We test our framework in two applications that use hand pose estimates for dexterous manipulations: hand-object interactions in VR and hand-object motion reconstruction in-the-wild.
arXiv Detail & Related papers (2020-08-07T17:34:28Z) - AutoHR: A Strong End-to-end Baseline for Remote Heart Rate Measurement
with Neural Searching [76.4844593082362]
We investigate the reason why existing end-to-end networks perform poorly in challenging conditions and establish a strong baseline for remote HR measurement with neural architecture search (NAS).
Comprehensive experiments are performed on three benchmark datasets on both intra-temporal and cross-dataset testing.
arXiv Detail & Related papers (2020-04-26T05:43:21Z)