Hand gesture detection in tests performed by older adults
- URL: http://arxiv.org/abs/2110.14461v2
- Date: Fri, 29 Oct 2021 00:48:51 GMT
- Title: Hand gesture detection in tests performed by older adults
- Authors: Guan Huang and Son N. Tran and Quan Bai and Jane Alty
- Abstract summary: We are developing an online test that analyses hand movement features associated with ageing.
To obtain hand movement features, participants will be asked to perform a variety of hand gestures using their own computer cameras.
It is challenging to collect high quality hand movement video data, especially for older participants.
- Score: 9.00837522898458
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Our team is developing a new online test that analyses hand movement
features associated with ageing and can be completed remotely from the
research centre. To obtain hand movement features, participants will be asked
to perform a variety of hand gestures using their own computer cameras.
However, it is challenging to collect high quality hand movement video data,
especially for older participants, many of whom have no IT background. During
the data collection process, one of the key steps is to detect whether the
participants are following the test instructions correctly and also to detect
similar gestures from different devices. Furthermore, we need this process to
be automated and accurate as we expect many thousands of participants to
complete the test. We have implemented a hand gesture detector to detect the
gestures in the hand movement tests; its detection mAP of 0.782 is better than
the state of the art. In this research, we have processed 20,000
images collected from hand movement tests and labelled 6,450 images to detect
different hand gestures in the hand movement tests. This paper has the
following three contributions. Firstly, we compared and analysed the
performance of different network structures for hand gesture detection.
Secondly, we have made many attempts to improve the accuracy of the model and
have succeeded in improving the classification accuracy for similar gestures by
implementing attention layers. Thirdly, we have created two datasets, one of
which includes 20 percent blurred images, to investigate how different network
structures are impacted by noisy data; our experiments also show that our
network performs better on the noisy dataset.
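The reported detection mAP of 0.782 is the mean, over gesture classes, of the per-class average precision (AP). As a minimal sketch of how AP for one class can be computed, assuming each confidence-ranked detection has already been matched to ground truth at some IoU threshold (the function name and inputs are illustrative, not the paper's actual evaluation code):

```python
def average_precision(ranked_hits, num_gt):
    """AP for one class using all-points interpolation (PASCAL VOC 2010+ style).

    ranked_hits: detections sorted by descending confidence; True if the
                 detection matched an unclaimed ground-truth box (a TP).
    num_gt:      total number of ground-truth boxes for this class.
    """
    tp = 0
    precisions, recalls = [], []
    for i, hit in enumerate(ranked_hits, start=1):
        if hit:
            tp += 1
        precisions.append(tp / i)       # precision after i detections
        recalls.append(tp / num_gt)     # recall after i detections
    # Make the precision envelope monotonically non-increasing.
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    # Integrate precision over recall increments.
    ap, prev_r = 0.0, 0.0
    for p, r in zip(precisions, recalls):
        ap += p * (r - prev_r)
        prev_r = r
    return ap
```

mAP is then simply the mean of this value across all gesture classes; a perfect ranking (all true positives first, all ground truth found) yields AP = 1.0.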
Related papers
- Learning Visuotactile Skills with Two Multifingered Hands [80.99370364907278]
We explore learning from human demonstrations using a bimanual system with multifingered hands and visuotactile data.
Our results mark a promising step forward in bimanual multifingered manipulation from visuotactile data.
arXiv Detail & Related papers (2024-04-25T17:59:41Z)
- Agile gesture recognition for capacitive sensing devices: adapting on-the-job [55.40855017016652]
We demonstrate a hand gesture recognition system that uses signals from capacitive sensors embedded into the etee hand controller.
The controller generates real-time signals from each of the wearer's five fingers.
We use a machine learning technique to analyse the time series signals and identify three features that can represent 5 fingers within 500 ms.
arXiv Detail & Related papers (2023-05-12T17:24:02Z)
- Real-Time Hand Gesture Identification in Thermal Images [0.0]
Our system is capable of handling multiple hand regions in a frame and processing them fast enough for real-time applications.
We collected a new thermal image data set with 10 gestures and reported an end-to-end hand gesture recognition accuracy of 97%.
arXiv Detail & Related papers (2023-03-04T05:02:35Z)
- Simultaneous prediction of hand gestures, handedness, and hand keypoints using thermal images [0.6087960723103347]
We propose a technique for simultaneous hand gesture classification, handedness detection, and hand keypoints localization using thermal data captured by an infrared camera.
Our method uses a novel deep multi-task learning architecture that includes shared encoder-decoder layers followed by three branches, one dedicated to each of the mentioned tasks.
arXiv Detail & Related papers (2023-03-02T19:25:40Z)
- Efficient Gesture Recognition for the Assistance of Visually Impaired People using Multi-Head Neural Networks [5.883916678819684]
This paper proposes an interactive system for mobile devices controlled by hand gestures aimed at helping people with visual impairments.
This system allows the user to interact with the device by making simple static and dynamic hand gestures.
Each gesture triggers a different action in the system, such as object recognition, scene description or image scaling.
arXiv Detail & Related papers (2022-05-14T06:01:47Z)
- Learning to Disambiguate Strongly Interacting Hands via Probabilistic Per-pixel Part Segmentation [84.28064034301445]
Self-similarity, and the resulting ambiguities in assigning pixel observations to the respective hands, is a major cause of the final 3D pose error.
We propose DIGIT, a novel method for estimating the 3D poses of two interacting hands from a single monocular image.
We experimentally show that the proposed approach achieves new state-of-the-art performance on the InterHand2.6M dataset.
arXiv Detail & Related papers (2021-07-01T13:28:02Z)
- SHREC 2021: Track on Skeleton-based Hand Gesture Recognition in the Wild [62.450907796261646]
Recognition of hand gestures can be performed directly from the stream of hand skeletons estimated by software.
Despite the recent advancements in gesture and action recognition from skeletons, it is unclear how well the current state-of-the-art techniques can perform in a real-world scenario.
This paper presents the results of the SHREC 2021: Track on Skeleton-based Hand Gesture Recognition in the Wild contest.
arXiv Detail & Related papers (2021-06-21T10:57:49Z)
- Where is my hand? Deep hand segmentation for visual self-recognition in humanoid robots [129.46920552019247]
We propose the use of a Convolutional Neural Network (CNN) to segment the robot hand from an image in an egocentric view.
We fine-tuned the Mask-RCNN network for the specific task of segmenting the hand of the humanoid robot Vizzy.
arXiv Detail & Related papers (2021-02-09T10:34:32Z)
- Understanding the hand-gestures using Convolutional Neural Networks and Generative Adversarial Networks [0.0]
The system consists of three modules: real time hand tracking, training gesture and gesture recognition using Convolutional Neural Networks.
It has been tested on a vocabulary of 36 gestures, including alphabets and digits, and the results demonstrate the effectiveness of the approach.
arXiv Detail & Related papers (2020-11-10T02:20:43Z)
- Visual Imitation Made Easy [102.36509665008732]
We present an alternate interface for imitation that simplifies the data collection process while allowing for easy transfer to robots.
We use commercially available reacher-grabber assistive tools both as a data collection device and as the robot's end-effector.
We experimentally evaluate on two challenging tasks: non-prehensile pushing and prehensile stacking, with 1000 diverse demonstrations for each task.
arXiv Detail & Related papers (2020-08-11T17:58:50Z)
- Force myography benchmark data for hand gesture recognition and transfer learning [5.110894308882439]
We contribute to the advancement of this field by making accessible a benchmark dataset collected using a commercially available sensor setup from 20 persons covering 18 unique gestures.
We illustrate one use-case for such data, showing how we can improve gesture recognition accuracy by utilising transfer learning to incorporate data from multiple other persons.
arXiv Detail & Related papers (2020-07-29T15:43:59Z)
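The transfer-learning use-case above — starting from data pooled across other persons and adapting to a new user's few calibration samples — can be illustrated with a toy nearest-centroid classifier. This is a hypothetical sketch, not the models used in that paper; all names, data, and the blending parameter `alpha` are invented for illustration:

```python
import statistics


def centroids(samples):
    """samples: list of (feature_vector, label). Returns per-label mean vectors."""
    by_label = {}
    for x, y in samples:
        by_label.setdefault(y, []).append(x)
    return {y: [statistics.fmean(col) for col in zip(*xs)]
            for y, xs in by_label.items()}


def classify(x, cents):
    """Assign x to the label whose centroid is nearest (squared Euclidean)."""
    return min(cents, key=lambda y: sum((a - b) ** 2 for a, b in zip(x, cents[y])))


def adapt(base, target_samples, alpha=0.5):
    """'Transfer' step: blend centroids fit on other persons (base) with
    centroids from the target user's few calibration samples."""
    tgt = centroids(target_samples)
    out = dict(base)
    for y, c in tgt.items():
        if y in out:
            out[y] = [(1 - alpha) * b + alpha * t for b, t in zip(out[y], c)]
        else:
            out[y] = c
    return out
```

The design choice here mirrors the summary's claim: the base centroids encode knowledge from the 20-person benchmark, while `adapt` shifts them toward the new user, improving accuracy over using either source alone.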
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed and is not responsible for any consequences of its use.