Visual motion analysis of the player's finger
- URL: http://arxiv.org/abs/2303.12697v1
- Date: Fri, 24 Feb 2023 10:14:13 GMT
- Title: Visual motion analysis of the player's finger
- Authors: Marco Costanzo
- Abstract summary: This work extracts the motion of a keyboard player's fingers, in their three articulations, from a video sequence.
The problem is relevant in several respects: the extracted finger movements can be used to compute keystroke efficiency and individual joint contributions.
- Score: 3.299672391663527
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work is about the extraction of the motion of the fingers, in their
three articulations, of a keyboard player from a video sequence. The problem is
relevant in several respects: the extracted finger movements can be used to
compute keystroke efficiency and individual joint contributions, as shown by
Werner Goebl and Caroline Palmer in the paper 'Temporal Control and Hand
Movement Efficiency in Skilled Music Performance'. Those measures are directly
related to precision in timing and force. A very good approach to the hand
gesture recognition problem was presented in the paper 'Real-Time Hand Gesture
Recognition Using Finger Segmentation'. Detecting the keys pressed on a keyboard
can be a complex task, because shadows can degrade the quality of the result and
cause keys that were not pressed to be detected. Among the many existing
approaches, a large number are based on the subtraction of consecutive frames in
order to detect the movement of the keys caused by their being pressed.
Detecting the pressed keys could be useful for automatically evaluating a
pianist's performance or for automatically transcribing the melody being played
into sheet music.
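The frame-subtraction idea described above can be sketched in a few lines: difference two consecutive grayscale frames, threshold the per-pixel change, and flag a key as pressed when enough pixels inside its bounding box changed. This is a minimal illustrative sketch, not the paper's implementation; the key-region coordinates, the threshold, and the changed-pixel ratio are assumptions chosen for the example.

```python
def detect_pressed_keys(prev_frame, curr_frame, key_regions,
                        threshold=25, min_changed_ratio=0.08):
    """Flag keys whose image region changed between two consecutive frames.

    prev_frame, curr_frame: grayscale frames as lists of lists of ints (0-255).
    key_regions: dict mapping key name -> (row0, row1, col0, col1) bounding box.
    threshold: per-pixel intensity change counted as motion (assumed value).
    min_changed_ratio: fraction of changed pixels needed to call a key pressed.
    """
    pressed = []
    for key, (r0, r1, c0, c1) in key_regions.items():
        total = (r1 - r0) * (c1 - c0)
        # Count pixels in this key's region whose intensity changed noticeably.
        changed = sum(
            1
            for r in range(r0, r1)
            for c in range(c0, c1)
            if abs(prev_frame[r][c] - curr_frame[r][c]) > threshold
        )
        if changed / total >= min_changed_ratio:
            pressed.append(key)
    return pressed


# Synthetic example: three hypothetical key regions on a 40x120 frame,
# with the leftmost key ("C4") darkening as it moves into shadow.
prev = [[200] * 120 for _ in range(40)]
curr = [row[:] for row in prev]
for r in range(5, 35):
    for c in range(0, 30):
        curr[r][c] = 120

regions = {"C4": (0, 40, 0, 40), "D4": (0, 40, 40, 80), "E4": (0, 40, 80, 120)}
print(detect_pressed_keys(prev, curr, regions))  # -> ['C4']
```

Note that this simple differencing reacts to any motion inside a key's region, including the shadow artifacts the abstract warns about; real systems typically add background modeling or temporal smoothing on top of it.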
Related papers
- PianoMotion10M: Dataset and Benchmark for Hand Motion Generation in Piano Performance [15.21347897534943]
We construct a piano-hand motion generation benchmark to guide hand movements and fingerings for piano playing.
To this end, we collect an annotated dataset, PianoMotion10M, consisting of 116 hours of piano playing videos from a bird's-eye view with 10 million annotated hand poses.
arXiv Detail & Related papers (2024-06-13T17:05:23Z)
- Video-Mined Task Graphs for Keystep Recognition in Instructional Videos [71.16703750980143]
Procedural activity understanding requires perceiving human actions in terms of a broader task.
We propose discovering a task graph automatically from how-to videos to represent probabilistically how people tend to execute keysteps.
We show the impact: more reliable zero-shot keystep localization and improved video representation learning.
arXiv Detail & Related papers (2023-07-17T18:19:36Z)
- At Your Fingertips: Extracting Piano Fingering Instructions from Videos [45.643494669796866]
We consider the AI task of automating the extraction of fingering information from videos.
We show how to perform this task with high-accuracy using a combination of deep-learning modules.
We run the resulting system on 90 videos, resulting in high-quality piano fingering information of 150K notes.
arXiv Detail & Related papers (2023-03-07T09:09:13Z)
- Towards Predicting Fine Finger Motions from Ultrasound Images via Kinematic Representation [12.49914980193329]
We study the inference problem of identifying the activation of specific fingers from a sequence of US images.
We consider this task as an important step towards higher adoption rates of robotic prostheses among arm amputees.
arXiv Detail & Related papers (2022-02-10T18:05:09Z)
- End-to-End Learning of Keypoint Representations for Continuous Control from Images [84.8536730437934]
We show that it is possible to learn efficient keypoint representations end-to-end, without the need for unsupervised pre-training, decoders, or additional losses.
Our proposed architecture consists of a differentiable keypoint extractor that feeds the coordinates directly to a soft actor-critic agent.
arXiv Detail & Related papers (2021-06-15T09:17:06Z)
- Towards Learning to Play Piano with Dexterous Hands and Touch [79.48656721563795]
We demonstrate how an agent can learn directly from machine-readable music score to play the piano with dexterous hands on a simulated piano.
We achieve this by using a touch-augmented reward and a novel curriculum of tasks.
arXiv Detail & Related papers (2021-06-03T17:59:31Z)
- Latent Fingerprint Registration via Matching Densely Sampled Points [100.53031290339483]
Existing latent fingerprint registration approaches are mainly based on establishing correspondences between minutiae.
We propose a non-minutia latent fingerprint registration method which estimates the spatial transformation between a pair of fingerprints.
The proposed method achieves the state-of-the-art registration performance, especially under challenging conditions.
arXiv Detail & Related papers (2020-05-12T15:51:59Z)
- Music Gesture for Visual Sound Separation [121.36275456396075]
"Music Gesture" is a keypoint-based structured representation to explicitly model the body and finger movements of musicians when they perform music.
We first adopt a context-aware graph network to integrate visual semantic context with body dynamics, and then apply an audio-visual fusion model to associate body movements with the corresponding audio signals.
arXiv Detail & Related papers (2020-04-20T17:53:46Z)
- Machine Learning for a Music Glove Instrument [0.0]
A music glove instrument equipped with force-sensitive, flex, and IMU sensors is trained on an electric piano to learn note sequences.
The glove is used on any surface to generate the sequence of notes most closely related to the hand motion.
arXiv Detail & Related papers (2020-01-27T01:08:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.