Machine Learning for a Music Glove Instrument
- URL: http://arxiv.org/abs/2001.09551v1
- Date: Mon, 27 Jan 2020 01:08:11 GMT
- Title: Machine Learning for a Music Glove Instrument
- Authors: Joseph Bakarji
- Abstract summary: A music glove instrument equipped with force-sensitive, flex, and IMU sensors is trained on an electric piano to learn note sequences.
The glove is used on any surface to generate the sequence of notes most closely related to the hand motion.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A music glove instrument equipped with force-sensitive, flex, and IMU sensors
is trained on an electric piano to learn note sequences based on a time series
of sensor inputs. Once trained, the glove is used on any surface to generate
the sequence of notes most closely related to the hand motion. The data is
collected manually by a performer wearing the glove and playing on an electric
keyboard. The feature space is designed to account for the key hand motion,
such as the thumb-under movement. Logistic regression and Bayesian
belief networks are used to learn the transition probabilities from one note to
another. This work demonstrates a data-driven approach for digital musical
instruments in general.
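The transition-probability learning described in the abstract can be illustrated with a minimal sketch: assuming notes are represented as MIDI numbers, the maximum-likelihood transition probabilities are simply normalized bigram counts over the recorded note sequence. The function name and the toy sequence below are hypothetical illustrations, not taken from the paper, which additionally conditions on glove sensor features.

```python
from collections import defaultdict

def transition_probabilities(note_sequence):
    """Estimate P(next_note | current_note) from an observed note sequence
    by counting note bigrams and normalizing each row."""
    counts = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(note_sequence, note_sequence[1:]):
        counts[cur][nxt] += 1
    probs = {}
    for cur, nxts in counts.items():
        total = sum(nxts.values())
        probs[cur] = {n: c / total for n, c in nxts.items()}
    return probs

# Toy sequence of MIDI note numbers (hypothetical recording)
seq = [60, 62, 64, 60, 62, 60, 64]
p = transition_probabilities(seq)
# p[60] is the empirical distribution over notes that follow note 60
```

In the paper's setting, these note-to-note priors would be combined with a per-timestep classifier over sensor features (logistic regression) rather than used alone.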
Related papers
- FürElise: Capturing and Physically Synthesizing Hand Motions of Piano Performance [15.909113091360206]
Hand motion models with the sophistication to accurately recreate piano playing have a wide range of applications in character animation, embodied AI, biomechanics, and VR/AR.
In this paper, we construct a first-of-its-kind large-scale dataset that contains approximately 10 hours of 3D hand motion and audio from 15 elite-level pianists playing 153 pieces of classical music.
arXiv Detail & Related papers (2024-10-08T08:21:05Z) - PianoMotion10M: Dataset and Benchmark for Hand Motion Generation in Piano Performance [15.21347897534943]
We construct a piano-hand motion generation benchmark to guide hand movements and fingerings for piano playing.
To this end, we collect an annotated dataset, PianoMotion10M, consisting of 116 hours of piano playing videos from a bird's-eye view with 10 million annotated hand poses.
arXiv Detail & Related papers (2024-06-13T17:05:23Z) - Learning Visuotactile Skills with Two Multifingered Hands [80.99370364907278]
We explore learning from human demonstrations using a bimanual system with multifingered hands and visuotactile data.
Our results mark a promising step forward in bimanual multifingered manipulation from visuotactile data.
arXiv Detail & Related papers (2024-04-25T17:59:41Z) - Modeling Bends in Popular Music Guitar Tablatures [49.64902130083662]
Tablature notation is widely used in popular music to transcribe and share guitar musical content.
This paper focuses on bends, which allow the player to progressively shift the pitch of a note, circumventing the physical limitations of the discrete fretted fingerboard.
Experiments are performed on a corpus of 932 lead guitar tablatures of popular music and show that a decision tree successfully predicts bend occurrences with an F1 score of 0.71 and a limited number of false positive predictions.
arXiv Detail & Related papers (2023-08-22T07:50:58Z) - At Your Fingertips: Extracting Piano Fingering Instructions from Videos [45.643494669796866]
We consider the AI task of automating the extraction of fingering information from videos.
We show how to perform this task with high accuracy using a combination of deep-learning modules.
We run the resulting system on 90 videos, resulting in high-quality piano fingering information of 150K notes.
arXiv Detail & Related papers (2023-03-07T09:09:13Z) - Visual motion analysis of the player's finger [3.299672391663527]
This work is about the extraction of the motion of fingers, in their three articulations, of a keyboard player from a video sequence.
The problem is relevant in several respects; for instance, the extracted finger movements may be used to compute keystroke efficiency and individual joint contributions.
arXiv Detail & Related papers (2023-02-24T10:14:13Z) - Reconfigurable Data Glove for Reconstructing Physical and Virtual Grasps [100.72245315180433]
We present a reconfigurable data glove design to capture different modes of human hand-object interactions.
The glove operates in three modes for various downstream tasks with distinct features.
We evaluate the system's three modes by (i) recording hand gestures and associated forces, (ii) improving manipulation fluency in VR, and (iii) producing realistic simulation effects of various tool uses.
arXiv Detail & Related papers (2023-01-14T05:35:50Z) - Towards Automatic Instrumentation by Learning to Separate Parts in Symbolic Multitrack Music [33.679951600368405]
We study the feasibility of automatic instrumentation -- dynamically assigning instruments to notes in solo music during performance.
In addition to the online, real-time-capable setting for performative use cases, automatic instrumentation can also find applications in assistive composing tools in an offline setting.
We frame the task of part separation as a sequential multi-class classification problem and adopt machine learning to map sequences of notes into sequences of part labels.
arXiv Detail & Related papers (2021-07-13T08:34:44Z) - Towards Learning to Play Piano with Dexterous Hands and Touch [79.48656721563795]
We demonstrate how an agent can learn directly from machine-readable music score to play the piano with dexterous hands on a simulated piano.
We achieve this by using a touch-augmented reward and a novel curriculum of tasks.
arXiv Detail & Related papers (2021-06-03T17:59:31Z) - Music Gesture for Visual Sound Separation [121.36275456396075]
"Music Gesture" is a keypoint-based structured representation to explicitly model the body and finger movements of musicians when they perform music.
We first adopt a context-aware graph network to integrate visual semantic context with body dynamics, and then apply an audio-visual fusion model to associate body movements with the corresponding audio signals.
arXiv Detail & Related papers (2020-04-20T17:53:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences.