Learning and Online Replication of Grasp Forces from Electromyography Signals for Prosthetic Finger Control
- URL: http://arxiv.org/abs/2505.02574v1
- Date: Mon, 05 May 2025 11:23:51 GMT
- Title: Learning and Online Replication of Grasp Forces from Electromyography Signals for Prosthetic Finger Control
- Authors: Robin Arbaud, Elisa Motta, Marco Domenico Avaro, Stefano Picinich, Marta Lorenzini, Arash Ajoudani
- Abstract summary: Partial hand amputations significantly affect the physical and psychosocial well-being of individuals. We developed a force-controlled prosthetic finger activated by electromyography (EMG) signals. A neural network-based model was then implemented to estimate fingertip forces from EMG inputs, allowing for online adjustment of the prosthetic finger grip strength.
- Score: 10.437235109088517
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Partial hand amputations significantly affect the physical and psychosocial well-being of individuals, yet intuitive control of externally powered prostheses remains an open challenge. To address this gap, we developed a force-controlled prosthetic finger activated by electromyography (EMG) signals. The prototype, constructed around a wrist brace, functions as a supernumerary finger placed near the index, allowing for early-stage evaluation on unimpaired subjects. A neural network-based model was then implemented to estimate fingertip forces from EMG inputs, allowing for online adjustment of the prosthetic finger grip strength. The force estimation model was validated through experiments with ten participants, demonstrating its effectiveness in predicting forces. Additionally, online trials with four users wearing the prosthesis exhibited precise control over the device. Our findings highlight the potential of using EMG-based force estimation to enhance the functionality of prosthetic fingers.
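As a rough illustration of the kind of model the abstract describes, the sketch below maps windowed EMG features to a single fingertip force estimate with a small feed-forward network. The channel count, feature set, and layer sizes are assumptions for illustration, not the authors' reported configuration.

```python
import torch
import torch.nn as nn

# Illustrative only: dimensions and architecture are assumptions,
# not the configuration reported in the paper.
N_CHANNELS = 8        # assumed number of EMG electrodes
FEATS_PER_CH = 4      # e.g. RMS, MAV, waveform length, zero crossings

class EmgForceEstimator(nn.Module):
    """Regresses one fingertip force value from windowed EMG features."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_CHANNELS * FEATS_PER_CH, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, 1),   # estimated grip force
        )

    def forward(self, x):
        return self.net(x)

model = EmgForceEstimator()
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One toy training step on random data, standing in for
# (EMG features, measured fingertip force) pairs.
features = torch.randn(16, N_CHANNELS * FEATS_PER_CH)
measured_force = torch.rand(16, 1) * 10.0
optimizer.zero_grad()
loss = loss_fn(model(features), measured_force)
loss.backward()
optimizer.step()
```

Once trained on force-sensor ground truth, such a regressor can be evaluated in a loop to adjust grip strength online, which is the usage pattern the abstract describes.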
Related papers
- Feel the Force: Contact-Driven Learning from Humans [52.36160086934298]
Controlling fine-grained forces during manipulation remains a core challenge in robotics. We present FeelTheForce, a robot learning system that models human tactile behavior to learn force-sensitive manipulation. Our approach grounds robust low-level force control in scalable human supervision, achieving a 77% success rate across 5 force-sensitive manipulation tasks.
arXiv Detail & Related papers (2025-06-02T17:57:52Z) - BRAVE: Brain-Controlled Prosthetic Arm with Voice Integration and Embodied Learning for Enhanced Mobility [5.528262076322921]
BRAVE is a hybrid EEG and voice-controlled prosthetic system. It aims to interpret EEG-driven motor intent, enabling movement control without reliance on residual muscle activity. The system operates in real time, with a response latency of 150 ms.
arXiv Detail & Related papers (2025-05-23T11:44:33Z) - Online Adaptation for Myographic Control of Natural Dexterous Hand and Finger Movements [0.6741087029030101]
This work redefines the state-of-the-art in myographic decoding in terms of the reliability, responsiveness, and movement complexity available from prosthesis control systems.
arXiv Detail & Related papers (2024-12-23T21:20:32Z) - emg2qwerty: A Large Dataset with Baselines for Touch Typing using Surface Electromyography [47.160223334501126]
emg2qwerty is a large-scale dataset of non-invasive electromyographic signals recorded at the wrists while touch typing on a QWERTY keyboard. With 1,135 sessions spanning 108 users and 346 hours of recording, this is the largest such public dataset to date. We show strong baseline performance on predicting key-presses using sEMG signals alone.
arXiv Detail & Related papers (2024-10-26T05:18:48Z) - PAD-Phys: Exploiting Physiology for Presentation Attack Detection in Face Biometrics [48.683457383784145]
Three approaches to presentation attack detection based on remote photoplethysmography (rPPG) are compared: (i) the physiological domain, (ii) the Deepfakes domain, and (iii) a new Presentation Attack domain.
Results show a 21.70% decrease in average classification error rate (ACER) when the presentation attack domain is compared to the physiological and Deepfakes domains.
Experiments highlight the efficiency of transfer learning in rPPG-based models, which perform well at presentation attack detection against instruments that do not allow copying of this physiological feature.
arXiv Detail & Related papers (2023-10-03T15:24:15Z) - Agile gesture recognition for capacitive sensing devices: adapting on-the-job [55.40855017016652]
We demonstrate a hand gesture recognition system that uses signals from capacitive sensors embedded into the etee hand controller.
The controller generates real-time signals from each of the wearer's five fingers.
We use a machine learning technique to analyse the time series signals and identify three features that can represent the five fingers within 500 ms.
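For context, a sliding-window feature pipeline of the kind hinted at above might look like the sketch below. The specific features (RMS, mean absolute value, trend) and the 500 ms window are illustrative choices, not the three features identified in the paper.

```python
import numpy as np

def window_features(signal: np.ndarray, fs: int, window_ms: int = 500) -> np.ndarray:
    """Compute simple per-channel features over non-overlapping windows.

    signal: array of shape (n_samples, n_channels), e.g. one channel per finger sensor.
    Returns an array of shape (n_windows, n_channels * 3).
    """
    win = int(fs * window_ms / 1000)
    n_windows = signal.shape[0] // win
    feats = []
    for i in range(n_windows):
        seg = signal[i * win:(i + 1) * win]
        rms = np.sqrt(np.mean(seg ** 2, axis=0))   # signal energy per channel
        mav = np.mean(np.abs(seg), axis=0)         # mean absolute value per channel
        trend = seg[-1] - seg[0]                   # crude slope over the window
        feats.append(np.concatenate([rms, mav, trend]))
    return np.asarray(feats)

# Example: 2 s of a 5-channel signal sampled at 100 Hz -> 4 windows of 15 features.
x = np.random.randn(200, 5)
print(window_features(x, fs=100).shape)   # (4, 15)
```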
arXiv Detail & Related papers (2023-05-12T17:24:02Z) - Spatiotemporal modeling of grip forces captures proficiency in manual robot control [5.504040521972806]
This paper builds on our previous work by exploiting Artificial Intelligence to predict individual grip force variability in manual robot control.
Statistical analyses bring to the fore skill specific temporal variations in thousands of grip forces of a complete novice and a highly proficient expert.
arXiv Detail & Related papers (2023-03-03T15:08:00Z) - A Prototype System for High Frame Rate Ultrasound Imaging based Prosthetic Arm Control [3.0938904602244355]
A prototype system for prosthetic arm control based on high frame rate ultrasound imaging is proposed.
A virtual robotic hand simulation is developed that can mimic a human hand.
The proposed classification model simulating four hand gestures has a classification accuracy of more than 90%.
arXiv Detail & Related papers (2023-01-31T17:53:16Z) - fMRI Neurofeedback Learning Patterns are Predictive of Personal and Clinical Traits [62.997667081978825]
We obtain a personal signature of a person's learning progress in a self-neuromodulation task, guided by functional MRI (fMRI).
The signature is based on predicting the activity of the Amygdala in a second neurofeedback session, given a similar fMRI-derived brain state in the first session.
arXiv Detail & Related papers (2021-12-21T06:52:48Z) - Continuous Decoding of Daily-Life Hand Movements from Forearm Muscle Activity for Enhanced Myoelectric Control of Hand Prostheses [78.120734120667]
We introduce a novel method, based on a long short-term memory (LSTM) network, to continuously map forearm EMG activity onto hand kinematics.
Ours is the first reported work on the prediction of hand kinematics that uses this challenging dataset.
Our results suggest that the presented method is suitable for the generation of control signals for the independent and proportional actuation of the multiple DOFs of state-of-the-art hand prostheses.
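The sketch below shows the general shape of an LSTM that maps a sequence of forearm EMG samples onto hand joint angles; the number of EMG channels, the number of kinematic DOFs, and the hidden size are assumptions for illustration, not the configuration used in the paper.

```python
import torch
import torch.nn as nn

N_EMG_CHANNELS = 12   # assumed number of forearm EMG channels
N_DOFS = 18           # assumed number of hand joint angles

class EmgToKinematicsLSTM(nn.Module):
    """Maps an EMG sequence (batch, time, channels) to per-timestep joint angles."""
    def __init__(self, hidden_size=128):
        super().__init__()
        self.lstm = nn.LSTM(N_EMG_CHANNELS, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, N_DOFS)

    def forward(self, emg):
        hidden, _ = self.lstm(emg)      # (batch, time, hidden_size)
        return self.readout(hidden)     # (batch, time, N_DOFS)

model = EmgToKinematicsLSTM()
emg = torch.randn(4, 200, N_EMG_CHANNELS)   # 4 sequences of 200 timesteps
angles = model(emg)
print(angles.shape)                         # torch.Size([4, 200, 18])
```

Because the readout is applied at every timestep, the same structure supports the continuous, proportional control signals the summary mentions.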
arXiv Detail & Related papers (2021-04-29T00:11:32Z) - Multimodal Fusion of EMG and Vision for Human Grasp Intent Inference in Prosthetic Hand Control [11.400385533782204]
We present a Bayesian evidence fusion framework for grasp intent inference using eye-view video, eye-gaze, and EMG from the forearm.
We analyze individual and fused performance as a function of time as the hand approaches the object to grasp it.
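As a simplified picture of evidence fusion, the sketch below combines per-modality grasp-type posteriors under a shared prior and a conditional-independence assumption (naive Bayes). The grasp classes and the example probabilities are illustrative, not the paper's exact formulation.

```python
import numpy as np

GRASPS = ["power", "precision", "lateral", "tripod"]   # illustrative grasp classes

def fuse_posteriors(p_emg, p_vision, p_gaze, prior=None):
    """Naive Bayes fusion: combine per-modality posteriors assuming conditional independence."""
    prior = np.full(len(GRASPS), 1.0 / len(GRASPS)) if prior is None else np.asarray(prior)
    # Convert each posterior back to a likelihood ratio against the prior, then recombine.
    fused = prior * (p_emg / prior) * (p_vision / prior) * (p_gaze / prior)
    return fused / fused.sum()

p_emg    = np.array([0.50, 0.30, 0.10, 0.10])
p_vision = np.array([0.25, 0.55, 0.10, 0.10])
p_gaze   = np.array([0.40, 0.40, 0.10, 0.10])
fused = fuse_posteriors(p_emg, p_vision, p_gaze)
print(GRASPS[int(np.argmax(fused))], fused.round(3))
```

Re-running the fusion as the hand approaches the object, with vision and gaze posteriors updated per frame, gives the time-resolved analysis described above.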
arXiv Detail & Related papers (2021-04-08T17:01:19Z) - Heterogeneous Hand Guise Classification Based on Surface Electromyographic Signals Using Multichannel Convolutional Neural Network [0.0]
Recent developments in the field of Machine Learning allow us to use EMG signals to teach machines the complex properties of human movements.
Modern machines are capable of detecting numerous human activities and distinguishing among them solely based on the EMG signals produced by those activities.
In this study, a novel classification method has been described employing a multichannel Convolutional Neural Network (CNN) that interprets surface EMG signals by the properties they exhibit in the power domain.
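The sketch below illustrates a multichannel 1D CNN operating on per-channel power spectra of the sEMG signal. The channel count, spectrum length, number of gesture classes, and layer sizes are assumptions for illustration rather than the architecture described in the paper.

```python
import torch
import torch.nn as nn

N_CHANNELS = 8     # assumed sEMG channels
N_FREQ_BINS = 64   # assumed length of each per-channel power spectrum
N_CLASSES = 6      # assumed number of hand gestures

class MultichannelEmgCNN(nn.Module):
    """Classifies gestures from per-channel power spectra of shape (batch, channels, bins)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Linear(64 * (N_FREQ_BINS // 4), N_CLASSES)

    def forward(self, spectra):
        h = self.features(spectra)                    # (batch, 64, N_FREQ_BINS // 4)
        return self.classifier(h.flatten(start_dim=1))

model = MultichannelEmgCNN()
spectra = torch.randn(4, N_CHANNELS, N_FREQ_BINS)
print(model(spectra).shape)   # torch.Size([4, 6])
```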
arXiv Detail & Related papers (2021-01-17T17:02:04Z) - Towards Creating a Deployable Grasp Type Probability Estimator for a Prosthetic Hand [11.008123712007402]
InceptionV3 achieves the highest accuracy with 0.95 angular similarity, followed by 1.4 MobileNetV2 with 0.93 at 20% of the operations.
Our work enables augmenting EMG intent inference with physical state probability through machine learning and computer vision methods.
arXiv Detail & Related papers (2021-01-13T21:39:41Z) - Detecting Parkinsonian Tremor from IMU Data Collected In-The-Wild using Deep Multiple-Instance Learning [59.74684475991192]
Parkinson's Disease (PD) is a slowly evolving neurological disease that affects about 1% of the population above 60 years old.
PD symptoms include tremor, rigidity and bradykinesia.
We present a method for automatically identifying tremorous episodes related to PD, based on IMU signals captured via a smartphone device.
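The sketch below shows one common way multiple-instance learning pools per-window ("instance") scores into a recording-level ("bag") prediction; the feature size and the max-pooling choice are assumptions, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

class MilTremorDetector(nn.Module):
    """Scores each IMU window, then pools window scores into one bag-level tremor probability."""
    def __init__(self, feat_dim=32):
        super().__init__()
        self.instance_scorer = nn.Sequential(
            nn.Linear(feat_dim, 16),
            nn.ReLU(),
            nn.Linear(16, 1),
        )

    def forward(self, bag):                   # bag: (n_windows, feat_dim)
        scores = self.instance_scorer(bag)    # (n_windows, 1)
        bag_logit = scores.max()              # max pooling: any tremorous window flags the bag
        return torch.sigmoid(bag_logit)

detector = MilTremorDetector()
windows = torch.randn(50, 32)                 # feature vectors from one in-the-wild recording
print(detector(windows))                      # bag-level tremor probability
```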
arXiv Detail & Related papers (2020-05-06T09:02:30Z)