Continuous Wrist Control on the Hannes Prosthesis: a Vision-based Shared Autonomy Framework
- URL: http://arxiv.org/abs/2502.17265v1
- Date: Mon, 24 Feb 2025 15:48:25 GMT
- Title: Continuous Wrist Control on the Hannes Prosthesis: a Vision-based Shared Autonomy Framework
- Authors: Federico Vasile, Elisa Maiettini, Giulia Pasquale, Nicolò Boccardo, Lorenzo Natale
- Abstract summary: Most control techniques for prosthetic grasping focus on dexterous finger control but overlook the wrist motion. This forces the user to perform compensatory movements with the elbow, shoulder, and hip to adapt the wrist for grasping. We propose a computer vision-based system that leverages the collaboration between the user and an automatic system in a shared autonomy framework. Our pipeline seamlessly controls the prosthetic wrist to follow the target object and finally orient it for grasping according to the user's intent.
- Score: 5.428117915362002
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most control techniques for prosthetic grasping focus on dexterous finger control, but overlook the wrist motion. This forces the user to perform compensatory movements with the elbow, shoulder, and hip to adapt the wrist for grasping. We propose a computer vision-based system that leverages the collaboration between the user and an automatic system in a shared autonomy framework, to perform continuous control of the wrist degrees of freedom in a prosthetic arm, promoting a more natural approach-to-grasp motion. Our pipeline allows seamless control of the prosthetic wrist to follow the target object and finally orient it for grasping according to the user's intent. We assess the effectiveness of each system component through quantitative analysis and finally deploy our method on the Hannes prosthetic arm. Code and videos: https://hsp-iit.github.io/hannes-wrist-control.
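As a hedged illustration of the shared-autonomy loop the abstract describes (not the authors' released code; every interface below is a hypothetical stand-in), a vision module proposes wrist angles that keep the hand oriented toward the target object, while the user's decoded myoelectric command is blended in so the final orientation still reflects user intent:

```python
"""Minimal sketch of vision-based shared autonomy for a 2-DoF wrist.
All functions, shapes, and constants are hypothetical stand-ins."""
import numpy as np

def vision_wrist_target(object_pos_cam):
    """Hypothetical: map the object position (eye-in-hand camera frame)
    to desired pronation/supination and flexion/extension angles."""
    x, y, z = object_pos_cam
    prono_sup = np.arctan2(x, z)   # rotate the hand toward the object
    flex_ext = np.arctan2(y, z)
    return np.array([prono_sup, flex_ext])

def shared_autonomy_step(wrist_angles, object_pos_cam, user_cmd, alpha=0.7):
    """Blend the autonomous correction with the user's velocity command.
    alpha weights automation; (1 - alpha) preserves user authority."""
    auto_err = vision_wrist_target(object_pos_cam) - wrist_angles
    blended_vel = alpha * auto_err + (1.0 - alpha) * user_cmd
    return wrist_angles + 0.05 * blended_vel   # 0.05 s control period

wrist = np.zeros(2)
for _ in range(100):                           # approach-to-grasp loop
    obj = np.array([0.10, -0.05, 0.30])        # object in camera frame (m)
    emg = np.array([0.2, 0.0])                 # decoded user command (rad/s)
    wrist = shared_autonomy_step(wrist, obj, emg)
```

The blending weight `alpha` is the knob that trades automation for user authority; the paper's actual arbitration scheme may differ.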
Related papers
- Bring Your Own Grasp Generator: Leveraging Robot Grasp Generation for Prosthetic Grasping [4.476245767508223]
We present a novel eye-in-hand prosthetic grasping system that follows shared-autonomy principles.
Our system initiates the approach-to-grasp action based on the user's command and automatically configures the DoFs of a prosthetic hand.
We deploy our system on the Hannes prosthetic hand and test it on able-bodied subjects and amputees to validate its effectiveness.
arXiv Detail & Related papers (2025-03-01T12:35:05Z) - HOMIE: Humanoid Loco-Manipulation with Isomorphic Exoskeleton Cockpit [52.12750762494588]
Current humanoid teleoperation systems either lack reliable low-level control policies, or struggle to acquire accurate whole-body control commands. We propose a novel humanoid teleoperation cockpit that integrates a humanoid loco-manipulation policy and a low-cost exoskeleton-based hardware system.
arXiv Detail & Related papers (2025-02-18T16:33:38Z) - AI-Powered Camera and Sensors for the Rehabilitation Hand Exoskeleton [0.393259574660092]
This project presents a vision-enabled rehabilitation hand exoskeleton to assist disabled persons in their hand movements.
The design goal was to create an accessible assistive tool with a simple interface that requires no training.
arXiv Detail & Related papers (2024-08-09T04:47:37Z) - Visual Whole-Body Control for Legged Loco-Manipulation [22.50054654508986]
We study the problem of mobile manipulation using legged robots equipped with an arm.
We propose a framework that can conduct the whole-body control autonomously with visual observations.
arXiv Detail & Related papers (2024-03-25T17:26:08Z) - Neural feels with neural fields: Visuo-tactile perception for in-hand manipulation [57.60490773016364]
We combine vision and touch sensing on a multi-fingered hand to estimate an object's pose and shape during in-hand manipulation.
Our method, NeuralFeels, encodes object geometry by learning a neural field online and jointly tracks it by optimizing a pose graph problem.
Our results demonstrate that touch, at the very least, refines and, at the very best, disambiguates visual estimates during in-hand manipulation.
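A toy sketch of the core idea above, under loud assumptions: sensed surface points (from vision and touch) are treated as zero-level-set observations of a signed distance field, and the object pose is optimized against them. An analytic sphere SDF stands in for the learned neural field, and translation-only pose keeps the example short; NeuralFeels itself optimizes a full pose graph.

```python
"""Toy pose-from-SDF sketch; the sphere SDF is a stand-in for a
learned neural field, and pose is reduced to translation only."""
import torch

def sdf(points):                     # (N, 3) -> (N,); sphere, r = 0.03 m
    return points.norm(dim=-1) - 0.03

def surface_residual(t, points_world):
    # Shift sensed points into the object frame and query the SDF;
    # true surface points should evaluate to ~0.
    return sdf(points_world - t)

true_t = torch.tensor([0.05, -0.02, 0.10])
pts = torch.randn(256, 3)
pts = pts / pts.norm(dim=-1, keepdim=True) * 0.03 + true_t  # on-surface points

t = torch.zeros(3, requires_grad=True)
opt = torch.optim.Adam([t], lr=1e-2)
for _ in range(300):
    opt.zero_grad()
    loss = surface_residual(t, pts).pow(2).mean()
    loss.backward()
    opt.step()                       # t converges toward true_t
```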
arXiv Detail & Related papers (2023-12-20T22:36:37Z) - TLControl: Trajectory and Language Control for Human Motion Synthesis [68.09806223962323]
We present TLControl, a novel method for realistic human motion synthesis.
It incorporates both low-level Trajectory and high-level Language semantics controls.
It is practical for interactive and high-quality animation generation.
arXiv Detail & Related papers (2023-11-28T18:54:16Z) - InterControl: Zero-shot Human Interaction Generation by Controlling Every Joint [67.6297384588837]
We introduce a novel controllable motion generation method, InterControl, to encourage the synthesized motions to maintain the desired distance between joint pairs.
We demonstrate that the distance between joint pairs for human-wise interactions can be generated using an off-the-shelf Large Language Model.
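A minimal sketch of such a joint-pair distance objective (names, joint indices, and the direct optimization loop are all hypothetical; the paper applies this idea inside a motion generator rather than to raw tensors):

```python
"""Toy joint-pair distance loss: penalize deviation of the distance
between two joints from a desired contact distance."""
import torch

def pair_distance_loss(joints, pair, target_dist):
    """joints: (T, J, 3) joint positions over T frames.
    pair: (i, j) joint indices; target_dist: desired distance (m)."""
    i, j = pair
    dist = (joints[:, i] - joints[:, j]).norm(dim=-1)   # (T,)
    return ((dist - target_dist) ** 2).mean()

# Example: nudge a random motion so joints 20 and 45 end up 0.05 m apart.
motion = torch.randn(60, 52, 3, requires_grad=True)
opt = torch.optim.Adam([motion], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = pair_distance_loss(motion, (20, 45), 0.05)
    loss.backward()
    opt.step()
```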
arXiv Detail & Related papers (2023-11-27T14:32:33Z) - OmniControl: Control Any Joint at Any Time for Human Motion Generation [46.293854851116215]
We present a novel approach named OmniControl for incorporating flexible spatial control signals into a text-conditioned human motion generation model.
We propose analytic spatial guidance that ensures the generated motion can tightly conform to the input control signals.
At the same time, realism guidance is introduced to refine all the joints to generate more coherent motion.
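A hedged sketch of gradient-based spatial guidance in this spirit: pull specified joints toward sparse control targets by following the gradient of an L2 objective on a denoised motion sample. The diffusion model itself is stubbed out, and all shapes are hypothetical.

```python
"""Gradient-based spatial guidance on a denoised motion sample."""
import torch

def spatial_guidance(x0, targets, mask, step_size=0.1):
    """x0: (T, J, 3) denoised motion; targets: same shape;
    mask: bool, True where a control signal is specified."""
    x = x0.detach().requires_grad_(True)
    err = ((x - targets) ** 2 * mask).sum()
    (grad,) = torch.autograd.grad(err, x)
    return x0 - step_size * grad      # move controlled joints toward targets

T, J = 60, 22
x0 = torch.randn(T, J, 3)             # pretend denoiser output
targets = torch.zeros(T, J, 3)
mask = torch.zeros(T, J, 1, dtype=torch.bool)
mask[30, 0] = True                     # constrain joint 0 at frame 30
targets[30, 0] = torch.tensor([0.0, 0.9, 0.0])
x0 = spatial_guidance(x0, targets, mask)
```

In a diffusion sampler, a step like this would be applied at each denoising iteration, alongside the realism guidance the summary mentions.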
arXiv Detail & Related papers (2023-10-12T17:59:38Z) - In-Hand Object Rotation via Rapid Motor Adaptation [59.59946962428837]
We show how to design and learn a simple adaptive controller to achieve in-hand object rotation using only fingertips.
The controller is trained entirely in simulation on only cylindrical objects.
It can be directly deployed to a real robot hand to rotate dozens of objects with diverse sizes, shapes, and weights about the z-axis.
arXiv Detail & Related papers (2022-10-10T17:58:45Z) - QuestSim: Human Motion Tracking from Sparse Sensors with Simulated Avatars [80.05743236282564]
Real-time tracking of human body motion is crucial for immersive experiences in AR/VR.
We present a reinforcement learning framework that takes in sparse signals from an HMD and two controllers.
We show that a single policy can be robust to diverse locomotion styles, different body sizes, and novel environments.
arXiv Detail & Related papers (2022-09-20T00:25:54Z) - Grasp Pre-shape Selection by Synthetic Training: Eye-in-hand Shared Control on the Hannes Prosthesis [6.517935794312337]
We present an eye-in-hand learning-based approach for hand pre-shape classification from RGB sequences.
We tackle the peculiarity of the eye-in-hand setting by means of a model for the human arm trajectories.
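An illustrative sketch of sequence-based pre-shape classification (not the paper's architecture; class count, shapes, and layer sizes are hypothetical): a per-frame convolutional encoder feeds a recurrent head that classifies the grasp pre-shape from a short eye-in-hand RGB sequence.

```python
"""Hypothetical pre-shape classifier over an RGB frame sequence."""
import torch
import torch.nn as nn

class PreShapeClassifier(nn.Module):
    def __init__(self, n_classes=4, feat_dim=64):
        super().__init__()
        self.encoder = nn.Sequential(            # per-frame features
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        self.rnn = nn.LSTM(feat_dim, 128, batch_first=True)
        self.head = nn.Linear(128, n_classes)

    def forward(self, frames):                   # frames: (B, T, 3, H, W)
        B, T = frames.shape[:2]
        f = self.encoder(frames.flatten(0, 1)).view(B, T, -1)
        _, (h, _) = self.rnn(f)                  # last hidden state
        return self.head(h[-1])                  # (B, n_classes) logits

logits = PreShapeClassifier()(torch.randn(2, 8, 3, 64, 64))
```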
arXiv Detail & Related papers (2022-03-18T09:16:48Z) - Continuous Decoding of Daily-Life Hand Movements from Forearm Muscle Activity for Enhanced Myoelectric Control of Hand Prostheses [78.120734120667]
We introduce a novel method, based on a long short-term memory (LSTM) network, to continuously map forearm EMG activity onto hand kinematics.
Ours is the first reported work on the prediction of hand kinematics that uses this challenging dataset.
Our results suggest that the presented method is suitable for the generation of control signals for the independent and proportional actuation of the multiple DOFs of state-of-the-art hand prostheses.
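A minimal sketch of the EMG-to-kinematics mapping described above, assuming PyTorch; channel count, window length, and DoF count are hypothetical, not the paper's configuration:

```python
"""LSTM regressor from multi-channel forearm EMG to hand joint angles."""
import torch
import torch.nn as nn

class EMG2Kinematics(nn.Module):
    def __init__(self, n_channels=8, n_dofs=18, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, num_layers=2, batch_first=True)
        self.out = nn.Linear(hidden, n_dofs)

    def forward(self, emg):            # emg: (B, T, n_channels)
        h, _ = self.lstm(emg)
        return self.out(h)             # (B, T, n_dofs) joint angles per step

model = EMG2Kinematics()
emg_window = torch.randn(4, 200, 8)    # 4 windows of 200 EMG samples
pred = model(emg_window)               # continuous kinematic estimates
loss = nn.functional.mse_loss(pred, torch.zeros_like(pred))  # training loss
```

Because the LSTM emits one estimate per time step, such a model naturally supports the independent, proportional actuation of multiple DoFs the summary mentions.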
arXiv Detail & Related papers (2021-04-29T00:11:32Z) - Design of an Affordable Prosthetic Arm Equipped with Deep Learning Vision-Based Manipulation [0.0]
This paper lays out the complete design process of an affordable and easily accessible novel prosthetic arm.
The 3D-printed prosthetic arm is equipped with a depth camera and a closed-loop, off-policy deep learning algorithm that helps it form grasps on the object in view.
We were able to achieve a 78% grasp success rate on previously unseen objects and generalize across multiple objects for manipulation tasks.
arXiv Detail & Related papers (2021-03-03T00:35:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.