MindArm: Mechanized Intelligent Non-Invasive Neuro-Driven Prosthetic Arm System
- URL: http://arxiv.org/abs/2403.19992v2
- Date: Sat, 19 Oct 2024 18:23:46 GMT
- Title: MindArm: Mechanized Intelligent Non-Invasive Neuro-Driven Prosthetic Arm System
- Authors: Maha Nawaz, Abdul Basit, Muhammad Shafique
- Abstract summary: MindArm employs a deep neural network (DNN) to translate brain signals, captured by low-cost surface electroencephalogram (EEG) electrodes, into prosthetic arm movements.
The system costs approximately $500-550, including $400 for the EEG headset and $100-150 for motors, 3D printing, and assembly.
- Score: 5.528262076322921
- License:
- Abstract: Currently, individuals with arm mobility impairments (referred to as "patients") face limited technological options due to two key challenges: (1) non-invasive prosthetic devices are often prohibitively expensive and costly to maintain, and (2) invasive solutions require high-risk, costly brain surgery. As a result, current technological solutions are not accessible to patients across all financial backgrounds. To address this, we propose MindArm, a low-cost, affordable, non-invasive neuro-driven prosthetic arm system. MindArm employs a deep neural network (DNN) to translate brain signals, captured by low-cost surface electroencephalogram (EEG) electrodes, into prosthetic arm movements. Using an Open Brain Computer Interface and UDP networking for signal processing, the system seamlessly controls arm motion. In the compute module, a trained DNN model interprets the filtered micro-voltage brain signals and translates them into prosthetic arm actions, which are issued to the arm via serial communication. Experimental results from a fully functional prototype show high accuracy across three actions: 91% for idle/stationary, 85% for handshake, and 84% for cup pickup. The system costs approximately $500-550, including $400 for the EEG headset and $100-150 for motors, 3D printing, and assembly, offering an affordable alternative for mind-controlled prosthetic devices.
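For intuition, the following is a minimal sketch of how such a pipeline could be wired together in Python, assuming EEG feature windows arrive as JSON datagrams over UDP (e.g. streamed from the OpenBCI software) and the arm's microcontroller accepts newline-terminated action strings over a serial link. The port, feature size, payload format, and model architecture below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical MindArm-style pipeline: UDP EEG stream -> DNN classifier -> serial command.
# All concrete values (port, feature size, device path, payload layout) are assumptions.
import json
import socket

import numpy as np
import serial          # pyserial
import torch
import torch.nn as nn

ACTIONS = ["idle", "handshake", "cup_pickup"]   # the three actions reported in the paper
N_FEATURES = 64                                  # assumed size of one filtered EEG feature window

class EEGClassifier(nn.Module):
    """Placeholder DNN; the real model would be trained offline on recorded EEG."""
    def __init__(self, n_features: int, n_classes: int) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def main() -> None:
    model = EEGClassifier(N_FEATURES, len(ACTIONS))
    model.eval()                                  # inference only; weights assumed pre-trained

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 12345))                 # assumed UDP port for the EEG stream

    arm = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)   # assumed serial link to the arm

    while True:
        payload, _ = sock.recvfrom(65535)         # one feature window per datagram (assumed)
        features = np.asarray(json.loads(payload)["data"], dtype=np.float32)
        with torch.no_grad():
            logits = model(torch.from_numpy(features).unsqueeze(0))
            action = ACTIONS[int(logits.argmax(dim=1))]
        arm.write((action + "\n").encode())       # microcontroller parses the action string

if __name__ == "__main__":
    main()
```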
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Deep Neural Network Architecture Search for Accurate Visual Pose Estimation aboard Nano-UAVs [69.19616451596342]
Miniaturized unmanned aerial vehicles (UAVs) are an emerging and trending topic.
We leverage a novel neural architecture search (NAS) technique to automatically identify several convolutional neural networks (CNNs) for a visual pose estimation task.
Our results improve the State-of-the-Art by reducing the in-field control error by 32% while achieving a real-time onboard inference rate of 10Hz@10mW and 50Hz@90mW.
arXiv Detail & Related papers (2023-03-03T14:02:09Z)
- NeRF in the Palm of Your Hand: Corrective Augmentation for Robotics via Novel-View Synthesis [50.93065653283523]
SPARTN (Synthetic Perturbations for Augmenting Robot Trajectories via NeRF) is a fully-offline data augmentation scheme for improving robot policies.
Our approach leverages neural radiance fields (NeRFs) to synthetically inject corrective noise into visual demonstrations.
In a simulated 6-DoF visual grasping benchmark, SPARTN improves success rates by 2.8× over imitation learning without the corrective augmentations.
arXiv Detail & Related papers (2023-01-18T23:25:27Z)
- Robotic Navigation Autonomy for Subretinal Injection via Intelligent Real-Time Virtual iOCT Volume Slicing [88.99939660183881]
We propose a framework for autonomous robotic navigation for subretinal injection.
Our method consists of an instrument pose estimation method, an online registration between the robotic and the iOCT system, and trajectory planning tailored for navigation to an injection target.
Our experiments on ex-vivo porcine eyes demonstrate the precision and repeatability of the method.
arXiv Detail & Related papers (2023-01-17T21:41:21Z)
- In the realm of hybrid Brain: Human Brain and AI [0.0]
Current brain-computer interface (BCI) technology is mainly focused on therapeutic outcomes.
Recently, artificial intelligence (AI) and machine learning (ML) technologies have been used to decode brain signals.
We envision the development of closed loop, intelligent, low-power, and miniaturized neural interfaces.
arXiv Detail & Related papers (2022-10-04T08:35:34Z)
- Active Predicting Coding: Brain-Inspired Reinforcement Learning for Sparse Reward Robotic Control Problems [79.07468367923619]
We propose a backpropagation-free approach to robotic control through the neuro-cognitive computational framework of neural generative coding (NGC).
We design an agent built completely from powerful predictive coding/processing circuits that facilitate dynamic, online learning from sparse rewards.
We show that our proposed ActPC agent performs well in the face of sparse (extrinsic) reward signals and is competitive with or outperforms several powerful backprop-based RL approaches.
arXiv Detail & Related papers (2022-09-19T16:49:32Z)
- Artificial Intelligence Enables Real-Time and Intuitive Control of Prostheses via Nerve Interface [25.870454492249863]
The next-generation prosthetic hand that moves and feels like a real hand requires a robust neural interconnection between the human mind and machines.
Here we present a neuroprosthetic system to demonstrate that principle by employing an artificial intelligence (AI) agent to translate the amputee's movement intent through a peripheral nerve interface.
arXiv Detail & Related papers (2022-03-16T14:33:38Z)
- A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control [18.09497225404653]
We present the implementation of a neuroprosthetic hand with embedded deep learning-based control.
The neural decoder is designed based on the recurrent neural network (RNN) architecture and deployed on the NVIDIA Jetson Nano.
This enables the implementation of the neuroprosthetic hand as a portable and self-contained unit with real-time control of individual finger movements.
arXiv Detail & Related papers (2021-03-24T19:11:58Z)
- Design of an Affordable Prosthetic Arm Equipped with Deep Learning Vision-Based Manipulation [0.0]
This paper lays out the complete design process of an affordable and easily accessible novel prosthetic arm.
The 3D-printed prosthetic arm is equipped with a depth camera and a closed-loop, off-policy deep learning algorithm to help it form grasps on the object in view.
We were able to achieve a 78% grasp success rate on previously unseen objects and generalize across multiple objects for manipulation tasks.
arXiv Detail & Related papers (2021-03-03T00:35:06Z)
- OmniTact: A Multi-Directional High Resolution Touch Sensor [109.28703530853542]
Existing tactile sensors are either flat, have small sensitive fields or only provide low-resolution signals.
We introduce OmniTact, a multi-directional high-resolution tactile sensor.
We evaluate the capabilities of OmniTact on a challenging robotic control task.
arXiv Detail & Related papers (2020-03-16T01:31:29Z)
- DeepBrain: Towards Personalized EEG Interaction through Attentional and Embedded LSTM Learning [20.300051894095173]
We propose DeepBrain, an end-to-end solution that enables fine brain-robot interaction (BRI) through embedded learning of coarse EEG signals from low-cost devices.
Our contributions are twofold: 1) we present a stacked long short-term memory (Stacked LSTM) structure with specific pre-processing techniques to handle the time-dependency of EEG signals and their classification (a minimal sketch of such a classifier appears after this list).
Our real-world experiments demonstrate that the proposed end-to-end solution with low cost can achieve satisfactory run-time speed, accuracy and energy-efficiency.
arXiv Detail & Related papers (2020-02-06T03:34:08Z)
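As a brief illustration of the stacked-LSTM idea mentioned in the DeepBrain entry above, here is a short, self-contained sketch; the channel count, window length, and layer sizes are illustrative assumptions rather than that paper's actual architecture.

```python
# Illustrative stacked-LSTM EEG classifier in the spirit of DeepBrain
# (channel count, window length, and layer sizes are assumptions, not the
# paper's architecture).
import torch
import torch.nn as nn

class StackedLSTMClassifier(nn.Module):
    def __init__(self, n_channels: int = 8, hidden: int = 64,
                 n_layers: int = 2, n_classes: int = 3) -> None:
        super().__init__()
        # num_layers > 1 stacks LSTM layers, letting later layers model
        # longer-range time dependencies in the EEG window.
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden,
                            num_layers=n_layers, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_channels) -- one pre-processed EEG window
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # classify from the final time step

# Example: classify a batch of 4 windows, each 250 samples of 8 channels.
logits = StackedLSTMClassifier()(torch.randn(4, 250, 8))
print(logits.shape)                        # torch.Size([4, 3])
```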
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.