Artificial Intelligence Enables Real-Time and Intuitive Control of
Prostheses via Nerve Interface
- URL: http://arxiv.org/abs/2203.08648v1
- Date: Wed, 16 Mar 2022 14:33:38 GMT
- Title: Artificial Intelligence Enables Real-Time and Intuitive Control of
Prostheses via Nerve Interface
- Authors: Diu Khue Luu, Anh Tuan Nguyen, Ming Jiang, Markus W. Drealan, Jian Xu,
Tong Wu, Wing-kin Tam, Wenfeng Zhao, Brian Z. H. Lim, Cynthia K. Overstreet,
Qi Zhao, Jonathan Cheng, Edward W. Keefer, Zhi Yang
- Abstract summary: The next-generation prosthetic hand that moves and feels like a real hand requires a robust neural interconnection between the human mind and machines.
Here we present a neuroprosthetic system that demonstrates this principle by employing an artificial intelligence (AI) agent to translate the amputee's movement intent through a peripheral nerve interface.
- Score: 25.870454492249863
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Objective: The next-generation prosthetic hand that moves and feels
like a real hand requires a robust neural interconnection between the human
mind and machines. Methods: Here we present a neuroprosthetic system that
demonstrates this principle by employing an artificial intelligence (AI) agent
to translate the amputee's movement intent through a peripheral nerve
interface. The AI agent is based on a recurrent neural network (RNN) and can
simultaneously decode six degrees of freedom (DOF) from multichannel nerve
data in real time.
The decoder's performance is characterized in motor decoding experiments with
three human amputees. Results: First, we show the AI agent enables amputees to
intuitively control a prosthetic hand with individual finger and wrist
movements up to 97-98% accuracy. Second, we demonstrate the AI agent's
real-time performance by measuring the reaction time and information throughput
in a hand gesture matching task. Third, we investigate the AI agent's long-term
uses and show the decoder's robust predictive performance over a 16-month
implant duration. Conclusion & significance: Our study demonstrates the
potential of AI-enabled nerve technology, underscoring the next generation of
dexterous and intuitive prosthetic hands.
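The paper does not include its decoder code here, but the idea of an RNN mapping a window of multichannel nerve samples to six DOF outputs can be sketched minimally. The Elman-style architecture, layer sizes, and random weights below are illustrative assumptions, not the authors' actual design or trained model.

```python
import math
import random

def elman_rnn_decode(x_seq, Wxh, Whh, Why, n_hidden):
    """Forward pass of a minimal Elman RNN decoder.

    x_seq: list of timesteps, each a list of nerve-channel samples.
    Returns one output value per degree of freedom, read out from
    the final hidden state.
    """
    h = [0.0] * n_hidden
    for x in x_seq:
        h = [math.tanh(sum(Wxh[j][i] * x[i] for i in range(len(x)))
                       + sum(Whh[j][k] * h[k] for k in range(n_hidden)))
             for j in range(n_hidden)]
    # Linear readout: one prediction per DOF
    return [sum(Why[d][k] * h[k] for k in range(n_hidden))
            for d in range(len(Why))]

random.seed(0)
N_CHANNELS, N_HIDDEN, N_DOF = 8, 16, 6   # hypothetical sizes
Wxh = [[random.gauss(0, 0.1) for _ in range(N_CHANNELS)] for _ in range(N_HIDDEN)]
Whh = [[random.gauss(0, 0.1) for _ in range(N_HIDDEN)] for _ in range(N_HIDDEN)]
Why = [[random.gauss(0, 0.1) for _ in range(N_HIDDEN)] for _ in range(N_DOF)]

# One 20-sample window of simulated multichannel nerve data
window = [[random.gauss(0, 1.0) for _ in range(N_CHANNELS)] for _ in range(20)]
dof = elman_rnn_decode(window, Wxh, Whh, Why, N_HIDDEN)
print(len(dof))  # -> 6
```

In a real-time setting, the decoder would be called on a sliding window of the nerve recording at each control tick, with the six outputs driving individual finger and wrist commands.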
Related papers
- Digitizing Touch with an Artificial Multimodal Fingertip [51.7029315337739]
Humans and robots both benefit from using touch to perceive and interact with the surrounding environment.
Here, we describe several conceptual and technological innovations to improve the digitization of touch.
These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities.
arXiv Detail & Related papers (2024-11-04T18:38:50Z)
- Incremental procedural and sensorimotor learning in cognitive humanoid robots [52.77024349608834]
This work presents a cognitive agent that can learn procedures incrementally.
We show the cognitive functions required in each substage and how adding new functions helps address tasks previously unsolved by the agent.
Results show that this approach is capable of solving complex tasks incrementally.
arXiv Detail & Related papers (2023-04-30T22:51:31Z)
- Spatiotemporal modeling of grip forces captures proficiency in manual robot control [5.504040521972806]
This paper builds on our previous work by exploiting Artificial Intelligence to predict individual grip force variability in manual robot control.
Statistical analyses bring to the fore skill specific temporal variations in thousands of grip forces of a complete novice and a highly proficient expert.
arXiv Detail & Related papers (2023-03-03T15:08:00Z)
- Robotic Navigation Autonomy for Subretinal Injection via Intelligent Real-Time Virtual iOCT Volume Slicing [88.99939660183881]
We propose a framework for autonomous robotic navigation for subretinal injection.
Our method consists of an instrument pose estimation method, an online registration between the robotic and the iOCT system, and trajectory planning tailored for navigation to an injection target.
Our experiments on ex-vivo porcine eyes demonstrate the precision and repeatability of the method.
arXiv Detail & Related papers (2023-01-17T21:41:21Z)
- A Rubric for Human-like Agents and NeuroAI [2.749726993052939]
Contributed research ranges widely from mimicking behaviour to testing machine learning methods.
It can be neither assumed nor expected that progress on one of these three goals will automatically translate to progress in the others.
This is clarified using examples of weak and strong neuroAI and human-like agents.
arXiv Detail & Related papers (2022-12-08T16:59:40Z)
- A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control [18.09497225404653]
We present the implementation of a neuroprosthetic hand with embedded deep learning-based control.
The neural decoder is designed based on the recurrent neural network (RNN) architecture and deployed on the NVIDIA Jetson Nano.
This enables the implementation of the neuroprosthetic hand as a portable and self-contained unit with real-time control of individual finger movements.
arXiv Detail & Related papers (2021-03-24T19:11:58Z)
- A toolbox for neuromorphic sensing in robotics [4.157415305926584]
We introduce a ROS (Robot Operating System) toolbox to encode and decode input signals coming from any type of sensor available on a robot.
This initiative is meant to stimulate and facilitate robotic integration of neuromorphic AI.
arXiv Detail & Related papers (2021-03-03T23:22:05Z)
- Where is my hand? Deep hand segmentation for visual self-recognition in humanoid robots [129.46920552019247]
We propose the use of a Convolutional Neural Network (CNN) to segment the robot hand from an image in an egocentric view.
We fine-tuned the Mask-RCNN network for the specific task of segmenting the hand of the humanoid robot Vizzy.
arXiv Detail & Related papers (2021-02-09T10:34:32Z)
- Task-relevant Representation Learning for Networked Robotic Perception [74.0215744125845]
This paper presents an algorithm to learn task-relevant representations of sensory data that are co-designed with a pre-trained robotic perception model's ultimate objective.
Our algorithm aggressively compresses robotic sensory data by up to 11x more than competing methods.
arXiv Detail & Related papers (2020-11-06T07:39:08Z)
- BWCNN: Blink to Word, a Real-Time Convolutional Neural Network Approach [5.111743097836832]
We present an Artificial Intelligence (AI) system that uses eye-blinks to communicate with the outside world.
The system uses a Convolutional Neural Network (CNN) to find the blinking pattern, which is defined as a series of Open and Closed states.
arXiv Detail & Related papers (2020-06-01T20:07:44Z)
- A Developmental Neuro-Robotics Approach for Boosting the Recognition of Handwritten Digits [91.3755431537592]
Recent evidence shows that simulating children's embodied strategies can improve machine intelligence as well.
This article explores the application of embodied strategies to convolutional neural network models in the context of developmental neuro-robotics.
arXiv Detail & Related papers (2020-03-23T14:55:00Z)
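The BWCNN entry above describes a blink pattern as a series of Open and Closed eye states; the mapping from that pattern to words is not specified in the summary. As a hedged sketch only, the following assumes a hypothetical Morse-like encoding of blink durations, which is an illustrative choice and not the BWCNN paper's actual scheme.

```python
# Hypothetical decoder: collapse per-frame eye states into blink events,
# then map blink-duration patterns to letters (Morse-like; an assumption,
# not the BWCNN paper's actual encoding).
MORSE = {".-": "A", "-...": "B", "...": "S", "---": "O"}

def states_to_blinks(states, long_blink=3):
    """Turn a frame sequence of 'O' (open) / 'C' (closed) states
    into '.'/'-' blink symbols based on closure duration."""
    blinks, run = [], 0
    for s in states + ["O"]:          # sentinel flushes the last run
        if s == "C":
            run += 1
        elif run:
            blinks.append("-" if run >= long_blink else ".")
            run = 0
    return "".join(blinks)

frames = list("OOCOOCOCOO")           # three short closures
pattern = states_to_blinks(frames)
print(pattern, MORSE.get(pattern, "?"))  # -> ... S
```

In the real system a CNN would classify each frame as Open or Closed; the point here is only that a simple state machine suffices to turn that frame-level output into discrete symbols.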
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information (including all summaries) and is not responsible for any consequences arising from its use.