Mapping Surgeon's Hand/Finger Motion During Conventional Microsurgery to
Enhance Intuitive Surgical Robot Teleoperation
- URL: http://arxiv.org/abs/2102.10585v1
- Date: Sun, 21 Feb 2021 11:21:30 GMT
- Title: Mapping Surgeon's Hand/Finger Motion During Conventional Microsurgery to
Enhance Intuitive Surgical Robot Teleoperation
- Authors: Mohammad Fattahi Sani, Raimondo Ascione, Sanja Dogramadzi
- Abstract summary: Current human-robot interfaces lack intuitive teleoperation and cannot mimic the surgeon's hand/finger sensing and fine motion.
We report a pilot study showing an intuitive way of recording and mapping the surgeon's gross hand motion and fine synergic motion during cardiac micro-surgery.
- Score: 0.5635300481123077
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Purpose: Recent developments in robotics and artificial intelligence (AI)
have led to significant advances in healthcare technologies, enhancing
robot-assisted minimally invasive surgery (RAMIS) in some surgical specialties.
However, current human-robot interfaces lack intuitive teleoperation and cannot
mimic the surgeon's hand/finger sensing and fine motion. These limitations make
tele-operated robotic surgery unsuitable for micro-surgery and difficult for
established surgeons to learn. We report a pilot study showing an intuitive way
of recording and mapping the surgeon's gross hand motion and fine synergic
motion during cardiac micro-surgery as a way to enhance future intuitive
teleoperation. Methods: We set out to develop a prototype system able to train
a Deep Neural Network (DNN) by mapping wrist, hand and surgical tool real-time
data acquisition (RTDA) inputs during mock-up heart micro-surgery procedures.
The trained network was used to estimate tool poses from refined hand joint
angles. Results: Based on the surgeon's feedback during mock micro-surgery, the
developed wearable system with light-weight motion-tracking sensors did not
interfere with the surgery or instrument handling. The wearable motion tracking
system used 15 finger-thumb-wrist joint angle sensors to generate meaningful
data sets representing the DNN inputs, with new hand joint angles added as
necessary based on comparing the estimated tool poses against the measured tool
pose. The DNN architecture was optimized for the highest estimation accuracy
and the ability to determine the tool pose with the least mean squared error.
This novel approach showed that the surgical instrument's pose, an essential
requirement for teleoperation, can be accurately estimated from the recorded
surgeon's hand/finger movements with a mean squared error (MSE) of less than
0.3%.
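The mapping the abstract describes lends itself to a compact regression model. The following is a minimal sketch, not the authors' implementation: the layer sizes, the 6-DoF Euler-angle pose output, and the synthetic placeholder data are all assumptions; only the 15-joint-angle input, the tool-pose target, and the mean-squared-error objective come from the abstract.

```python
# Minimal sketch (assumed architecture, hypothetical names): a feed-forward
# regressor mapping 15 finger-thumb-wrist joint angles to a 6-DoF tool pose,
# trained with an MSE loss as described in the abstract.
import torch
import torch.nn as nn

class HandToToolPoseNet(nn.Module):
    """15 joint angles -> estimated tool pose (x, y, z, roll, pitch, yaw)."""
    def __init__(self, n_joints: int = 15, n_pose: int = 6, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_joints, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_pose),
        )

    def forward(self, joint_angles: torch.Tensor) -> torch.Tensor:
        return self.net(joint_angles)

model = HandToToolPoseNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder tensors standing in for the wearable-sensor joint angles and the
# measured tool poses (both synthetic here so the sketch runs standalone).
joint_angles = torch.randn(256, 15)
tool_poses = torch.randn(256, 6)

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(joint_angles), tool_poses)
    loss.backward()
    optimizer.step()
```

In practice the input and target tensors would come from the recorded sensor streams and the measured instrument poses, and the network width and depth would be tuned against the pose-estimation error, as the paper reports doing.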
Related papers
- Creating a Digital Twin of Spinal Surgery: A Proof of Concept [68.37190859183663]
Surgery digitalization is the process of creating a virtual replica of real-world surgery.
We present a proof of concept (PoC) for surgery digitalization that is applied to an ex-vivo spinal surgery.
We employ five RGB-D cameras for dynamic 3D reconstruction of the surgeon, a high-end camera for 3D reconstruction of the anatomy, an infrared stereo camera for surgical instrument tracking, and a laser scanner for 3D reconstruction of the operating room and data fusion.
arXiv Detail & Related papers (2024-03-25T13:09:40Z)
- Surgical tool classification and localization: results and methods from the MICCAI 2022 SurgToolLoc challenge [69.91670788430162]
We present the results of the SurgLoc 2022 challenge.
The goal was to leverage tool presence data as weak labels for machine learning models trained to detect tools.
We conclude by discussing these results in the broader context of machine learning and surgical data science.
arXiv Detail & Related papers (2023-05-11T21:44:39Z)
- Next-generation Surgical Navigation: Marker-less Multi-view 6DoF Pose Estimation of Surgical Instruments [66.74633676595889]
We present a multi-camera capture setup consisting of static and head-mounted cameras.
Second, we publish a multi-view RGB-D video dataset of ex-vivo spine surgeries, captured in a surgical wet lab and a real operating theatre.
Third, we evaluate three state-of-the-art single-view and multi-view methods for the task of 6DoF pose estimation of surgical instruments.
arXiv Detail & Related papers (2023-05-05T13:42:19Z)
- Live image-based neurosurgical guidance and roadmap generation using unsupervised embedding [53.992124594124896]
We present a method for live image-only guidance leveraging a large data set of annotated neurosurgical videos.
A generated roadmap encodes the common anatomical paths taken in surgeries in the training set.
We trained and evaluated the proposed method with a data set of 166 transsphenoidal adenomectomy procedures.
arXiv Detail & Related papers (2023-03-31T12:52:24Z)
- Robotic Navigation Autonomy for Subretinal Injection via Intelligent Real-Time Virtual iOCT Volume Slicing [88.99939660183881]
We propose a framework for autonomous robotic navigation for subretinal injection.
Our method consists of an instrument pose estimation method, an online registration between the robotic and the iOCT system, and trajectory planning tailored for navigation to an injection target.
Our experiments on ex-vivo porcine eyes demonstrate the precision and repeatability of the method.
arXiv Detail & Related papers (2023-01-17T21:41:21Z)
- Using Hand Pose Estimation To Automate Open Surgery Training Feedback [0.0]
This research aims to facilitate the use of state-of-the-art computer vision algorithms for the automated training of surgeons.
By estimating 2D hand poses, we model the movement of the practitioner's hands, and their interaction with surgical instruments.
arXiv Detail & Related papers (2022-11-13T21:47:31Z)
- Using Computer Vision to Automate Hand Detection and Tracking of Surgeon Movements in Videos of Open Surgery [8.095095522269352]
We leverage advances in computer vision to introduce an automated approach to video analysis of surgical execution.
A state-of-the-art convolutional neural network architecture for object detection was used to detect operating hands in open surgery videos.
Our model's spatial detections of operating hands significantly outperforms the detections achieved using pre-existing hand-detection datasets.
arXiv Detail & Related papers (2020-12-13T03:10:09Z)
- Using Conditional Generative Adversarial Networks to Reduce the Effects of Latency in Robotic Telesurgery [0.0]
In surgery, any micro-delay can injure a patient severely and, in some cases, result in fatality.
Current surgical robots use calibrated sensors to measure the position of the arms and tools.
In this work we present a purely optical approach that provides a measurement of the tool position in relation to the patient's tissues.
arXiv Detail & Related papers (2020-10-07T13:40:44Z)
- Real-Time Instrument Segmentation in Robotic Surgery using Auxiliary Supervised Deep Adversarial Learning [15.490603884631764]
Real-time semantic segmentation of the robotic instruments and tissues is a crucial step in robot-assisted surgery.
We have developed a light-weight cascaded convolutional neural network (CNN) to segment the surgical instruments from high-resolution videos.
We show that our model surpasses existing algorithms for pixel-wise segmentation of surgical instruments in both prediction accuracy and segmentation time of high-resolution videos.
arXiv Detail & Related papers (2020-07-22T10:16:07Z)
- Searching for Efficient Architecture for Instrument Segmentation in Robotic Surgery [58.63306322525082]
Most applications rely on accurate real-time segmentation of high-resolution surgical images.
We design a light-weight and highly-efficient deep residual architecture which is tuned to perform real-time inference of high-resolution images.
arXiv Detail & Related papers (2020-07-08T21:38:29Z)
- Recurrent and Spiking Modeling of Sparse Surgical Kinematics [0.8458020117487898]
A growing number of studies have used machine learning to analyze video and kinematic data captured from surgical robots.
In this study, we explore the possibility of using only kinematic data to predict surgeons of similar skill levels.
We report that it is possible to identify surgical fellows receiving near perfect scores in the simulation exercises based on their motion characteristics alone.
arXiv Detail & Related papers (2020-05-12T15:41:45Z)