Collaborative Robotic Biopsy with Trajectory Guidance and Needle Tip
Force Feedback
- URL: http://arxiv.org/abs/2306.07129v2
- Date: Wed, 12 Jul 2023 09:16:47 GMT
- Title: Collaborative Robotic Biopsy with Trajectory Guidance and Needle Tip
Force Feedback
- Authors: Robin Mieling, Maximilian Neidhardt, Sarah Latus, Carolin Stapper,
Stefan Gerlach, Inga Kniep, Axel Heinemann, Benjamin Ondruschka and Alexander
Schlaefer
- Abstract summary: We present a collaborative robotic biopsy system that combines trajectory guidance with kinesthetic feedback to assist the physician in needle placement.
We introduce a needle design that senses forces at the needle tip based on optical coherence tomography and machine learning for real-time data processing.
We demonstrate that even smaller, deep target structures can be accurately sampled by performing post-mortem in situ biopsies of the pancreas.
- Score: 49.32653090178743
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The diagnostic value of biopsies is highly dependent on the placement of
needles. Robotic trajectory guidance has been shown to improve needle
positioning, but feedback for real-time navigation is limited. Haptic display
of needle tip forces can provide rich feedback for needle navigation by
enabling localization of tissue structures along the insertion path. We present
a collaborative robotic biopsy system that combines trajectory guidance with
kinesthetic feedback to assist the physician in needle placement. The robot
aligns the needle while the insertion is performed in collaboration with a
medical expert who controls the needle position on site. We present a needle
design that senses forces at the needle tip based on optical coherence
tomography and machine learning for real-time data processing. Our robotic
setup allows operators to sense deep tissue interfaces independent of
frictional forces to improve needle placement relative to a desired target
structure. We first evaluate needle tip force sensing in ex-vivo tissue in a
phantom study. We characterize the tip forces during insertions with constant
velocity and demonstrate the ability to detect tissue interfaces in a
collaborative user study. Participants are able to detect 91% of ex-vivo tissue
interfaces based on needle tip force feedback alone. Finally, we demonstrate
that even smaller, deep target structures can be accurately sampled by
performing post-mortem in situ biopsies of the pancreas.
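The sensing pipeline described above, mapping OCT measurements at the needle tip to an axial force estimate in real time, can be sketched as follows. The hand-crafted feature extraction and the ridge-regression model are illustrative assumptions for this sketch, not the authors' actual learned model:

```python
import numpy as np

def oct_features(ascan: np.ndarray) -> np.ndarray:
    """Reduce one OCT A-scan (intensity profile) to a small feature vector.
    These features (mean, peak, peak depth, energy) are hypothetical."""
    mag = np.abs(ascan)
    return np.array([mag.mean(), mag.max(),
                     float(np.argmax(mag)) / len(mag), (mag ** 2).sum()])

class TipForceEstimator:
    """Ridge regression from OCT features to axial tip force — an
    illustrative stand-in for the paper's machine-learning model."""
    def __init__(self, alpha: float = 1e-3):
        self.alpha = alpha
        self.w = None

    def fit(self, scans, forces):
        X = np.stack([oct_features(s) for s in scans])
        X = np.hstack([X, np.ones((len(X), 1))])      # bias column
        A = X.T @ X + self.alpha * np.eye(X.shape[1])  # ridge normal equations
        self.w = np.linalg.solve(A, X.T @ np.asarray(forces))
        return self

    def predict(self, scan) -> float:
        x = np.append(oct_features(scan), 1.0)
        return float(x @ self.w)
```

Because the estimate is computed from measurements at the tip itself, it is independent of shaft friction, which is the property the haptic display relies on.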
Related papers
- Bifurcation Identification for Ultrasound-driven Robotic Cannulation [11.50984693836901]
In trauma and critical care settings, rapid and precise intravascular access is key to patients' survival.
Vessel bifurcations are anatomical landmarks that can guide the safe placement of catheters or needles during medical procedures.
No existing algorithm can autonomously extract vessel bifurcations using ultrasound images.
We introduce BIFURC, a novel algorithm that identifies vessel bifurcations and provides optimal needle insertion sites for an autonomous robotic cannulation system.
arXiv Detail & Related papers (2024-09-10T18:53:52Z)
- Real-time guidewire tracking and segmentation in intraoperative x-ray [52.51797358201872]
We propose a two-stage deep learning framework for real-time guidewire segmentation and tracking.
In the first stage, a Yolov5 detector is trained, using the original X-ray images as well as synthetic ones, to output the bounding boxes of possible target guidewires.
In the second stage, a novel and efficient network is proposed to segment the guidewire in each detected bounding box.
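The two-stage structure (detect first, then segment only inside each detected box) can be sketched with simple threshold-based stand-ins for the Yolov5 detector and the segmentation network; both stand-ins are assumptions for illustration only:

```python
import numpy as np

def detect_boxes(image, thresh=0.5):
    """Stage 1 stand-in for the detector: one bounding box around bright
    pixels (illustrative; the paper trains a Yolov5 detector)."""
    ys, xs = np.where(image > thresh)
    if len(xs) == 0:
        return []
    return [(int(xs.min()), int(ys.min()), int(xs.max()) + 1, int(ys.max()) + 1)]

def segment_crop(crop, thresh=0.5):
    """Stage 2 stand-in for the segmentation network: per-pixel mask."""
    return (crop > thresh).astype(np.uint8)

def two_stage_pipeline(image):
    """Detect, then segment only inside each detected box — the structure
    of the two-stage framework described above."""
    results = []
    for (x0, y0, x1, y1) in detect_boxes(image):
        mask = np.zeros(image.shape, dtype=np.uint8)
        mask[y0:y1, x0:x1] = segment_crop(image[y0:y1, x0:x1])
        results.append(((x0, y0, x1, y1), mask))
    return results
```

Restricting segmentation to the detected boxes is what makes the second stage cheap enough for real-time use.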
arXiv Detail & Related papers (2024-04-12T20:39:19Z)
- CathFlow: Self-Supervised Segmentation of Catheters in Interventional Ultrasound Using Optical Flow and Transformers [66.15847237150909]
We introduce a self-supervised deep learning architecture to segment catheters in longitudinal ultrasound images.
The network architecture builds upon AiAReSeg, a segmentation transformer built with the Attention in Attention mechanism.
We validated our model on a test dataset, consisting of unseen synthetic data and images collected from silicon aorta phantoms.
arXiv Detail & Related papers (2024-03-21T15:13:36Z)
- EyeLS: Shadow-Guided Instrument Landing System for Intraocular Target Approaching in Robotic Eye Surgery [51.05595735405451]
Robotic ophthalmic surgery is an emerging technology to facilitate high-precision interventions such as retina penetration in subretinal injection and removal of floating tissues in retinal detachment.
Current image-based methods cannot effectively estimate the needle tip's trajectory towards both retinal and floating targets.
We propose to use the shadow positions of the target and the instrument tip to estimate their relative depth position.
Our method successfully approaches targets on a retina model, achieving average depth errors of 0.0127 mm for floating targets and 0.3473 mm for retinal targets in the surgical simulator.
arXiv Detail & Related papers (2023-11-15T09:11:37Z)
- Medical needle tip tracking based on Optical Imaging and AI [0.0]
This paper presents an innovative technology for needle tip real-time tracking, aiming for enhanced needle insertion guidance.
Specifically, our approach creates scattering images with an optical-fiber-equipped needle and uses Convolutional Neural Network (CNN) based algorithms to estimate the needle tip's position and orientation in real time.
Given an average femoral artery radius of 4 to 5 mm, the proposed system demonstrates great potential for precise needle guidance in femoral artery insertion procedures.
arXiv Detail & Related papers (2023-08-28T10:30:08Z)
- Tissue Classification During Needle Insertion Using Self-Supervised Contrastive Learning and Optical Coherence Tomography [53.38589633687604]
We propose a deep neural network that classifies the tissues from the phase and intensity data of complex OCT signals acquired at the needle tip.
We show that with 10% of the training set, our proposed pretraining strategy helps the model achieve an F1 score of 0.84 whereas the model achieves an F1 score of 0.60 without it.
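A SimCLR-style NT-Xent objective is one common choice for this kind of contrastive pretraining; the sketch below is a generic illustration of the idea, not necessarily the exact loss used in that paper:

```python
import numpy as np

def nt_xent_loss(z1, z2, tau=0.5):
    """NT-Xent contrastive loss: z1[i] and z2[i] are embeddings of two
    augmented views of the same OCT signal (the positive pair); all other
    embeddings in the batch serve as negatives."""
    z = np.vstack([z1, z2])
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = z @ z.T / tau
    n = len(z1)
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    targets = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return float(-logp[np.arange(2 * n), targets].mean())
```

Minimizing this loss pulls the two views of each signal together in embedding space, which is what lets a downstream classifier reach a high F1 score from only 10% of the labels.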
arXiv Detail & Related papers (2023-04-26T14:11:04Z)
- Towards Autonomous Atlas-based Ultrasound Acquisitions in Presence of Articulated Motion [48.52403516006036]
This paper proposes a vision-based approach allowing autonomous robotic US limb scanning.
To this end, an atlas MRI template of a human arm with annotated vascular structures is used to generate trajectories.
In all cases, the system can successfully acquire the planned vascular structure on volunteers' limbs.
arXiv Detail & Related papers (2022-08-10T15:39:20Z)
- A CNN Segmentation-Based Approach to Object Detection and Tracking in Ultrasound Scans with Application to the Vagus Nerve Detection [17.80391011147757]
We propose a deep learning framework to automatically detect and track a specific anatomical target structure in ultrasound scans.
Our framework is designed to be accurate and robust across subjects and imaging devices, to operate in real-time, and to not require a large training set.
We tested the framework on two different ultrasound datasets with the aim to detect and track the Vagus nerve, where it outperformed current state-of-the-art real-time object detection networks.
arXiv Detail & Related papers (2021-06-25T19:12:46Z)
- A Recurrent Neural Network Approach to Roll Estimation for Needle Steering [5.556129660751467]
Steerable needles are a promising technology for delivering targeted therapies in the body.
Current sensors do not provide full orientation information or interfere with the needle's ability to deliver therapy.
We propose a model-free, learning-based method that leverages LSTM neural networks to estimate the needle tip's orientation online.
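A minimal, untrained LSTM cell illustrates the recurrent state update behind such online estimation; the weight initialization, sizes, and linear roll readout here are hypothetical, not the paper's trained network:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Single LSTM cell in numpy. Weights are random placeholders; in the
    paper's setting they would be trained on needle-steering data."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((4 * n_hidden, n_in + n_hidden)) * 0.1
        self.b = np.zeros(4 * n_hidden)

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input/forget/output gates
        c = f * c + i * np.tanh(g)                     # update cell state
        h = o * np.tanh(c)                             # new hidden state
        return h, c

def roll_estimate(h, w_out):
    """Hypothetical linear readout mapping hidden state to a roll angle."""
    return float(w_out @ h)
```

At each time step, the new sensor reading updates the hidden state, so the roll estimate is available online without an explicit kinematic model.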
arXiv Detail & Related papers (2021-01-13T03:40:00Z)
- Towards Augmented Reality-based Suturing in Monocular Laparoscopic Training [0.5707453684578819]
The paper proposes an Augmented Reality environment with quantitative and qualitative visual representations to enhance laparoscopic training outcomes performed on a silicone pad.
This is enabled by a multi-task supervised deep neural network which performs multi-class segmentation and depth map prediction.
The network achieves a dice score of 0.67 for surgical needle segmentation, 0.81 for needle holder instrument segmentation and a mean absolute error of 6.5 mm for depth estimation.
arXiv Detail & Related papers (2020-01-19T19:59:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.