Automatic breach detection during spine pedicle drilling based on
vibroacoustic sensing
- URL: http://arxiv.org/abs/2303.15114v1
- Date: Mon, 27 Mar 2023 11:32:14 GMT
- Title: Automatic breach detection during spine pedicle drilling based on
vibroacoustic sensing
- Authors: Aidana Massalimova, Maikel Timmermans, Nicola Cavalcanti, Daniel
Suter, Matthias Seibold, Fabio Carrillo, Christoph J. Laux, Reto Sutter,
Mazda Farshad, Kathleen Denis, Philipp Fürnstahl
- Abstract summary: This work proposes a new radiation-free breach detection algorithm leveraging a non-visual sensor setup combined with a deep learning approach.
Multiple vibroacoustic sensors (a contact microphone, a free-field microphone, and a tri-axial accelerometer) together with an optical tracking system were integrated into the setup.
The proposed method shows the potential of non-visual sensor fusion for avoiding screw misplacement and accidental bone breaches during pedicle drilling.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Pedicle drilling is a complex and critical spinal surgery task. Detecting
breach, i.e., penetration of the surgical tool through the cortical wall during
pilot-hole drilling, is essential to avoid damage to vital anatomical structures
adjacent to the pedicle, such as the spinal cord, blood vessels, and nerves.
Currently, pedicle drilling is guided using image-based methods that are
radiation intensive and limited to preoperative information. This work proposes
a new radiation-free breach detection algorithm leveraging a non-visual sensor
setup combined with a deep learning approach. Multiple
vibroacoustic sensors, such as a contact microphone, a free-field microphone, a
tri-axial accelerometer, a uni-axial accelerometer, and an optical tracking
system were integrated into the setup. Data were collected on four cadaveric
human spines, ranging from L5 to T10. An experienced spine surgeon drilled the
pedicles relying on optical navigation. A new automatic labeling method based
on the tracking data was introduced. The labeled data were subsequently fed to
the network as mel-spectrograms and classified into breach and non-breach.
Different sensor types, sensor positioning, and their combinations were
evaluated. The best breach recall for individual sensors was achieved using
contact microphones attached to the dorsal skin (85.8%) and uni-axial
accelerometers clamped to the spinous process of the drilled vertebra (81.0%).
The best-performing data fusion model combined the latter two sensors with a
breach recall of 98%. The proposed method demonstrates the potential of
non-visual sensor fusion for avoiding screw misplacement and accidental bone
breaches during pedicle drilling and could be extended to further surgical
applications.
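The pipeline described above (vibroacoustic signals converted to mel-spectrograms and fed to a breach/non-breach classifier) can be sketched with a minimal NumPy mel-spectrogram front end. This is an illustrative sketch only: the signal is synthetic and the parameters (sampling rate, `n_fft`, `hop`, `n_mels`) are assumed values, not the ones used in the paper.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr):
    # Triangular filters spaced evenly on the mel scale.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def mel_spectrogram(signal, sr, n_fft=512, hop=128, n_mels=40):
    # Frame the signal, window each frame, take the power spectrum,
    # then project onto the mel filterbank and convert to dB.
    window = np.hanning(n_fft)
    frames = np.array([signal[s:s + n_fft] * window
                       for s in range(0, len(signal) - n_fft + 1, hop)])
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2
    mel = power @ mel_filterbank(n_mels, n_fft, sr).T
    return 10.0 * np.log10(mel + 1e-10)

# Toy stand-in for a one-second sensor recording: a 2 kHz tone plus noise.
np.random.seed(0)
sr = 16000
t = np.arange(sr) / sr
sig = np.sin(2 * np.pi * 2000 * t) + 0.1 * np.random.randn(sr)
spec = mel_spectrogram(sig, sr)
print(spec.shape)  # (122, 40): (time frames, mel bands)
```

In a full system, each spectrogram (or a stack of per-sensor spectrograms, for the fusion models) would be passed to a small CNN that outputs a breach/non-breach label per window.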
Related papers
- Attention on the Wires (AttWire): A Foundation Model for Detecting Devices and Catheters in X-ray Fluoroscopic Images [0.4064887614767072]
A novel attention mechanism was designed to guide a convolution neural network (CNN) model to the areas of wires in X-ray images.
A lightweight foundation model can be created to detect multiple objects simultaneously with higher precision and real-time speed.
arXiv Detail & Related papers (2025-03-08T12:20:22Z) - CathFlow: Self-Supervised Segmentation of Catheters in Interventional Ultrasound Using Optical Flow and Transformers [66.15847237150909]
We introduce a self-supervised deep learning architecture to segment catheters in longitudinal ultrasound images.
The network architecture builds upon AiAReSeg, a segmentation transformer built with the Attention in Attention mechanism.
We validated our model on a test dataset consisting of unseen synthetic data and images collected from silicone aorta phantoms.
arXiv Detail & Related papers (2024-03-21T15:13:36Z) - AiAReSeg: Catheter Detection and Segmentation in Interventional
Ultrasound using Transformers [75.20925220246689]
Endovascular surgeries are performed using the gold standard of fluoroscopy, which uses ionising radiation to visualise catheters and vasculature.
This work proposes a solution using an adaptation of a state-of-the-art machine learning transformer architecture to detect and segment catheters in axial interventional Ultrasound image sequences.
arXiv Detail & Related papers (2023-09-25T19:34:12Z) - Medical needle tip tracking based on Optical Imaging and AI [0.0]
This paper presents an innovative technology for needle tip real-time tracking, aiming for enhanced needle insertion guidance.
Specifically, our approach creates scattering images with an optical-fiber-equipped needle and uses Convolutional Neural Network (CNN) based algorithms to enable real-time estimation of the needle tip's position and orientation.
Given the average femoral arterial radius of 4 to 5 mm, the proposed system demonstrates great potential for precise needle guidance in femoral artery insertion procedures.
arXiv Detail & Related papers (2023-08-28T10:30:08Z) - Automatic registration with continuous pose updates for marker-less
surgical navigation in spine surgery [52.63271687382495]
We present an approach that automatically solves the registration problem for lumbar spinal fusion surgery in a radiation-free manner.
A deep neural network was trained to segment the lumbar spine and simultaneously predict its orientation, yielding an initial pose for preoperative models.
Intuitive surgical guidance is provided thanks to the integration into an augmented reality based navigation system.
arXiv Detail & Related papers (2023-08-05T16:26:41Z) - Detecting the Sensing Area of A Laparoscopic Probe in Minimally Invasive
Cancer Surgery [6.0097646269887965]
In surgical oncology, it is challenging for surgeons to identify lymph nodes and completely resect cancer.
A novel tethered laparoscopic gamma detector is used to localize a preoperatively injected radiotracer.
Gamma activity visualization is challenging to present to the operator because the probe is non-imaging and it does not visibly indicate the activity on the tissue surface.
arXiv Detail & Related papers (2023-07-07T15:33:49Z) - Follow the Curve: Robotic-Ultrasound Navigation with Learning Based
Localization of Spinous Processes for Scoliosis Assessment [1.7594269512136405]
This paper introduces a robotic-ultrasound approach for spinal curvature tracking and automatic navigation.
A fully connected network with deconvolutional heads is developed to locate the spinous process efficiently with real-time ultrasound images.
We developed a new force-driven controller that automatically adjusts the probe's pose relative to the skin surface to ensure a good acoustic coupling between the probe and skin.
arXiv Detail & Related papers (2021-09-11T06:25:30Z) - Assisted Probe Positioning for Ultrasound Guided Radiotherapy Using
Image Sequence Classification [55.96221340756895]
Effective transperineal ultrasound image guidance in prostate external beam radiotherapy requires consistent alignment between probe and prostate at each session during patient set-up.
We demonstrate a method for ensuring accurate probe placement through joint classification of images and probe position data.
Using a multi-input multi-task algorithm, spatial coordinate data from an optically tracked ultrasound probe is combined with an image classifier using a recurrent neural network to generate two sets of predictions in real time.
The algorithm identified optimal probe alignment within a mean (standard deviation) range of 3.7° (1.2°).
arXiv Detail & Related papers (2020-10-06T13:55:02Z) - A Convolutional Approach to Vertebrae Detection and Labelling in Whole
Spine MRI [70.04389979779195]
We propose a novel convolutional method for the detection and identification of vertebrae in whole spine MRIs.
This involves using a learnt vector field to group detected vertebrae corners together into individual vertebral bodies.
We demonstrate the clinical applicability of this method, using it for automated scoliosis detection in both lumbar and whole spine MR scans.
arXiv Detail & Related papers (2020-07-06T09:37:12Z) - Force-Ultrasound Fusion: Bringing Spine Robotic-US to the Next "Level" [46.13840565802387]
A robotic arm automatically scans the volunteer's back along the spine by using force-ultrasound data to locate vertebral levels.
The occurrences of vertebral levels are visible on the force trace as peaks, which are enhanced by properly controlling the force applied by the robot on the patient back.
The fusion method is able to correctly classify 100% of the vertebral levels in the test set, while pure image and pure force-based method could only classify 80% and 90% vertebrae, respectively.
arXiv Detail & Related papers (2020-02-26T10:49:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.