Learning Autonomous Ultrasound via Latent Task Representation and
Robotic Skills Adaptation
- URL: http://arxiv.org/abs/2307.13323v1
- Date: Tue, 25 Jul 2023 08:32:36 GMT
- Title: Learning Autonomous Ultrasound via Latent Task Representation and
Robotic Skills Adaptation
- Authors: Xutian Deng, Junnan Jiang, Wen Cheng and Miao Li
- Abstract summary: We propose the latent task representation and the robotic skills adaptation for autonomous ultrasound in this paper.
During the offline stage, the multimodal ultrasound skills are merged and encapsulated into a low-dimensional probability model.
During the online stage, the probability model will select and evaluate the optimal prediction.
- Score: 2.3830437836694185
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As medical ultrasound becomes a prevailing examination approach,
robotic ultrasound systems can facilitate the scanning process and relieve
professional sonographers of repetitive and tedious work. Despite recent
progress, enabling robots to autonomously accomplish an ultrasound examination
remains a challenge, largely due to the lack of both a proper task
representation method and an adaptation approach to generalize learned
skills across different patients. To solve these problems, we propose the
latent task representation and the robotic skills adaptation for autonomous
ultrasound in this paper. During the offline stage, the multimodal ultrasound
skills are merged and encapsulated into a low-dimensional probability model
through a fully self-supervised framework, which takes clinically demonstrated
ultrasound images, probe orientations, and contact forces into account. During
the online stage, the probability model will select and evaluate the optimal
prediction. For unstable singularities, the adaptive optimizer fine-tunes them
to nearby, stable predictions in high-confidence regions. Experimental results
show that the proposed approach can generate complex ultrasound strategies for
diverse populations and achieve significantly better quantitative results than
our previous method.
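The offline/online pipeline in the abstract can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the single-Gaussian model over [image feature, probe orientation, contact force] vectors, the synthetic demonstration data, and the covariance-preconditioned ascent step standing in for the "adaptive optimizer" are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Offline stage (illustrative): demonstrations as random stand-ins for
# [image feature, probe orientation, contact force] triples, encapsulated
# into a low-dimensional probability model (here a single Gaussian).
demos = rng.normal(loc=[0.5, 0.2, 3.0], scale=[0.1, 0.05, 0.4], size=(200, 3))
mu = demos.mean(axis=0)
cov = np.cov(demos, rowvar=False)
cov_inv = np.linalg.inv(cov)

def log_density(x):
    """Unnormalized Gaussian log-density: the confidence of a candidate
    prediction under the learned task representation."""
    d = x - mu
    return -0.5 * d @ cov_inv @ d

def adapt(x, lr=0.1, steps=50):
    """Online stage (illustrative): covariance-preconditioned ascent on the
    log-density, which fine-tunes a low-confidence (unstable) prediction
    toward the high-confidence region of the model."""
    for _ in range(steps):
        x = x + lr * (mu - x)
    return x

# An unstable prediction far from the demonstrated skills...
outlier = np.array([2.0, -1.0, 6.0])
# ...is pulled to a nearby point where the model is confident.
refined = adapt(outlier)
```

The preconditioning by the covariance keeps each step a simple contraction toward the mean, so the refined prediction always scores a strictly higher log-density than the original outlier.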
Related papers
- Transforming Surgical Interventions with Embodied Intelligence for Ultrasound Robotics [24.014073238400137]
This paper introduces a novel Ultrasound Embodied Intelligence system that combines ultrasound robots with large language models (LLMs) and domain-specific knowledge augmentation.
Our approach employs a dual strategy, first integrating LLMs with ultrasound robots to translate doctors' verbal instructions into precise motion planning.
Our findings suggest that the proposed system improves the efficiency and quality of ultrasound scans and paves the way for further advancements in autonomous medical scanning technologies.
arXiv Detail & Related papers (2024-06-18T14:22:16Z) - Enhancing Surgical Robots with Embodied Intelligence for Autonomous Ultrasound Scanning [24.014073238400137]
Ultrasound robots are increasingly used in medical diagnostics and early disease screening.
Current ultrasound robots lack the intelligence to understand human intentions and instructions.
We propose a novel Ultrasound Embodied Intelligence system that equips ultrasound robots with the large language model and domain knowledge.
arXiv Detail & Related papers (2024-05-01T11:39:38Z) - CathFlow: Self-Supervised Segmentation of Catheters in Interventional Ultrasound Using Optical Flow and Transformers [66.15847237150909]
We introduce a self-supervised deep learning architecture to segment catheters in longitudinal ultrasound images.
The network architecture builds upon AiAReSeg, a segmentation transformer built with the Attention in Attention mechanism.
We validated our model on a test dataset consisting of unseen synthetic data and images collected from silicone aorta phantoms.
arXiv Detail & Related papers (2024-03-21T15:13:36Z) - Validating polyp and instrument segmentation methods in colonoscopy through Medico 2020 and MedAI 2021 Challenges [58.32937972322058]
This paper summarizes the "Medico automatic polyp segmentation (Medico 2020)" and "MedAI: Transparency in Medical Image (MedAI 2021)" competitions.
We present a comprehensive summary of and analyze each contribution, highlight the strengths of the best-performing methods, and discuss the possibility of translating such methods into the clinic.
arXiv Detail & Related papers (2023-07-30T16:08:45Z) - Towards a Simple Framework of Skill Transfer Learning for Robotic
Ultrasound-guidance Procedures [0.0]
We briefly review challenges in skill transfer learning for robotic ultrasound-guidance procedures.
We propose a simple framework of skill transfer learning for real-time applications in robotic ultrasound-guidance procedures.
arXiv Detail & Related papers (2023-05-06T10:37:13Z) - Robotic Navigation Autonomy for Subretinal Injection via Intelligent
Real-Time Virtual iOCT Volume Slicing [88.99939660183881]
We propose a framework for autonomous robotic navigation for subretinal injection.
Our method consists of an instrument pose estimation method, an online registration between the robotic and the iOCT system, and trajectory planning tailored for navigation to an injection target.
Our experiments on ex-vivo porcine eyes demonstrate the precision and repeatability of the method.
arXiv Detail & Related papers (2023-01-17T21:41:21Z) - Localizing Scan Targets from Human Pose for Autonomous Lung Ultrasound
Imaging [61.60067283680348]
With the advent of the COVID-19 global pandemic, there is a need to fully automate ultrasound imaging.
We propose a vision-based, data-driven method that incorporates learning-based computer vision techniques.
Our method attains an accuracy level of 15.52 (9.47) mm for probe positioning and 4.32 (3.69) deg for probe orientation, with a success rate above 80% under an error threshold of 25 mm for all scan targets.
arXiv Detail & Related papers (2022-12-15T14:34:12Z) - Ultrasound Signal Processing: From Models to Deep Learning [64.56774869055826]
Medical ultrasound imaging relies heavily on high-quality signal processing to provide reliable and interpretable image reconstructions.
Deep learning based methods, which are optimized in a data-driven fashion, have gained popularity.
A relatively new paradigm combines the power of the two: leveraging data-driven deep learning, as well as exploiting domain knowledge.
arXiv Detail & Related papers (2022-04-09T13:04:36Z) - Learning Ultrasound Scanning Skills from Human Demonstrations [6.971573270058377]
We propose a learning-based framework to acquire ultrasound scanning skills from human demonstrations.
The parameters of the model are learned using the data collected from skilled sonographers' demonstrations.
The robustness of the proposed framework is validated with the experiments on real data from sonographers.
arXiv Detail & Related papers (2021-11-09T12:29:25Z) - Learning Robotic Ultrasound Scanning Skills via Human Demonstrations and
Guided Explorations [12.894853456160924]
We propose a learning-based approach to learn the robotic ultrasound scanning skills from human demonstrations.
First, the robotic ultrasound scanning skill is encapsulated into a high-dimensional multi-modal model, which takes the ultrasound images, the pose/position of the probe and the contact force into account.
Second, we leverage the power of imitation learning to train the multi-modal model with the training data collected from the demonstrations of experienced ultrasound physicians.
arXiv Detail & Related papers (2021-11-02T14:38:09Z) - Deep Learning for Ultrasound Beamforming [120.12255978513912]
Beamforming, the process of mapping received ultrasound echoes to the spatial image domain, lies at the heart of the ultrasound image formation chain.
Modern ultrasound imaging leans heavily on innovations in powerful digital receive channel processing.
Deep learning methods can play a compelling role in the digital beamforming pipeline.
arXiv Detail & Related papers (2021-09-23T15:15:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.