DUSTrack: Semi-automated point tracking in ultrasound videos
- URL: http://arxiv.org/abs/2507.14368v1
- Date: Fri, 18 Jul 2025 21:22:39 GMT
- Title: DUSTrack: Semi-automated point tracking in ultrasound videos
- Authors: Praneeth Namburi, Roger Pallarès-López, Jessica Rosendorf, Duarte Folgado, Brian W. Anthony,
- Abstract summary: This manuscript introduces DUSTrack, a semi-automated framework for tracking arbitrary points in B-mode ultrasound videos. We combine deep learning and optical flow to deliver high-quality and robust tracking across diverse anatomical structures and motion patterns. As an open-source solution, DUSTrack offers a powerful, flexible framework for point tracking to quantify tissue motion from ultrasound videos.
- Score: 0.559239450391449
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Ultrasound technology enables safe, non-invasive imaging of dynamic tissue behavior, making it a valuable tool in medicine, biomechanics, and sports science. However, accurately tracking tissue motion in B-mode ultrasound remains challenging due to speckle noise, low edge contrast, and out-of-plane movement. These challenges complicate the task of tracking anatomical landmarks over time, which is essential for quantifying tissue dynamics in many clinical and research applications. This manuscript introduces DUSTrack (Deep learning and optical flow-based toolkit for UltraSound Tracking), a semi-automated framework for tracking arbitrary points in B-mode ultrasound videos. We combine deep learning with optical flow to deliver high-quality and robust tracking across diverse anatomical structures and motion patterns. The toolkit includes a graphical user interface that streamlines the generation of high-quality training data and supports iterative model refinement. It also implements a novel optical-flow-based filtering technique that reduces high-frequency frame-to-frame noise while preserving rapid tissue motion. DUSTrack demonstrates superior accuracy compared to contemporary zero-shot point trackers and performs on par with specialized methods, establishing its potential as a general and foundational tool for clinical and biomechanical research. We demonstrate DUSTrack's versatility through three use cases: cardiac wall motion tracking in echocardiograms, muscle deformation analysis during reaching tasks, and fascicle tracking during ankle plantarflexion. As an open-source solution, DUSTrack offers a powerful, flexible framework for point tracking to quantify tissue motion from ultrasound videos. DUSTrack is available at https://github.com/praneethnamburi/DUSTrack.
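The abstract describes an optical-flow-based filtering technique that suppresses high-frequency frame-to-frame noise while preserving rapid tissue motion. The paper does not spell out the algorithm here; as one hedged illustration of the general idea, a tracked trajectory can be smoothed adaptively, trusting raw positions more when the local optical-flow magnitude is large (all names and parameters below are ours, not DUSTrack's):

```python
import numpy as np

def adaptive_smooth(traj, flow_mag, alpha_slow=0.2, alpha_fast=0.9):
    """Exponentially smooth a point trajectory, weighting raw positions
    more heavily when the measured optical-flow magnitude is large.

    traj     : (T, 2) array of tracked (x, y) positions per frame
    flow_mag : (T,) per-frame optical-flow magnitude at the point

    NOTE: a hypothetical sketch of motion-adaptive filtering, not the
    actual DUSTrack implementation.
    """
    out = np.empty_like(traj, dtype=float)
    out[0] = traj[0]
    # Normalise flow magnitude to [0, 1] to blend the two gains.
    w = np.clip(flow_mag / (flow_mag.max() + 1e-9), 0.0, 1.0)
    for t in range(1, len(traj)):
        # Low flow -> heavy smoothing (noise suppression);
        # high flow -> light smoothing (rapid motion preserved).
        alpha = alpha_slow + (alpha_fast - alpha_slow) * w[t]
        out[t] = alpha * traj[t] + (1.0 - alpha) * out[t - 1]
    return out
```

With near-zero flow the filter behaves like a strong low-pass on jitter; during fast motion it passes the raw measurements through almost unchanged, which is the trade-off the abstract describes.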
Related papers
- Taming Modern Point Tracking for Speckle Tracking Echocardiography via Impartial Motion [0.686108371431346]
This work investigates the potential of state-of-the-art point tracking methods for ultrasound, with a focus on echocardiography. By analyzing cardiac motion throughout the heart cycle in real B-mode ultrasound videos, we identify a directional motion bias that affects existing training strategies. We incorporate a set of tailored augmentations to reduce this bias and enhance tracking generalization and robustness through impartial cardiac motion.
arXiv Detail & Related papers (2025-07-14T10:18:26Z)
- Efficient Motion Prompt Learning for Robust Visual Tracking [58.59714916705317]
We propose a lightweight and plug-and-play motion prompt tracking method. It can be easily integrated into existing vision-based trackers to build a joint tracking framework. Experiments on seven tracking benchmarks demonstrate that the proposed motion module significantly improves the robustness of vision-based trackers.
arXiv Detail & Related papers (2025-05-22T07:22:58Z)
- EchoWorld: Learning Motion-Aware World Models for Echocardiography Probe Guidance [79.66329903007869]
We present EchoWorld, a motion-aware world modeling framework for probe guidance. It encodes anatomical knowledge and motion-induced visual dynamics, and is trained on more than one million ultrasound images from over 200 routine scans.
arXiv Detail & Related papers (2025-04-17T16:19:05Z)
- LiteTracker: Leveraging Temporal Causality for Accurate Low-latency Tissue Tracking [84.52765560227917]
LiteTracker is a low-latency method for tissue tracking in endoscopic video streams. It builds on a state-of-the-art long-term point tracking method and introduces a set of training-free runtime optimizations.
arXiv Detail & Related papers (2025-04-14T05:53:57Z)
- MambaXCTrack: Mamba-based Tracker with SSM Cross-correlation and Motion Prompt for Ultrasound Needle Tracking [8.559434917518935]
A Mamba-based US needle tracker, MambaXCTrack, is proposed to provide feedback on the needle tip position via US imaging. The proposed tracker outperforms other state-of-the-art trackers, while ablation studies further highlight the effectiveness of each proposed tracking module.
arXiv Detail & Related papers (2024-11-13T07:27:56Z)
- EchoTracker: Advancing Myocardial Point Tracking in Echocardiography [0.6263680699548959]
EchoTracker is a two-fold coarse-to-fine model that facilitates the tracking of queried points on a tissue surface across ultrasound image sequences.
Experiments demonstrate that the model outperforms SOTA methods, with an average position accuracy of 67% and a median trajectory error of 2.86 pixels.
This implies that learning-based point tracking can potentially improve performance and yield a higher diagnostic and prognostic value for clinical measurements.
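EchoTracker is evaluated with a median trajectory error in pixels. The entry does not define the metric, but a common formulation is the median Euclidean distance between predicted and ground-truth point positions over all points and frames; a minimal sketch (function name and array layout are ours):

```python
import numpy as np

def median_trajectory_error(pred, gt):
    """Median Euclidean distance (in pixels) between predicted and
    ground-truth point trajectories.

    pred, gt : (N, T, 2) arrays of N tracked points over T frames.

    NOTE: an assumed, common definition of the metric; the paper may
    compute it differently.
    """
    # Per-point, per-frame Euclidean distances, shape (N, T).
    err = np.linalg.norm(np.asarray(pred) - np.asarray(gt), axis=-1)
    return float(np.median(err))
```

Using the median rather than the mean makes the reported error robust to the occasional point that drifts far off track, which is frequent in speckle-heavy ultrasound.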
arXiv Detail & Related papers (2024-05-14T13:24:51Z)
- CathFlow: Self-Supervised Segmentation of Catheters in Interventional Ultrasound Using Optical Flow and Transformers [66.15847237150909]
We introduce a self-supervised deep learning architecture to segment catheters in longitudinal ultrasound images.
The network architecture builds upon AiAReSeg, a segmentation transformer built with the Attention in Attention mechanism.
We validated our model on a test dataset consisting of unseen synthetic data and images collected from silicone aorta phantoms.
arXiv Detail & Related papers (2024-03-21T15:13:36Z)
- Motion-Guided Dual-Camera Tracker for Endoscope Tracking and Motion Analysis in a Mechanical Gastric Simulator [5.073179848641095]
A motion-guided dual-camera vision tracker is proposed to provide robust and accurate tracking of the endoscope tip's 3D position. The proposed tracker achieves superior performance against state-of-the-art vision trackers, with 42% and 72% improvements over the second-best method in average error and maximum error, respectively.
arXiv Detail & Related papers (2024-03-08T08:31:46Z) - AiAReSeg: Catheter Detection and Segmentation in Interventional
Ultrasound using Transformers [75.20925220246689]
Endovascular surgeries are performed using the gold standard of fluoroscopy, which uses ionising radiation to visualise catheters and vasculature.
This work proposes a solution using an adaptation of a state-of-the-art machine learning transformer architecture to detect and segment catheters in axial interventional Ultrasound image sequences.
arXiv Detail & Related papers (2023-09-25T19:34:12Z)
- Deep Learning for Ultrasound Beamforming [120.12255978513912]
Beamforming, the process of mapping received ultrasound echoes to the spatial image domain, lies at the heart of the ultrasound image formation chain.
Modern ultrasound imaging leans heavily on innovations in powerful digital receive channel processing.
Deep learning methods can play a compelling role in the digital beamforming pipeline.
arXiv Detail & Related papers (2021-09-23T15:15:21Z)
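The beamforming entry above describes mapping received ultrasound echoes to the spatial image domain. For intuition, the classical baseline that deep-learning beamformers aim to improve on is delay-and-sum: for each image pixel, compute the round-trip delay to every array element, index the channel data at those delays, and sum. A deliberately naive sketch (linear receive array, plane-wave transmit assumption, all names ours):

```python
import numpy as np

def delay_and_sum(rf, elem_x, grid_x, grid_z, fs, c=1540.0):
    """Naive delay-and-sum beamformer for a linear array.

    rf             : (n_elem, n_samp) received RF channel data
    elem_x         : (n_elem,) lateral element positions [m]
    grid_x, grid_z : image pixel coordinates [m]
    fs             : sampling rate [Hz]; c : speed of sound [m/s]

    NOTE: an illustrative textbook sketch (no apodization or
    interpolation), not any specific paper's pipeline. Assumes t=0 is
    a plane-wave transmit at z=0.
    """
    n_elem, n_samp = rf.shape
    img = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            # Transmit delay (plane wave to depth z) plus per-element
            # receive delay back to each transducer element.
            t_rx = np.sqrt((x - elem_x) ** 2 + z ** 2) / c
            idx = ((z / c + t_rx) * fs).astype(int)
            idx = np.clip(idx, 0, n_samp - 1)
            # Coherent sum of delayed channel samples for this pixel.
            img[iz, ix] = rf[np.arange(n_elem), idx].sum()
    return img
```

Echoes from a true scatterer add coherently at its pixel and incoherently elsewhere; learned beamformers replace parts of this delay/weight/sum chain with trained operators.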
This list is automatically generated from the titles and abstracts of the papers on this site.