BreastSAM: A Study of Segment Anything Model for Breast Tumor Detection in Ultrasound Images
- URL: http://arxiv.org/abs/2305.12447v1
- Date: Sun, 21 May 2023 12:40:25 GMT
- Title: BreastSAM: A Study of Segment Anything Model for Breast Tumor Detection in Ultrasound Images
- Authors: Mingzhe Hu, Yuheng Li, Xiaofeng Yang
- Abstract summary: We investigated the Segment Anything Model (SAM) for the task of interactive segmentation of breast tumors in ultrasound images.
We explored three pre-trained model variants: ViT_h, ViT_l, and ViT_b, among which ViT_l demonstrated superior performance in terms of mean pixel accuracy, Dice score, and IoU score.
The study further evaluated the model's differential performance in segmenting malignant and benign breast tumors, with the model showing exceptional proficiency in both categories.
- Score: 2.752682633344525
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Breast cancer is one of the most common cancers among women worldwide, with
early detection significantly increasing survival rates. Ultrasound imaging is
a critical diagnostic tool that aids in early detection by providing real-time
imaging of the breast tissue. We conducted a thorough investigation of the
Segment Anything Model (SAM) for the task of interactive segmentation of breast
tumors in ultrasound images. We explored three pre-trained model variants:
ViT_h, ViT_l, and ViT_b, among which ViT_l demonstrated superior performance in
terms of mean pixel accuracy, Dice score, and IoU score. The significance of
prompt interaction in improving the model's segmentation performance was also
highlighted, with substantial improvements in performance metrics when prompts
were incorporated. The study further evaluated the model's differential
performance in segmenting malignant and benign breast tumors, with the model
showing exceptional proficiency in both categories, albeit with slightly better
performance for benign tumors. Furthermore, we analyzed the impacts of various
breast tumor characteristics - size, contrast, aspect ratio, and complexity -
on segmentation performance. Our findings reveal that tumor contrast and size
positively impact the segmentation result, while complex boundaries pose
challenges. The study provides valuable insights for using SAM as a robust and
effective algorithm for breast tumor segmentation in ultrasound images.
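The abstract above includes no code; the following is a minimal sketch of the kind of prompted inference and evaluation it describes, assuming Meta AI's segment-anything package is installed. The checkpoint path, ultrasound image, and ground-truth mask filenames are placeholders, and the paper's exact prompting protocol and preprocessing may differ.

    # Minimal sketch (not the authors' code): prompted SAM inference on an
    # ultrasound frame plus the metrics named in the abstract.
    import numpy as np
    import cv2
    from segment_anything import sam_model_registry, SamPredictor

    # One of the three pre-trained variants compared in the paper (vit_b / vit_l / vit_h).
    sam = sam_model_registry["vit_l"](checkpoint="sam_vit_l.pth")  # placeholder checkpoint path
    predictor = SamPredictor(sam)

    # Ultrasound frame as an RGB uint8 array (placeholder filename).
    image = cv2.cvtColor(cv2.imread("ultrasound_case.png"), cv2.COLOR_BGR2RGB)
    predictor.set_image(image)

    # Interactive prompt: a rough bounding box around the tumor, in (x0, y0, x1, y1) pixels.
    box = np.array([120, 80, 260, 210])
    masks, scores, _ = predictor.predict(box=box, multimask_output=False)
    pred = masks[0]  # boolean (H, W) predicted mask

    # Metrics used in the abstract: pixel accuracy, Dice score, IoU score.
    gt = cv2.imread("ultrasound_case_mask.png", cv2.IMREAD_GRAYSCALE) > 0  # placeholder mask
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum() + 1e-8)
    iou = inter / (union + 1e-8)
    pixel_acc = (pred == gt).mean()
    print(f"Dice={dice:.3f}  IoU={iou:.3f}  PixelAcc={pixel_acc:.3f}")

Swapping "vit_l" for "vit_b" or "vit_h" selects the other two pre-trained variants compared in the study; passing point_coords and point_labels to predict() corresponds to the point-prompt interaction the abstract refers to.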
Related papers
- Mask-Enhanced Segment Anything Model for Tumor Lesion Semantic Segmentation [48.107348956719775]
We introduce Mask-Enhanced SAM (M-SAM), an innovative architecture tailored for 3D tumor lesion segmentation.
We propose a novel Mask-Enhanced Adapter (MEA) within M-SAM that enriches the semantic information of medical images with positional data from coarse segmentation masks.
Our M-SAM achieves high segmentation accuracy and also exhibits robust generalization.
arXiv Detail & Related papers (2024-03-09T13:37:02Z)
- LightBTSeg: A lightweight breast tumor segmentation model using ultrasound images via dual-path joint knowledge distillation [1.9355072302703609]
We propose LightBTSeg, a dual-path joint knowledge distillation framework, for lightweight breast tumor segmentation.
We leverage the bottleneck architecture to reconstruct the original Attention U-Net.
Then, prior knowledge of the benign and malignant categories is used to design the teacher network, combined with dual-path joint knowledge distillation.
arXiv Detail & Related papers (2023-11-18T14:25:40Z)
- Comparative Analysis of Segment Anything Model and U-Net for Breast Tumor Detection in Ultrasound and Mammography Images [0.15833270109954137]
The technique employs two advanced deep learning architectures, namely U-Net and pretrained SAM, for tumor segmentation.
The U-Net model is specifically designed for medical image segmentation.
The pretrained SAM architecture incorporates a mechanism to capture spatial dependencies and generate segmentation results.
arXiv Detail & Related papers (2023-06-21T18:49:21Z)
- High-resolution synthesis of high-density breast mammograms: Application to improved fairness in deep learning based mass detection [48.88813637974911]
Computer-aided detection systems based on deep learning have shown good performance in breast cancer detection.
High-density breasts show poorer detection performance since dense tissues can mask or even simulate masses.
This study aims to improve the mass detection performance in high-density breasts using synthetic high-density full-field digital mammograms.
arXiv Detail & Related papers (2022-09-20T15:57:12Z)
- EMT-NET: Efficient multitask network for computer-aided diagnosis of breast cancer [58.720142291102135]
We propose an efficient and light-weighted learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively (see the metric sketch after this list).
arXiv Detail & Related papers (2022-01-13T05:24:40Z)
- RCA-IUnet: A residual cross-spatial attention guided inception U-Net model for tumor segmentation in breast ultrasound imaging [0.6091702876917281]
The article introduces an efficient residual cross-spatial attention guided inception U-Net (RCA-IUnet) model with minimal training parameters for tumor segmentation.
The RCA-IUnet model follows U-Net topology with residual inception depth-wise separable convolution and hybrid pooling layers.
Cross-spatial attention filters are added to suppress the irrelevant features and focus on the target structure.
arXiv Detail & Related papers (2021-08-05T10:35:06Z)
- Learned super resolution ultrasound for improved breast lesion characterization [52.77024349608834]
Super resolution ultrasound localization microscopy enables imaging of the microvasculature at the capillary level.
In this work we use a deep neural network architecture that makes effective use of signal structure to address these challenges.
By leveraging our trained network, the microvasculature structure is recovered in a short time, without prior PSF knowledge, and without requiring separability of the UCAs.
arXiv Detail & Related papers (2021-07-12T09:04:20Z)
- ESTAN: Enhanced Small Tumor-Aware Network for Breast Ultrasound Image Segmentation [0.0]
We propose a novel deep neural network architecture, namely Enhanced Small Tumor-Aware Network (ESTAN) to accurately segment breast tumors.
ESTAN introduces two encoders to extract and fuse image context information at different scales and utilizes row-column-wise kernels in the encoder to adapt to breast anatomy.
arXiv Detail & Related papers (2020-09-27T16:42:59Z)
- Robust Pancreatic Ductal Adenocarcinoma Segmentation with Multi-Institutional Multi-Phase Partially-Annotated CT Scans [25.889684822655255]
Pancreatic ductal adenocarcinoma (PDAC) segmentation is one of the most challenging tumor segmentation tasks.
Based on a new self-learning framework, we propose to train the PDAC segmentation model on data from a much larger number of patients.
Experiment results show that our proposed method provides an absolute improvement of 6.3% Dice score over the strong baseline of nnUNet trained on annotated images.
arXiv Detail & Related papers (2020-08-24T18:50:30Z)
- Spectral-Spatial Recurrent-Convolutional Networks for In-Vivo Hyperspectral Tumor Type Classification [49.32653090178743]
We demonstrate the feasibility of in-vivo tumor type classification using hyperspectral imaging and deep learning.
Our best model achieves an AUC of 76.3%, significantly outperforming previous conventional and deep learning methods.
arXiv Detail & Related papers (2020-07-02T12:00:53Z)
- STAN: Small tumor-aware network for breast ultrasound image segmentation [68.8204255655161]
We propose a novel deep learning architecture called Small Tumor-Aware Network (STAN) to improve the performance of segmenting tumors with different sizes.
The proposed approach outperformed the state-of-the-art approaches in segmenting small breast tumors.
arXiv Detail & Related papers (2020-02-03T22:25:01Z)
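Several of the capsule summaries above quote aggregate evaluation numbers, for example the accuracy, sensitivity, and specificity reported for EMT-NET. As a minimal, generic reminder of how such binary classification metrics are computed from a confusion matrix (this sketch is not taken from any of the cited papers, and the example labels are invented):

    import numpy as np

    def classification_metrics(pred_labels, true_labels):
        """Accuracy, sensitivity, and specificity for binary labels (1 = malignant)."""
        pred = np.asarray(pred_labels, dtype=bool)
        true = np.asarray(true_labels, dtype=bool)
        tp = np.sum(pred & true)      # malignant correctly flagged
        tn = np.sum(~pred & ~true)    # benign correctly cleared
        fp = np.sum(pred & ~true)
        fn = np.sum(~pred & true)
        accuracy = (tp + tn) / max(tp + tn + fp + fn, 1)
        sensitivity = tp / max(tp + fn, 1)   # true positive rate
        specificity = tn / max(tn + fp, 1)   # true negative rate
        return accuracy, sensitivity, specificity

    # Toy example with invented labels, not data from any cited study.
    acc, sen, spe = classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
    print(f"accuracy={acc:.3f} sensitivity={sen:.3f} specificity={spe:.3f}")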
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.