Multilevel Perception Boundary-guided Network for Breast Lesion Segmentation in Ultrasound Images
- URL: http://arxiv.org/abs/2310.14636v1
- Date: Mon, 23 Oct 2023 07:21:02 GMT
- Title: Multilevel Perception Boundary-guided Network for Breast Lesion Segmentation in Ultrasound Images
- Authors: Xing Yang, Jian Zhang, Qijian Chen, Li Wang and Lihui Wang
- Abstract summary: We propose PBNet, composed of a multilevel global perception module (MGPM) and a boundary guided module (BGM), to segment breast tumors from ultrasound images.
In MGPM, the long-range spatial dependencies between voxels within single-level feature maps are modeled, and then the multilevel semantic information is fused.
In BGM, the tumor boundaries are extracted from the high-level semantic maps using the dilation and erosion effects of max pooling.
- Score: 9.252383213566947
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automatic segmentation of breast tumors from ultrasound images is essential for subsequent clinical diagnosis and treatment planning. Although existing deep learning-based methods have achieved significant progress in automatic breast tumor segmentation, their performance on tumors with intensity similar to normal tissue is still unsatisfactory, especially at the tumor boundaries. To address this issue, we propose PBNet, composed of a multilevel global perception module (MGPM) and a boundary guided module (BGM), to segment breast tumors from ultrasound images. Specifically, in MGPM, the long-range spatial dependencies between voxels within single-level feature maps are modeled, and the multilevel semantic information is then fused to improve the model's ability to recognize non-enhanced tumors. In BGM, the tumor boundaries are extracted from the high-level semantic maps using the dilation and erosion effects of max pooling; these boundaries are then used to guide the fusion of low- and high-level features. Moreover, to improve the segmentation performance at tumor boundaries, a multi-level boundary-enhanced segmentation (BS) loss is proposed. Extensive comparison experiments on both a publicly available dataset and an in-house dataset demonstrate that the proposed PBNet outperforms state-of-the-art methods in terms of both qualitative visualization results and quantitative evaluation metrics, with the Dice score, Jaccard coefficient, specificity, and HD95 improved by 0.70%, 1.1%, 0.1%, and 2.5%, respectively. In addition, the ablation experiments validate that the proposed MGPM is indeed beneficial for distinguishing non-enhanced tumors, and that the BGM and the BS loss are helpful for refining the segmentation contours of the tumor.
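The dilation-and-erosion trick the abstract attributes to BGM can be made concrete in a few lines of PyTorch. The sketch below is an illustrative re-creation of the general idea rather than the paper's exact BGM: max pooling a probability map approximates morphological dilation, max pooling its complement approximates erosion, and their difference is a thin boundary band. The function name `boundary_from_probs` and the kernel size are assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def boundary_from_probs(probs: torch.Tensor, kernel_size: int = 3) -> torch.Tensor:
    """Soft boundary map from a (B, 1, H, W) probability map.

    Max pooling acts as a dilation of the map; max pooling the complement
    (1 - probs) and taking the complement again acts as an erosion. Their
    difference is non-zero only in a narrow band around the contour.
    """
    pad = kernel_size // 2
    dilated = F.max_pool2d(probs, kernel_size, stride=1, padding=pad)
    eroded = 1.0 - F.max_pool2d(1.0 - probs, kernel_size, stride=1, padding=pad)
    return dilated - eroded

# Hypothetical usage with the output of any segmentation head:
# probs = torch.sigmoid(logits)          # (B, 1, H, W) tumor probabilities
# boundary = boundary_from_probs(probs)  # could then guide feature fusion
```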
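Similarly, a boundary-enhanced segmentation loss can be sketched as a pixel-wise loss that up-weights the band around the ground-truth contour, with the band obtained by the same max-pooling trick. This is a single-level illustration under assumed names (`boundary_enhanced_bce`, `boundary_weight`); the paper's BS loss is applied at multiple levels and its exact formulation may differ.

```python
import torch
import torch.nn.functional as F

def boundary_enhanced_bce(logits: torch.Tensor,
                          target: torch.Tensor,
                          boundary_weight: float = 5.0,
                          kernel_size: int = 3) -> torch.Tensor:
    """Binary cross-entropy with extra weight on pixels near the ground-truth
    boundary, as one plausible instance of a boundary-enhanced loss."""
    pad = kernel_size // 2
    # Boundary band of the ground-truth mask via max-pooling dilation/erosion.
    dilated = F.max_pool2d(target, kernel_size, stride=1, padding=pad)
    eroded = 1.0 - F.max_pool2d(1.0 - target, kernel_size, stride=1, padding=pad)
    band = dilated - eroded                  # ~1 inside the boundary band, 0 elsewhere
    weights = 1.0 + boundary_weight * band   # emphasize boundary pixels
    return F.binary_cross_entropy_with_logits(logits, target, weight=weights)
```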
Related papers
- Towards a Benchmark for Colorectal Cancer Segmentation in Endorectal Ultrasound Videos: Dataset and Model Development [59.74920439478643]
In this paper, we collect and annotate the first benchmark dataset that covers diverse ERUS scenarios.
Our ERUS-10K dataset comprises 77 videos and 10,000 high-resolution annotated frames.
We introduce a benchmark model for colorectal cancer segmentation, named the Adaptive Sparse-context TRansformer (ASTR).
arXiv Detail & Related papers (2024-08-19T15:04:42Z) - Lumbar Spine Tumor Segmentation and Localization in T2 MRI Images Using AI [2.9746083684997418]
This study introduces a novel data augmentation technique, aimed at automating spine tumor segmentation and localization through AI approaches.
A Convolutional Neural Network (CNN) architecture is employed for tumor classification. 3D vertebral segmentation and labeling techniques are used to help pinpoint the exact location of the tumors in the lumbar spine.
Results indicate a remarkable performance, with 99% accuracy for tumor segmentation, 98% accuracy for tumor classification, and 99% accuracy for tumor localization achieved with the proposed approach.
arXiv Detail & Related papers (2024-05-07T05:55:50Z) - Mask-Enhanced Segment Anything Model for Tumor Lesion Semantic Segmentation [48.107348956719775]
We introduce Mask-Enhanced SAM (M-SAM), an innovative architecture tailored for 3D tumor lesion segmentation.
We propose a novel Mask-Enhanced Adapter (MEA) within M-SAM that enriches the semantic information of medical images with positional data from coarse segmentation masks.
Our M-SAM achieves high segmentation accuracy and also exhibits robust generalization.
arXiv Detail & Related papers (2024-03-09T13:37:02Z) - Progressive Dual Priori Network for Generalized Breast Tumor Segmentation [5.003997324423131]
We propose a progressive dual priori network (PDPNet) to segment breast tumors from DCE-MRI images.
The results show that the DSC and HD95 of PDPNet were improved at least by 5.13% and 7.58% respectively on multi-center test sets.
arXiv Detail & Related papers (2023-10-20T15:12:06Z) - Comparative Analysis of Segment Anything Model and U-Net for Breast Tumor Detection in Ultrasound and Mammography Images [0.15833270109954137]
The technique employs two advanced deep learning architectures, namely U-Net and pretrained SAM, for tumor segmentation.
The U-Net model is specifically designed for medical image segmentation.
The pretrained SAM architecture incorporates a mechanism to capture spatial dependencies and generate segmentation results.
arXiv Detail & Related papers (2023-06-21T18:49:21Z) - EMT-NET: Efficient multitask network for computer-aided diagnosis of breast cancer [58.720142291102135]
We propose an efficient and lightweight learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z) - Global Guidance Network for Breast Lesion Segmentation in Ultrasound Images [84.03487786163781]
We develop a deep convolutional neural network equipped with a global guidance block (GGB) and breast lesion boundary detection modules.
Our network outperforms other medical image segmentation methods and the recent semantic segmentation methods on breast ultrasound lesion segmentation.
arXiv Detail & Related papers (2021-04-05T13:15:22Z) - Sequential Learning on Liver Tumor Boundary Semantics and Prognostic Biomarker Mining [73.23533486979166]
Capsular invasion on the tumor boundary has proven to be clinically correlated with the prognostic indicator, microvascular invasion (MVI).
In this paper, we propose the first and novel computational framework that disentangles the task into two components.
arXiv Detail & Related papers (2021-03-09T01:43:05Z) - ESTAN: Enhanced Small Tumor-Aware Network for Breast Ultrasound Image Segmentation [0.0]
We propose a novel deep neural network architecture, namely the Enhanced Small Tumor-Aware Network (ESTAN), to accurately segment breast tumors.
ESTAN introduces two encoders to extract and fuse image context information at different scales and utilizes row-column-wise kernels in the encoder to adapt to breast anatomy.
arXiv Detail & Related papers (2020-09-27T16:42:59Z) - STAN: Small Tumor-Aware Network for Breast Ultrasound Image Segmentation [68.8204255655161]
We propose a novel deep learning architecture called the Small Tumor-Aware Network (STAN) to improve the performance of segmenting tumors of different sizes.
The proposed approach outperformed the state-of-the-art approaches in segmenting small breast tumors.
arXiv Detail & Related papers (2020-02-03T22:25:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.