A2DMN: Anatomy-Aware Dilated Multiscale Network for Breast Ultrasound Semantic Segmentation
- URL: http://arxiv.org/abs/2403.15560v1
- Date: Fri, 22 Mar 2024 18:31:24 GMT
- Title: A2DMN: Anatomy-Aware Dilated Multiscale Network for Breast Ultrasound Semantic Segmentation
- Authors: Kyle Lucke, Aleksandar Vakanski, Min Xian
- Abstract summary: We propose a novel breast anatomy-aware network for capturing fine image details and a new smoothness term that encodes breast anatomy.
It incorporates context information across multiple spatial scales to generate more accurate semantic boundaries.
The results demonstrate that the proposed method significantly improves the segmentation of the muscle, mammary, and tumor classes.
- Score: 44.99833362998488
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, convolutional neural networks for semantic segmentation of breast ultrasound (BUS) images have shown great success; however, two major challenges remain: 1) most current approaches inherently lack the ability to utilize tissue anatomy, resulting in misclassified image regions; and 2) they struggle to produce accurate boundaries due to repeated down-sampling operations. To address these issues, we propose a novel breast anatomy-aware network for capturing fine image details and a new smoothness term that encodes breast anatomy. The network incorporates context information across multiple spatial scales to generate more accurate semantic boundaries. Extensive experiments compare the proposed method with eight state-of-the-art approaches on a BUS dataset of 325 images. The results demonstrate that the proposed method significantly improves the segmentation of the muscle, mammary, and tumor classes and produces more accurate fine details of tissue boundaries.
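The abstract names two mechanisms, multiscale context aggregation and an anatomy-encoding smoothness term, but gives no implementation details. Below is a minimal PyTorch sketch of what such components commonly look like: an ASPP-style block of parallel dilated convolutions plus a generic pairwise smoothness penalty. The module name `DilatedMultiscaleBlock`, the dilation rates, and the exact form of the penalty are illustrative assumptions, not taken from A2DMN.

```python
import torch
import torch.nn as nn

class DilatedMultiscaleBlock(nn.Module):
    """ASPP-style parallel dilated convolutions: aggregates context at
    several spatial scales without further down-sampling. Names and
    dilation rates are illustrative, not taken from the A2DMN paper."""
    def __init__(self, in_ch, out_ch, rates=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for r in rates
        )
        # A 1x1 convolution fuses the concatenated multiscale features.
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, 1)

    def forward(self, x):
        # Each branch sees the same input with a different receptive field.
        return self.fuse(torch.cat([b(x) for b in self.branches], dim=1))

def smoothness_term(probs):
    """Generic pairwise smoothness penalty on softmax probabilities of
    shape (N, C, H, W): discourages abrupt label changes between
    neighboring pixels. A2DMN's term additionally encodes breast anatomy
    (e.g., the vertical ordering of tissue layers); that detail is not
    reproduced here."""
    dv = (probs[:, :, 1:, :] - probs[:, :, :-1, :]).abs().mean()
    dh = (probs[:, :, :, 1:] - probs[:, :, :, :-1]).abs().mean()
    return dv + dh

# Example: 32-channel features from a 256x256 BUS image patch.
block = DilatedMultiscaleBlock(32, 32)
feats = block(torch.randn(1, 32, 256, 256))            # (1, 32, 256, 256)
loss_smooth = smoothness_term(torch.softmax(feats, 1))
```

In practice such a penalty would be added to the segmentation loss with a weighting coefficient; how A2DMN encodes anatomical ordering inside its term is detailed in the paper itself.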
Related papers
- Abdominal Multi-Organ Segmentation Based on Feature Pyramid Network and Spatial Recurrent Neural Network [2.8391355909797644]
We propose a new image segmentation model combining a Feature Pyramid Network (FPN) and a Spatial Recurrent Neural Network (SRNN).
We discuss why we use FPN to extract anatomical structures of different scales and how SRNN is implemented to extract spatial context features in abdominal ultrasound images (see the sketch after this entry).
arXiv Detail & Related papers (2023-08-29T09:13:24Z)
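A compact sketch helps fix the FPN+SRNN idea: lateral 1x1 convolutions plus a top-down pathway fuse features across scales, and a recurrent layer then propagates context spatially. The layer sizes, the row-wise GRU, and the bidirectional recurrence below are assumptions for illustration; the paper's actual configuration may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyFPN(nn.Module):
    """Minimal FPN top-down pathway over three feature scales, followed
    by a row-wise GRU as a stand-in for the spatial recurrent module.
    This is a generic sketch of the FPN+SRNN combination, not the
    paper's exact architecture."""
    def __init__(self, chs=(64, 128, 256), out_ch=64):
        super().__init__()
        self.lateral = nn.ModuleList(nn.Conv2d(c, out_ch, 1) for c in chs)
        self.gru = nn.GRU(out_ch, out_ch, batch_first=True, bidirectional=True)
        self.proj = nn.Conv2d(2 * out_ch, out_ch, 1)

    def forward(self, c3, c4, c5):
        # Top-down: upsample the coarser map and add the lateral feature.
        p5 = self.lateral[2](c5)
        p4 = self.lateral[1](c4) + F.interpolate(p5, scale_factor=2)
        p3 = self.lateral[0](c3) + F.interpolate(p4, scale_factor=2)
        # Spatial recurrence: treat each row of p3 as a sequence so the
        # GRU propagates context horizontally across the image.
        n, c, h, w = p3.shape
        rows = p3.permute(0, 2, 3, 1).reshape(n * h, w, c)
        out, _ = self.gru(rows)
        out = out.reshape(n, h, w, -1).permute(0, 3, 1, 2)
        return self.proj(out)

fpn = TinyFPN()
p = fpn(torch.randn(1, 64, 64, 64),
        torch.randn(1, 128, 32, 32),
        torch.randn(1, 256, 16, 16))  # -> (1, 64, 64, 64)
```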
- Scale-aware Super-resolution Network with Dual Affinity Learning for Lesion Segmentation from Medical Images [50.76668288066681]
We present a scale-aware super-resolution network to adaptively segment lesions of various sizes from low-resolution medical images.
Our proposed network achieved consistent improvements over other state-of-the-art methods.
arXiv Detail & Related papers (2023-05-30T14:25:55Z)
- BAGNet: Bidirectional Aware Guidance Network for Malignant Breast lesions Segmentation [5.823080777200961]
The bidirectional aware guidance network (BAGNet) is proposed to segment malignant lesions in breast ultrasound images.
BAGNet captures the context between global (low-level) and local (high-level) features from the input coarse saliency map.
The introduction of the global feature map can reduce the interference of surrounding tissue (background) on the lesion regions.
arXiv Detail & Related papers (2022-04-28T08:28:06Z)
- Harmonizing Pathological and Normal Pixels for Pseudo-healthy Synthesis [68.5287824124996]
We present a new type of discriminator, the segmentor, to accurately locate the lesions and improve the visual quality of pseudo-healthy images.
We apply the generated images to medical image enhancement and use the enhanced results to cope with the low-contrast problem.
Comprehensive experiments on the T2 modality of BraTS demonstrate that the proposed method substantially outperforms the state-of-the-art methods.
arXiv Detail & Related papers (2022-03-29T08:41:17Z)
- Voice-assisted Image Labelling for Endoscopic Ultrasound Classification using Neural Networks [48.732863591145964]
We propose a multi-modal convolutional neural network architecture that labels endoscopic ultrasound (EUS) images from raw verbal comments provided by a clinician during the procedure.
Our results show a prediction accuracy of 76% at the image level on a dataset with 5 different labels.
arXiv Detail & Related papers (2021-10-12T21:22:24Z)
- Global Guidance Network for Breast Lesion Segmentation in Ultrasound Images [84.03487786163781]
We develop a deep convolutional neural network equipped with a global guidance block (GGB) and breast lesion boundary detection modules.
Our network outperforms other medical image segmentation methods and recent semantic segmentation methods on breast ultrasound lesion segmentation.
arXiv Detail & Related papers (2021-04-05T13:15:22Z)
- ESTAN: Enhanced Small Tumor-Aware Network for Breast Ultrasound Image Segmentation [0.0]
We propose a novel deep neural network architecture, the Enhanced Small Tumor-Aware Network (ESTAN), to accurately segment breast tumors.
ESTAN introduces two encoders to extract and fuse image context information at different scales and utilizes row-column-wise kernels in the encoder to adapt to breast anatomy (see the sketch after this entry).
arXiv Detail & Related papers (2020-09-27T16:42:59Z)
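The row-column-wise kernels in ESTAN are a concrete architectural detail: because breast tissue is organized in roughly horizontal layers, elongated 1xk and kx1 kernels can follow that anatomy better than square ones. Below is a minimal sketch of such a kernel pair, where the kernel size k and the additive fusion are illustrative choices rather than ESTAN's exact design.

```python
import torch
import torch.nn as nn

class RowColumnConv(nn.Module):
    """Parallel row-wise (1 x k) and column-wise (k x 1) convolutions,
    the kind of elongated kernels ESTAN uses to follow the horizontal
    layering of breast tissue. Kernel size and the additive fusion are
    illustrative, not the paper's exact design."""
    def __init__(self, in_ch, out_ch, k=9):
        super().__init__()
        self.row = nn.Conv2d(in_ch, out_ch, (1, k), padding=(0, k // 2))
        self.col = nn.Conv2d(in_ch, out_ch, (k, 1), padding=(k // 2, 0))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Sum the horizontally and vertically elongated responses.
        return self.act(self.row(x) + self.col(x))

layer = RowColumnConv(32, 32)
y = layer(torch.randn(1, 32, 128, 128))  # -> (1, 32, 128, 128)
```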
- Weakly-Supervised Segmentation for Disease Localization in Chest X-Ray Images [0.0]
We propose a novel approach to the semantic segmentation of medical chest X-ray images with only image-level class labels as supervision.
We show that this approach is applicable to chest X-rays for detecting an anomalous volume of air between the lung and the chest wall.
arXiv Detail & Related papers (2020-07-01T20:48:35Z)
- STAN: Small Tumor-Aware Network for Breast Ultrasound Image Segmentation [68.8204255655161]
We propose a novel deep learning architecture called the Small Tumor-Aware Network (STAN) to improve the performance of segmenting tumors of different sizes.
The proposed approach outperformed the state-of-the-art approaches in segmenting small breast tumors.
arXiv Detail & Related papers (2020-02-03T22:25:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information above and is not responsible for any consequences arising from its use.