Computer-aided Tumor Diagnosis in Automated Breast Ultrasound using 3D
Detection Network
- URL: http://arxiv.org/abs/2007.16133v1
- Date: Fri, 31 Jul 2020 15:25:07 GMT
- Title: Computer-aided Tumor Diagnosis in Automated Breast Ultrasound using 3D
Detection Network
- Authors: Junxiong Yu, Chaoyu Chen, Xin Yang, Yi Wang, Dan Yan, Jianxing Zhang,
Dong Ni
- Abstract summary: The efficacy of our network is verified on a collected dataset of 418 patients with 145 benign and 273 malignant tumors.
Experiments show our network attains a sensitivity of 97.66% with 1.23 false positives (FPs), and an area under the curve (AUC) value of 0.8720.
- Score: 18.31577982955252
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automated breast ultrasound (ABUS) is a new and promising imaging modality
for breast cancer detection and diagnosis, providing intuitive 3D information
and coronal-plane information of great diagnostic value. However, manually
screening and diagnosing tumors in ABUS images is very time-consuming, and
abnormalities may be overlooked. In this study, we propose a novel two-stage 3D
detection network for locating suspected lesion areas and further classifying
lesions as benign or malignant tumors. Specifically, we use a 3D detection
network rather than the frequently used segmentation networks to locate lesions
in ABUS images, so our network can make full use of the spatial context
information in ABUS images. A novel similarity loss is designed to effectively
distinguish lesions from background. A classification network then identifies
the located lesions as benign or malignant. An IoU-balanced classification
loss is adopted to strengthen the correlation between the classification and
localization tasks. The efficacy of our network is verified on a collected
dataset of 418 patients with 145 benign and 273 malignant tumors. Experiments
show our network attains a sensitivity of 97.66% with 1.23 false positives
(FPs), and an area under the curve (AUC) value of 0.8720.
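The IoU-balanced classification loss mentioned above up-weights the classification loss of positive detections whose boxes overlap ground truth well, tying classification confidence to localization quality. The paper's exact formulation is not reproduced here, so the sketch below follows the commonly published form (weights proportional to IoU raised to a power, renormalized so the total loss scale is preserved); the function name and the exponent `eta` are illustrative:

```python
import numpy as np

def iou_balanced_ce(probs, ious, eta=1.5, eps=1e-7):
    """Sketch of an IoU-balanced cross-entropy over positive detections.

    probs: predicted probability of the true class for each positive box.
    ious:  IoU of each predicted box with its matched ground-truth box.
    eta:   exponent controlling how strongly high-IoU boxes are up-weighted.
    """
    probs = np.asarray(probs, dtype=float)
    ious = np.asarray(ious, dtype=float)
    ce = -np.log(np.clip(probs, eps, 1.0))   # per-sample cross-entropy
    w = ious ** eta                          # reward well-localized boxes
    w = w * (ce.sum() / (w * ce).sum())      # renormalize: total scale matches plain CE
    return float((w * ce).sum())
```

The renormalization step means the loss magnitude (and hence the learning rate regime) is unchanged; only the distribution of gradient across samples shifts toward well-localized detections.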
Related papers
- Spatial-aware Attention Generative Adversarial Network for Semi-supervised Anomaly Detection in Medical Image [63.59114880750643]
We introduce a novel Spatial-aware Attention Generative Adversarial Network (SAGAN) for one-class semi-supervised generation of healthy images.
SAGAN generates high-quality health images corresponding to unlabeled data, guided by the reconstruction of normal images and restoration of pseudo-anomaly images.
Extensive experiments on three medical datasets demonstrate that the proposed SAGAN outperforms the state-of-the-art methods.
arXiv Detail & Related papers (2024-05-21T15:41:34Z) - Liver Tumor Screening and Diagnosis in CT with Pixel-Lesion-Patient
Network [37.931408083443074]
Pixel-Lesion-pAtient Network (PLAN) is proposed to jointly segment and classify each lesion with improved anchor queries and a foreground-enhanced sampling loss.
PLAN achieves 95% and 96% in patient-level sensitivity and specificity.
On contrast-enhanced CT, our lesion-level detection precision, recall, and classification accuracy are 92%, 89%, and 86%, outperforming widely used CNNs and transformers for lesion segmentation.
arXiv Detail & Related papers (2023-07-17T06:21:45Z) - Using Spatio-Temporal Dual-Stream Network with Self-Supervised Learning
for Lung Tumor Classification on Radial Probe Endobronchial Ultrasound Video [0.0]
During the biopsy process of lung cancer, physicians use real-time ultrasound images to find suitable lesion locations for sampling.
Previous studies have employed 2D convolutional neural networks to effectively differentiate between benign and malignant lung lesions.
This study designs an automatic diagnosis system based on a 3D neural network.
arXiv Detail & Related papers (2023-05-04T10:39:37Z) - Automated SSIM Regression for Detection and Quantification of Motion
Artefacts in Brain MR Images [54.739076152240024]
Motion artefacts in magnetic resonance brain images are a crucial issue.
The assessment of MR image quality is fundamental before proceeding with the clinical diagnosis.
An automated image quality assessment based on structural similarity index (SSIM) regression is proposed.
arXiv Detail & Related papers (2022-06-14T10:16:54Z) - EMT-NET: Efficient multitask network for computer-aided diagnosis of
breast cancer [58.720142291102135]
We propose an efficient and light-weighted learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z) - Explainable multiple abnormality classification of chest CT volumes with
AxialNet and HiResCAM [89.2175350956813]
We introduce the challenging new task of explainable multiple abnormality classification in volumetric medical images.
We propose a multiple instance learning convolutional neural network, AxialNet, that allows identification of top slices for each abnormality.
We then aim to improve the model's learning through a novel mask loss that leverages HiResCAM and 3D allowed regions.
arXiv Detail & Related papers (2021-11-24T01:14:33Z) - Universal Lesion Detection in CT Scans using Neural Network Ensembles [5.341593824515018]
A prerequisite for lesion sizing is their detection, as it promotes the downstream assessment of tumor spread.
We propose the use of state-of-the-art detection neural networks to flag suspicious lesions present in the NIH DeepLesion dataset for sizing.
We construct an ensemble of the best detection models to localize lesions for sizing with a precision of 65.17% and sensitivity of 91.67% at 4 FP per image.
arXiv Detail & Related papers (2021-11-09T00:11:01Z) - 3D RegNet: Deep Learning Model for COVID-19 Diagnosis on Chest CT Image [9.407002591446286]
A 3D-RegNet-based neural network is proposed for diagnosing the physical condition of patients with coronavirus (Covid-19) infection.
On the test set, the 3D model achieves an F1 score of 0.8379 and an AUC value of 0.8807.
arXiv Detail & Related papers (2021-07-08T18:10:07Z) - Wide & Deep neural network model for patch aggregation in CNN-based
prostate cancer detection systems [51.19354417900591]
Prostate cancer (PCa) is one of the leading causes of death among men, with almost 1.41 million new cases and around 375,000 deaths in 2020.
To perform an automatic diagnosis, prostate tissue samples are first digitized into gigapixel-resolution whole-slide images.
Small subimages called patches are extracted and predicted, obtaining a patch-level classification.
arXiv Detail & Related papers (2021-05-20T18:13:58Z) - Esophageal Tumor Segmentation in CT Images using Dilated Dense Attention
Unet (DDAUnet) [3.0929226049096217]
We present a fully automatic end-to-end esophageal tumor segmentation method based on convolutional neural networks (CNNs).
The proposed network, called Dilated Dense Attention Unet (DDAUnet), leverages spatial and channel attention in each dense block to selectively concentrate on determinant feature maps and regions.
arXiv Detail & Related papers (2020-12-06T11:42:52Z) - ElixirNet: Relation-aware Network Architecture Adaptation for Medical
Lesion Detection [90.13718478362337]
We introduce a novel ElixirNet that includes three components: 1) TruncatedRPN balances positive and negative data for false positive reduction; 2) Auto-lesion Block is automatically customized for medical images to incorporate relation-aware operations among region proposals; and 3) Relation transfer module incorporates the semantic relationship.
Experiments on DeepLesion and Kits19 prove the effectiveness of ElixirNet, achieving improvement of both sensitivity and precision over FPN with fewer parameters.
arXiv Detail & Related papers (2020-03-03T05:29:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.