Convolutional neural network based on transfer learning for breast
cancer screening
- URL: http://arxiv.org/abs/2112.11629v1
- Date: Wed, 22 Dec 2021 02:27:12 GMT
- Title: Convolutional neural network based on transfer learning for breast
cancer screening
- Authors: Hussin Ragb, Redha Ali, Elforjani Jera, and Nagi Buaossa
- Abstract summary: In this paper, a deep convolutional neural network-based algorithm is proposed to aid in accurately identifying breast cancer from ultrasonic images.
Several experiments were conducted on the breast ultrasound dataset consisting of 537 benign, 360 malignant, and 133 normal images.
Using k-fold cross-validation and a bagging ensemble, we achieved an accuracy of 99.5% and a sensitivity of 99.6%.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Breast cancer is the most common cancer in the world and the most prevalent
cause of death among women worldwide. Nevertheless, it is also one of the most
treatable malignancies if detected early. In this paper, a deep convolutional
neural network-based algorithm is proposed to aid in accurately identifying
breast cancer from ultrasound images. In this algorithm, several neural
networks are fused in a parallel architecture to perform the classification,
and a voting criterion is applied to the final classification decision between
the candidate object classes, where the output of each neural network
represents a single vote. Several experiments were conducted on the breast
ultrasound dataset consisting of 537 benign, 360 malignant, and 133 normal
images. These experiments show promising results and the capability of the
proposed model to outperform many state-of-the-art algorithms on several
measures. Using k-fold cross-validation and a bagging classifier ensemble, we
achieved an accuracy of 99.5% and a sensitivity of 99.6%.
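To make the voting scheme concrete, here is a minimal sketch (not the authors' code) of how several ImageNet-pretrained CNNs can be run in parallel and combined by majority voting, with each network's prediction counted as one vote. It assumes PyTorch/torchvision; the backbone choices (ResNet-50, DenseNet-121, VGG-16), the three-class label set, and the input size are illustrative assumptions, and the k-fold cross-validation and bagging steps described in the abstract are omitted.

```python
# Minimal sketch of a parallel transfer-learning ensemble with majority voting.
# Backbones, class names, and input size are illustrative assumptions, not the
# authors' configuration. Requires torchvision >= 0.13 for the Weights enums.
import torch
import torch.nn as nn
from torchvision import models

CLASSES = ["benign", "malignant", "normal"]  # assumed 3-class setup

def build_backbone(name: str, num_classes: int = len(CLASSES)) -> nn.Module:
    """Load an ImageNet-pretrained backbone and replace its classifier head."""
    if name == "resnet50":
        net = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        net.fc = nn.Linear(net.fc.in_features, num_classes)
    elif name == "densenet121":
        net = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
        net.classifier = nn.Linear(net.classifier.in_features, num_classes)
    elif name == "vgg16":
        net = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
        net.classifier[6] = nn.Linear(net.classifier[6].in_features, num_classes)
    else:
        raise ValueError(f"unknown backbone: {name}")
    return net

class VotingEnsemble(nn.Module):
    """Runs each member network in parallel; each prediction counts as one vote."""
    def __init__(self, members):
        super().__init__()
        self.members = nn.ModuleList(members)

    @torch.no_grad()
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Collect each member's predicted class index: shape (num_members, batch).
        votes = torch.stack([m(x).argmax(dim=1) for m in self.members])
        # Majority vote across members for every image in the batch.
        return votes.mode(dim=0).values

ensemble = VotingEnsemble(
    [build_backbone(n) for n in ("resnet50", "densenet121", "vgg16")]
)
ensemble.eval()
dummy = torch.randn(4, 3, 224, 224)  # batch of 4 RGB ultrasound crops (illustrative)
print([CLASSES[i] for i in ensemble(dummy).tolist()])
```

In practice, each member network would be fine-tuned on the ultrasound training folds before inference, and soft voting (averaging softmax probabilities) is a common alternative to the hard majority vote shown here.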
Related papers
- Brain Tumor Classification on MRI in Light of Molecular Markers [61.77272414423481]
Co-deletion of chromosome arms 1p/19q is associated with clinical outcomes in low-grade gliomas.
This study aims to utilize an MRI-based convolutional neural network specially designed for brain cancer detection.
arXiv Detail & Related papers (2024-09-29T07:04:26Z)
- Convolutional neural network classification of cancer cytopathology images: taking breast cancer as an example [40.3927727959038]
This paper introduces an approach utilizing convolutional neural networks (CNNs) for the rapid categorization of pathological images.
It enables the rapid and automatic classification of pathological images into benign and malignant groups.
It demonstrates that the proposed method effectively enhances the accuracy in classifying pathological images of breast cancer.
arXiv Detail & Related papers (2024-04-12T07:08:05Z)
- Multi-task Explainable Skin Lesion Classification [54.76511683427566]
We propose a few-shot-based approach for skin lesions that generalizes well with little labelled data.
The proposed approach comprises a fusion of a segmentation network, which acts as an attention module, and a classification network.
arXiv Detail & Related papers (2023-10-11T05:49:47Z)
- EMT-NET: Efficient multitask network for computer-aided diagnosis of breast cancer [58.720142291102135]
We propose an efficient and lightweight learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z)
- Wide & Deep neural network model for patch aggregation in CNN-based prostate cancer detection systems [51.19354417900591]
Prostate cancer (PCa) is one of the leading causes of death among men, with almost 1.41 million new cases and around 375,000 deaths in 2020.
To perform an automatic diagnosis, prostate tissue samples are first digitized into gigapixel-resolution whole-slide images.
Small subimages called patches are extracted and predicted, obtaining a patch-level classification.
arXiv Detail & Related papers (2021-05-20T18:13:58Z)
- DenseNet for Breast Tumor Classification in Mammographic Images [0.0]
The aim of this study is to build a deep convolutional neural network method for automatic detection, segmentation, and classification of breast lesions in mammography images.
Based on deep learning, the Mask-CNN (RoIAlign) method was developed for feature selection and extraction, and the classification was carried out by a DenseNet architecture.
arXiv Detail & Related papers (2021-01-24T03:30:59Z)
- Metastatic Cancer Image Classification Based On Deep Learning Method [7.832709940526033]
We propose a novel method that combines a deep learning image classification algorithm, the DenseNet169 framework, and the Rectified Adam optimization algorithm.
Our model achieves superior performance over other classical convolutional neural network approaches, such as VGG19, ResNet34, and ResNet50.
arXiv Detail & Related papers (2020-11-13T16:04:39Z)
- The efficiency of deep learning algorithms for detecting anatomical reference points on radiological images of the head profile [55.41644538483948]
A U-Net neural network detects anatomical reference points more accurately than a fully convolutional neural network.
The reference points detected by the U-Net are closer to the average detections of a group of orthodontists.
arXiv Detail & Related papers (2020-05-25T13:51:03Z)
- Y-Net for Chest X-Ray Preprocessing: Simultaneous Classification of Geometry and Segmentation of Annotations [70.0118756144807]
This work introduces a general pre-processing step for chest x-ray input into machine learning algorithms.
A modified Y-Net architecture based on the VGG11 encoder is used to simultaneously learn geometric orientation and segmentation of radiographs.
Results were evaluated by expert clinicians, with acceptable geometry in 95.8% of cases and acceptable annotation masks in 96.2%, compared to 27.0% and 34.9%, respectively, in control images.
arXiv Detail & Related papers (2020-05-08T02:16:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.