Joint 2D-3D Breast Cancer Classification
- URL: http://arxiv.org/abs/2002.12392v1
- Date: Thu, 27 Feb 2020 19:08:16 GMT
- Title: Joint 2D-3D Breast Cancer Classification
- Authors: Gongbo Liang, Xiaoqin Wang, Yu Zhang, Xin Xing, Hunter Blanton, Tawfiq
Salem, Nathan Jacobs
- Abstract summary: Digital mammograms (DM or 2D mammogram) and digital breast tomosynthesis (DBT or 3D mammogram) are the two types of mammography imagery used in clinical practice for breast cancer detection and diagnosis.
We propose an innovative convolutional neural network (CNN) architecture for breast cancer classification that uses both 2D and 3D mammograms simultaneously.
- Score: 22.031221319016353
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Breast cancer is the malignant tumor that causes the highest number of cancer
deaths in females. Digital mammograms (DM or 2D mammogram) and digital breast
tomosynthesis (DBT or 3D mammogram) are the two types of mammography imagery
that are used in clinical practice for breast cancer detection and diagnosis.
Radiologists usually read both imaging modalities in combination; however,
existing computer-aided diagnosis tools are designed using only one imaging
modality. Inspired by clinical practice, we propose an innovative convolutional
neural network (CNN) architecture for breast cancer classification that uses
both 2D and 3D mammograms simultaneously. Our experiment shows that the
proposed method significantly improves the performance of breast cancer
classification. By assembling three CNN classifiers, the proposed model
achieves 0.97 AUC, which is 34.72% higher than the methods using only one
imaging modality.
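The abstract describes the joint architecture only at a high level. Below is a minimal PyTorch-style sketch of one plausible reading, assuming a 2D convolutional branch for DM images, a 3D branch for DBT volumes, late feature fusion, and probability averaging across several trained classifiers; the branch depths, channel counts, and fusion strategy are illustrative assumptions, not the authors' exact design.

```python
# Illustrative sketch only: a joint 2D-3D CNN classifier with a 2D branch for
# digital mammograms (DM) and a 3D branch for digital breast tomosynthesis
# (DBT). All layer sizes and the fusion strategy are placeholders.
import torch
import torch.nn as nn

class Joint2D3DClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # 2D branch: processes a single-channel DM image.
        self.branch_2d = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # 3D branch: processes a DBT volume (depth x height x width).
        self.branch_3d = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        # Late fusion: concatenate pooled features from both modalities.
        self.classifier = nn.Linear(32 + 32, num_classes)

    def forward(self, dm: torch.Tensor, dbt: torch.Tensor) -> torch.Tensor:
        f2d = self.branch_2d(dm).flatten(1)   # (B, 32)
        f3d = self.branch_3d(dbt).flatten(1)  # (B, 32)
        return self.classifier(torch.cat([f2d, f3d], dim=1))

# Averaging predicted probabilities over several trained classifiers,
# in the spirit of the three-classifier ensemble mentioned in the abstract.
def ensemble_probs(models, dm, dbt):
    with torch.no_grad():
        probs = [torch.softmax(m(dm, dbt), dim=1) for m in models]
    return torch.stack(probs).mean(dim=0)
```

Late fusion by feature concatenation is just one way to combine the two modalities; the reported 0.97 AUC comes from assembling three classifiers, which the averaging helper above mimics only in spirit.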
Related papers
- Computer Aided Detection and Classification of mammograms using Convolutional Neural Network [0.0]
Breast cancer is one of the leading causes of death among women, after lung cancer.
Deep learning and neural networks are among the methods that can be used to distinguish normal from abnormal breasts.
The CNNM dataset was used, containing nearly 460 images of normal breasts and 920 of abnormal breasts.
arXiv Detail & Related papers (2024-09-04T03:42:27Z) - Improving Breast Cancer Grade Prediction with Multiparametric MRI Created Using Optimized Synthetic Correlated Diffusion Imaging [71.91773485443125]
Grading plays a vital role in breast cancer treatment planning.
The current tumor grading method involves extracting tissue from patients, leading to stress, discomfort, and high medical costs.
This paper examines using optimized CDI$^s$ to improve breast cancer grade prediction.
arXiv Detail & Related papers (2024-05-13T15:48:26Z) - High-resolution synthesis of high-density breast mammograms: Application
to improved fairness in deep learning based mass detection [48.88813637974911]
Computer-aided detection systems based on deep learning have shown good performance in breast cancer detection.
High-density breasts show poorer detection performance since dense tissues can mask or even simulate masses.
This study aims to improve the mass detection performance in high-density breasts using synthetic high-density full-field digital mammograms.
arXiv Detail & Related papers (2022-09-20T15:57:12Z) - Moving from 2D to 3D: volumetric medical image classification for rectal
cancer staging [62.346649719614]
Preoperative discrimination between T2 and T3 stages is arguably both the most challenging and the most clinically significant task in rectal cancer treatment.
We present a volumetric convolutional neural network to accurately discriminate T2 from T3 stage rectal cancer with rectal MR volumes.
arXiv Detail & Related papers (2022-09-13T07:10:14Z) - EMT-NET: Efficient multitask network for computer-aided diagnosis of
breast cancer [58.720142291102135]
We propose an efficient, lightweight learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions; a minimal sketch of this shared-backbone design appears after this list.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z) - Breast Cancer Classification Using: Pixel Interpolation [0.0]
The proposed system is implemented and tested on several images taken from the Mammographic Image Analysis Society (MIAS) database.
The system runs quickly, so a radiologist can make a clear decision about the appearance of calcifications by visual inspection.
arXiv Detail & Related papers (2021-11-03T16:58:17Z) - Learned super resolution ultrasound for improved breast lesion
characterization [52.77024349608834]
Super resolution ultrasound localization microscopy enables imaging of the microvasculature at the capillary level.
In this work we use a deep neural network architecture that makes effective use of signal structure to address these challenges.
By leveraging our trained network, the microvasculature structure is recovered in a short time, without prior PSF knowledge, and without requiring separability of the UCAs.
arXiv Detail & Related papers (2021-07-12T09:04:20Z) - DenseNet for Breast Tumor Classification in Mammographic Images [0.0]
The aim of this study is to build a deep convolutional neural network method for automatic detection, segmentation, and classification of breast lesions in mammography images.
Based on deep learning, the Mask-CNN (RoIAlign) method was developed for feature selection and extraction, and the classification was carried out with a DenseNet architecture.
arXiv Detail & Related papers (2021-01-24T03:30:59Z) - Synthesizing lesions using contextual GANs improves breast cancer
classification on mammograms [0.4297070083645048]
We present a novel generative adversarial network (GAN) model for data augmentation that can realistically synthesize and remove lesions on mammograms.
With self-attention and semi-supervised learning components, the U-net-based architecture can generate high resolution (256x256px) outputs.
arXiv Detail & Related papers (2020-05-29T21:23:00Z) - Stan: Small tumor-aware network for breast ultrasound image segmentation [68.8204255655161]
We propose a novel deep learning architecture called Small Tumor-Aware Network (STAN) to improve the performance of segmenting tumors of different sizes.
The proposed approach outperformed the state-of-the-art approaches in segmenting small breast tumors.
arXiv Detail & Related papers (2020-02-03T22:25:01Z)
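As noted in the EMT-NET entry above, a shared backbone with separate classification and segmentation heads is one common way to realize such a multitask design. The sketch below is illustrative only: the layer sizes, decoder, and loss weighting are assumptions, not the paper's architecture.

```python
# Illustrative sketch only: a shared-backbone multitask network that outputs
# both a tumor class and a segmentation mask. All sizes are placeholders.
import torch
import torch.nn as nn

class MultiTaskTumorNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Shared encoder learns representations used by both tasks.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Classification head: pooled features -> tumor class logits.
        self.cls_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes)
        )
        # Segmentation head: upsample back to input resolution, 1-channel mask.
        self.seg_head = nn.Sequential(
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(16, 1, 1),
        )

    def forward(self, x: torch.Tensor):
        feats = self.encoder(x)
        return self.cls_head(feats), self.seg_head(feats)

# Joint training combines the two task losses; the weight is a placeholder.
def multitask_loss(cls_logits, cls_target, seg_logits, seg_target, w_seg=0.5):
    cls_loss = nn.functional.cross_entropy(cls_logits, cls_target)
    seg_loss = nn.functional.binary_cross_entropy_with_logits(seg_logits, seg_target)
    return cls_loss + w_seg * seg_loss
```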
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.