A novel multi-view deep learning approach for BI-RADS and density
assessment of mammograms
- URL: http://arxiv.org/abs/2112.04490v1
- Date: Wed, 8 Dec 2021 10:59:17 GMT
- Title: A novel multi-view deep learning approach for BI-RADS and density
assessment of mammograms
- Authors: Huyen T. X. Nguyen, Sam B. Tran, Dung B. Nguyen, Hieu H. Pham, Ha Q.
Nguyen
- Abstract summary: We propose a novel multi-view DL approach for BI-RADS and density assessment of mammograms.
The proposed approach first deploys deep convolutional networks for feature extraction on each view separately.
The extracted features are then stacked and fed into a Light Gradient Boosting Machine to predict BI-RADS and density scores.
- Score: 0.5039813366558306
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Advanced deep learning (DL) algorithms may predict the patient's risk of
developing breast cancer based on the Breast Imaging Reporting and Data System
(BI-RADS) and density standards. Recent studies have suggested that combining
information from multiple views improves overall breast exam classification.
In this paper, we propose a novel multi-view DL approach for
BI-RADS and density assessment of mammograms. The proposed approach first
deploys deep convolutional networks for feature extraction on each view
separately. The extracted features are then stacked and fed into a Light
Gradient Boosting Machine (LightGBM) classifier to predict BI-RADS and density
scores. We conduct extensive experiments on both an internal mammography
dataset and the public Digital Database for Screening Mammography (DDSM)
dataset. The experimental results demonstrate that the proposed approach
outperforms the single-view classification approach on both benchmark datasets
by large margins (5% on the internal dataset and 10% on the DDSM dataset).
These results highlight the vital role of combining multi-view information in
improving the performance of breast cancer risk prediction.
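As a rough illustration of the two-stage pipeline described above, the following minimal Python sketch extracts a feature vector from each mammographic view with a CNN backbone, stacks the per-view vectors, and feeds them to a LightGBM classifier. The backbone (ResNet-34), the four view names, the feature dimension, and the LightGBM hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch (not the authors' exact implementation): per-view CNN
# feature extraction followed by a LightGBM classifier on the stacked
# features. Backbone, view names, and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torchvision.models as models
from lightgbm import LGBMClassifier

VIEWS = ["L-CC", "L-MLO", "R-CC", "R-MLO"]  # assumed standard screening views

# ImageNet-pretrained backbone with the classification head removed,
# so each view yields a 512-dimensional feature vector.
backbone = models.resnet34(weights="IMAGENET1K_V1")
feature_extractor = nn.Sequential(*list(backbone.children())[:-1]).eval()

@torch.no_grad()
def extract_exam_features(views):
    """Concatenate per-view CNN features into one vector per exam.

    `views` maps each view name to a (3, H, W) image tensor.
    Returns a 1-D tensor of length 4 * 512.
    """
    feats = []
    for name in VIEWS:
        x = views[name].unsqueeze(0)         # (1, 3, H, W)
        f = feature_extractor(x).flatten(1)  # (1, 512) after global pooling
        feats.append(f)
    return torch.cat(feats, dim=1).squeeze(0)

def train_birads_classifier(X, y):
    """Fit gradient-boosted trees on stacked view features.

    X: (n_exams, 4 * 512) array of stacked features; y: BI-RADS labels.
    """
    clf = LGBMClassifier(n_estimators=500, learning_rate=0.05)
    clf.fit(X, y)
    return clf
```

Keeping feature extraction and the gradient-boosted classifier as separate stages mirrors the abstract's description: the CNN handles per-view representation, while LightGBM models interactions across the stacked view features.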
Related papers
- Segmentation Strategies in Deep Learning for Prostate Cancer Diagnosis: A Comparative Study of Mamba, SAM, and YOLO [0.6116681488656472]
This study presents a comparative analysis of three deep learning-based methods, Mamba, SAM, and YOLO, for segmenting prostate cancer histopathology images.
We evaluated the performance of these models on two comprehensive datasets, Gleason 2019 and SICAPv2, using Dice score, precision, and recall metrics.
The H-vmunet model's advanced architecture, which integrates high-order visual state spaces and 2D-selective-scan operations, enables efficient and sensitive lesion detection.
arXiv Detail & Related papers (2024-09-24T16:04:29Z) - Intelligent Breast Cancer Diagnosis with Heuristic-assisted
Trans-Res-U-Net and Multiscale DenseNet using Mammogram Images [0.0]
Breast cancer (BC) significantly contributes to cancer-related mortality in women, yet accurately distinguishing malignant mass lesions remains challenging.
We propose a novel deep learning approach for BC screening utilizing mammography images.
arXiv Detail & Related papers (2023-10-30T10:22:14Z) - Domain Generalization for Mammographic Image Analysis with Contrastive
Learning [62.25104935889111]
Training an efficacious deep learning model requires a large amount of data with diverse styles and qualities.
A novel contrastive learning scheme is developed to equip deep learning models with better style generalization capability.
The proposed method has been evaluated extensively and rigorously with mammograms from various vendor style domains and several public datasets.
arXiv Detail & Related papers (2023-04-20T11:40:21Z) - Ambiguous Medical Image Segmentation using Diffusion Models [60.378180265885945]
We introduce a single diffusion model-based approach that produces multiple plausible outputs by learning a distribution over group insights.
Our proposed model generates a distribution of segmentation masks by leveraging the inherent sampling process of diffusion.
Comprehensive results show that our proposed approach outperforms existing state-of-the-art ambiguous segmentation networks.
arXiv Detail & Related papers (2023-04-10T17:58:22Z) - Best of Both Worlds: Multimodal Contrastive Learning with Tabular and
Imaging Data [7.49320945341034]
We propose the first self-supervised contrastive learning framework that combines tabular and imaging data to train unimodal encoders.
Our solution combines SimCLR and SCARF, two leading contrastive learning strategies.
We show the generalizability of our approach to natural images using the DVM car advertisement dataset.
arXiv Detail & Related papers (2023-03-24T15:44:42Z) - High-resolution synthesis of high-density breast mammograms: Application
to improved fairness in deep learning based mass detection [48.88813637974911]
Computer-aided detection systems based on deep learning have shown good performance in breast cancer detection.
High-density breasts show poorer detection performance since dense tissues can mask or even simulate masses.
This study aims to improve the mass detection performance in high-density breasts using synthetic high-density full-field digital mammograms.
arXiv Detail & Related papers (2022-09-20T15:57:12Z) - Superficial White Matter Analysis: An Efficient Point-cloud-based Deep
Learning Framework with Supervised Contrastive Learning for Consistent
Tractography Parcellation across Populations and dMRI Acquisitions [68.41088365582831]
White matter parcellation classifies tractography streamlines into clusters or anatomically meaningful tracts.
Most parcellation methods focus on the deep white matter (DWM), whereas fewer methods address the superficial white matter (SWM) due to its complexity.
We propose a novel two-stage deep-learning-based framework, Superficial White Matter Analysis (SupWMA), that performs an efficient parcellation of 198 SWM clusters from whole-brain tractography.
arXiv Detail & Related papers (2022-07-18T23:07:53Z) - A Novel Transparency Strategy-based Data Augmentation Approach for
BI-RADS Classification of Mammograms [0.33598755777055367]
We propose a novel transparency strategy to boost the performance of Breast Imaging Reporting and Data System (BI-RADS) mammogram classifiers.
Our experiments show that the proposed approach significantly improves the mammogram classification performance and surpasses a state-of-the-art data augmentation technique called CutMix.
This study also highlights that our transparency method is more effective than other augmentation strategies for BI-RADS classification and can be widely applied to other computer vision tasks.
arXiv Detail & Related papers (2022-03-20T17:51:38Z) - Act Like a Radiologist: Towards Reliable Multi-view Correspondence
Reasoning for Mammogram Mass Detection [49.14070210387509]
We propose an Anatomy-aware Graph convolutional Network (AGN) for mammogram mass detection.
AGN is tailored for mammogram mass detection and endows existing detection methods with multi-view reasoning ability.
Experiments on two standard benchmarks reveal that AGN significantly exceeds the state-of-the-art performance.
arXiv Detail & Related papers (2021-05-21T06:48:34Z) - Deep Semi-supervised Metric Learning with Dual Alignment for Cervical
Cancer Cell Detection [49.78612417406883]
We propose a novel semi-supervised deep metric learning method for cervical cancer cell detection.
Our model learns an embedding metric space and conducts dual alignment of semantic features on both the proposal and prototype levels.
We construct a large-scale dataset for semi-supervised cervical cancer cell detection for the first time, consisting of 240,860 cervical cell images.
arXiv Detail & Related papers (2021-04-07T17:11:27Z) - A New Computer-Aided Diagnosis System with Modified Genetic Feature
Selection for BI-RADS Classification of Breast Masses in Mammograms [5.395050211492798]
The language used to describe abnormalities in mammographic reports is based on the Breast Imaging Reporting and Data System (BI-RADS).
This paper proposes a new and effective computer-aided diagnosis (CAD) system to classify mammographic masses into four assessment categories in BI-RADS.
arXiv Detail & Related papers (2020-05-11T13:06:25Z)