Towards Non-invasive and Personalized Management of Breast Cancer Patients from Multiparametric MRI via A Large Mixture-of-Modality-Experts Model
- URL: http://arxiv.org/abs/2408.12606v2
- Date: Mon, 2 Sep 2024 00:52:01 GMT
- Title: Towards Non-invasive and Personalized Management of Breast Cancer Patients from Multiparametric MRI via A Large Mixture-of-Modality-Experts Model
- Authors: Luyang Luo, Mingxiang Wu, Mei Li, Yi Xin, Qiong Wang, Varut Vardhanabhuti, Winnie CW Chu, Zhenhui Li, Juan Zhou, Pranav Rajpurkar, Hao Chen
- Abstract summary: We report a mixture-of-modality-experts model (MOME) that integrates multiparametric MRI information within a unified structure.
MOME demonstrated accurate and robust identification of breast cancer.
It could reduce the need for biopsies in 7.3% of BI-RADS 4 patients, classify triple-negative breast cancer with an AUROC of 0.709, and predict pathological complete response to neoadjuvant chemotherapy with an AUROC of 0.694.
- Score: 19.252851972152957
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Breast magnetic resonance imaging (MRI) is the imaging technique with the highest sensitivity for detecting breast cancer and is routinely used for women at high risk. Despite the comprehensive multiparametric protocol of breast MRI, existing artificial intelligence-based studies predominantly rely on single sequences and have limited validation. Here we report a large mixture-of-modality-experts model (MOME) that integrates multiparametric MRI information within a unified structure, offering a noninvasive method for personalized breast cancer management. We have curated the largest multiparametric breast MRI dataset, involving 5,205 patients from three hospitals in the north, southeast, and southwest of China, for the development and extensive evaluation of our model. MOME demonstrated accurate and robust identification of breast cancer. It achieved comparable performance for malignancy recognition to that of four senior radiologists and significantly outperformed a junior radiologist, with 0.913 AUROC, 0.948 AUPRC, 0.905 F1 score, and 0.723 MCC. Our findings suggest that MOME could reduce the need for biopsies in 7.3% of BI-RADS 4 patients, classify triple-negative breast cancer with an AUROC of 0.709, and predict pathological complete response to neoadjuvant chemotherapy with an AUROC of 0.694. The model further supports scalable and interpretable inference, adapting to missing modalities and providing decision explanations by highlighting lesions and measuring modality contributions. MOME exemplifies a discriminative, robust, scalable, and interpretable multimodal model, paving the way for noninvasive, personalized management of breast cancer patients based on multiparametric breast imaging data.
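The abstract describes the architecture only at a high level: per-sequence experts fused under a gating scheme that also yields modality-contribution explanations and tolerates missing sequences. Below is a minimal sketch of such a mixture-of-modality-experts fusion head, assuming PyTorch and pre-extracted feature vectors from separate MRI-sequence encoders; all module names, dimensions, and the masking strategy are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a mixture-of-modality-experts fusion head (illustrative only,
# not the MOME paper's code). Assumes per-modality feature vectors have already
# been extracted by separate MRI-sequence encoders (e.g. DCE, T2, DWI).
import torch
import torch.nn as nn


class MixtureOfModalityExperts(nn.Module):
    def __init__(self, num_modalities: int = 3, feat_dim: int = 256, num_classes: int = 2):
        super().__init__()
        # One lightweight "expert" MLP per MRI sequence (modality).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.GELU(), nn.Linear(feat_dim, feat_dim))
            for _ in range(num_modalities)
        )
        # Gating network: scores each modality from the concatenated features.
        self.gate = nn.Linear(num_modalities * feat_dim, num_modalities)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, feats: torch.Tensor, present: torch.Tensor):
        # feats: (batch, num_modalities, feat_dim) per-modality encoder outputs
        # present: (batch, num_modalities) boolean mask, False where a sequence is missing
        expert_out = torch.stack(
            [expert(feats[:, i]) for i, expert in enumerate(self.experts)], dim=1
        )
        gate_logits = self.gate(feats.flatten(1))
        # Exclude missing modalities before the softmax so weights renormalize
        # over the sequences that are actually available.
        gate_logits = gate_logits.masked_fill(~present, float("-inf"))
        weights = gate_logits.softmax(dim=-1)             # per-modality contributions
        fused = (weights.unsqueeze(-1) * expert_out).sum(dim=1)
        return self.classifier(fused), weights            # logits + contribution weights


# Toy usage: 4 patients, 3 sequences, the last patient missing the third sequence.
model = MixtureOfModalityExperts()
feats = torch.randn(4, 3, 256)
present = torch.tensor([[True, True, True]] * 3 + [[True, True, False]])
logits, contributions = model(feats, present)
print(logits.shape, contributions[3])  # weight for the missing modality is zero
```

Masking the gate logits before the softmax is one simple way to realize the missing-modality behavior the abstract claims, and the returned gate weights can be read directly as the per-modality contribution scores used for explanation.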
Related papers
- Improving Breast Cancer Grade Prediction with Multiparametric MRI Created Using Optimized Synthetic Correlated Diffusion Imaging [71.91773485443125]
Grading plays a vital role in breast cancer treatment planning.
The current tumor grading method involves extracting tissue from patients, leading to stress, discomfort, and high medical costs.
This paper examines using optimized CDI$^s$ to improve breast cancer grade prediction.
arXiv Detail & Related papers (2024-05-13T15:48:26Z) - Using Multiparametric MRI with Optimized Synthetic Correlated Diffusion Imaging to Enhance Breast Cancer Pathologic Complete Response Prediction [71.91773485443125]
Neoadjuvant chemotherapy has recently gained popularity as a promising treatment strategy for breast cancer.
The current process to recommend neoadjuvant chemotherapy relies on the subjective evaluation of medical experts.
This research investigates the application of optimized CDI$^s$ to enhance breast cancer pathologic complete response prediction.
arXiv Detail & Related papers (2024-05-13T15:40:56Z) - Glioblastoma Tumor Segmentation using an Ensemble of Vision Transformers [0.0]
Glioblastoma is one of the most aggressive and deadliest types of brain cancer.
Brain Radiology Aided by Intelligent Neural NETworks (BRAINNET) generates robust tumor segmentation masks.
arXiv Detail & Related papers (2023-11-09T18:55:27Z) - Leveraging Transformers to Improve Breast Cancer Classification and Risk Assessment with Multi-modal and Longitudinal Data [3.982926115291704]
Multi-modal Transformer (MMT) is a neural network that utilizes mammography and ultrasound synergistically.
MMT tracks temporal tissue changes by comparing current exams to prior imaging.
For 5-year risk prediction, MMT attains an AUROC of 0.826, outperforming prior mammography-based risk models.
arXiv Detail & Related papers (2023-11-06T16:01:42Z) - Cancer-Net BCa-S: Breast Cancer Grade Prediction using Volumetric Deep Radiomic Features from Synthetic Correlated Diffusion Imaging [82.74877848011798]
The prevalence of breast cancer continues to grow, affecting about 300,000 females in the United States in 2023.
The gold-standard Scarff-Bloom-Richardson (SBR) grade has been shown to consistently indicate a patient's response to chemotherapy.
In this paper, we study the efficacy of deep learning for breast cancer grading based on synthetic correlated diffusion (CDI$^s$) imaging.
arXiv Detail & Related papers (2023-04-12T15:08:34Z) - A Multi-Institutional Open-Source Benchmark Dataset for Breast Cancer Clinical Decision Support using Synthetic Correlated Diffusion Imaging Data [82.74877848011798]
Cancer-Net BCa is a multi-institutional open-source benchmark dataset of volumetric CDI$^s$ imaging data of breast cancer patients.
Cancer-Net BCa is publicly available as a part of a global open-source initiative dedicated to accelerating advancement in machine learning to aid clinicians in the fight against cancer.
arXiv Detail & Related papers (2023-04-12T05:41:44Z) - RADIFUSION: A multi-radiomics deep learning based breast cancer risk prediction model using sequential mammographic images with image attention and bilateral asymmetry refinement [0.36355629235144304]
Our study highlights the importance of various deep learning mechanisms, such as image attention, radiomic features, a gating mechanism, and bilateral asymmetry-based fine-tuning.
Our findings suggest that RADIFUSION can provide clinicians with a powerful tool for breast cancer risk assessment.
arXiv Detail & Related papers (2023-04-01T08:18:13Z) - Enhancing Clinical Support for Breast Cancer with Deep Learning Models using Synthetic Correlated Diffusion Imaging [66.63200823918429]
We investigate enhancing clinical support for breast cancer with deep learning models.
We leverage a volumetric convolutional neural network to learn deep radiomic features from a pre-treatment cohort.
We find that the proposed approach can achieve better performance for both grade and post-treatment response prediction.
arXiv Detail & Related papers (2022-11-10T03:02:12Z) - EMT-NET: Efficient multitask network for computer-aided diagnosis of breast cancer [58.720142291102135]
We propose an efficient and light-weighted learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z) - Deep-LIBRA: Artificial intelligence method for robust quantification of breast density with independent validation in breast cancer risk assessment [2.0369879867185143]
Current federal legislation mandates reporting of breast density for all women undergoing breast screening.
We introduce an artificial intelligence (AI) method to estimate breast percentage density (PD) from digital mammograms.
arXiv Detail & Related papers (2020-11-13T15:21:17Z) - CorrSigNet: Learning CORRelated Prostate Cancer SIGnatures from Radiology and Pathology Images for Improved Computer Aided Diagnosis [1.63324350193061]
We propose CorrSigNet, an automated two-step model that localizes prostate cancer on MRI.
First, the model learns MRI signatures of cancer that are correlated with corresponding histopathology features.
Second, the model uses the learned correlated MRI features to train a Convolutional Neural Network to localize prostate cancer.
arXiv Detail & Related papers (2020-07-31T23:44:25Z)