Reliable Tuberculosis Detection using Chest X-ray with Deep Learning,
Segmentation and Visualization
- URL: http://arxiv.org/abs/2007.14895v1
- Date: Wed, 29 Jul 2020 15:11:34 GMT
- Title: Reliable Tuberculosis Detection using Chest X-ray with Deep Learning,
Segmentation and Visualization
- Authors: Tawsifur Rahman, Amith Khandakar, Muhammad Abdul Kadir, Khandaker R.
Islam, Khandaker F. Islam, Rashid Mazhar, Tahir Hamid, Mohammad T. Islam,
Zaid B. Mahbub, Mohamed Arselene Ayari, Muhammad E. H. Chowdhury
- Abstract summary: Tuberculosis is a chronic lung disease that occurs due to bacterial infection and is one of the top 10 leading causes of death.
We have detected TB reliably from chest X-ray images using image pre-processing, data augmentation, image segmentation, and deep-learning classification techniques.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tuberculosis (TB) is a chronic lung disease that occurs due to bacterial
infection and is one of the top 10 leading causes of death. Accurate and early
detection of TB is very important; otherwise, it could be life-threatening. In
this work, we have detected TB reliably from chest X-ray images using image
pre-processing, data augmentation, image segmentation, and deep-learning
classification techniques. Several public databases were used to create a
database of 700 TB-infected and 3500 normal chest X-ray images for this study.
Nine different deep CNNs (ResNet18, ResNet50, ResNet101, CheXNet, InceptionV3,
VGG19, DenseNet201, SqueezeNet, and MobileNet) were used for transfer learning
from their pre-trained initial weights and were trained, validated, and tested
for classifying TB and non-TB (normal) cases. Three different experiments were
carried out in this work: segmentation of X-ray images using two different
U-Net models, classification using whole X-ray images, and classification using
segmented lung images. The accuracy, precision, sensitivity, F1-score, and
specificity in the detection of tuberculosis using whole X-ray images were
97.07%, 97.34%, 97.07%, 97.14%, and 97.36%, respectively. However,
classification using segmented lungs outperformed whole X-ray image-based
classification: the accuracy, precision, sensitivity, F1-score, and specificity
were 99.9%, 99.91%, 99.9%, 99.9%, and 99.52%, respectively. The paper also used
a visualization technique to confirm that the CNN learns dominantly from the
segmented lung regions, which results in higher detection accuracy. The
proposed method, with its state-of-the-art performance, can be useful for
faster computer-aided diagnosis of tuberculosis.
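
The pipeline described above (ImageNet-pre-trained CNNs fine-tuned to separate TB from normal chest X-rays, evaluated with accuracy, precision, sensitivity, F1-score, and specificity) can be sketched as follows. This is a minimal illustration assuming PyTorch and torchvision, not the authors' released code; the directory layout, the ResNet18 backbone shown here, the hyperparameters, and the epoch count are illustrative assumptions.

```python
# Minimal transfer-learning sketch for binary TB vs. normal chest X-ray
# classification. Paths, hyperparameters, and augmentation are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Basic resizing, augmentation, and ImageNet normalization; the paper's exact
# pre-processing and augmentation settings may differ.
train_tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomRotation(10),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
val_tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Assumed layout: data/{train,val}/{normal,tb}/*.png (normal -> 0, tb -> 1).
train_ds = datasets.ImageFolder("data/train", transform=train_tf)
val_ds = datasets.ImageFolder("data/val", transform=val_tf)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
val_dl = DataLoader(val_ds, batch_size=32)

# Transfer learning: start from ImageNet weights and replace the final
# fully connected layer with a 2-class (TB vs. normal) head.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(5):  # illustrative epoch count
    model.train()
    for x, y in train_dl:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        criterion(model(x), y).backward()
        optimizer.step()

# Evaluation: accuracy, precision, sensitivity, specificity, and F1-score
# derived from the binary confusion matrix, as reported in the abstract.
model.eval()
tp = tn = fp = fn = 0
with torch.no_grad():
    for x, y in val_dl:
        pred = model(x.to(device)).argmax(dim=1).cpu()
        tp += ((pred == 1) & (y == 1)).sum().item()
        tn += ((pred == 0) & (y == 0)).sum().item()
        fp += ((pred == 1) & (y == 0)).sum().item()
        fn += ((pred == 0) & (y == 1)).sum().item()

accuracy = (tp + tn) / max(tp + tn + fp + fn, 1)
precision = tp / max(tp + fp, 1)
sensitivity = tp / max(tp + fn, 1)   # recall / true positive rate
specificity = tn / max(tn + fp, 1)
f1 = 2 * precision * sensitivity / max(precision + sensitivity, 1e-8)
print(f"acc={accuracy:.4f} prec={precision:.4f} sens={sensitivity:.4f} "
      f"spec={specificity:.4f} f1={f1:.4f}")
```

The same loop applies unchanged whether the image folders contain whole radiographs or U-Net-segmented lung crops, which is the comparison the abstract reports.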
Related papers
- Few-Shot Learning Approach on Tuberculosis Classification Based on Chest X-Ray Images [0.0]
Class imbalance in TB chest X-ray datasets presents a challenge for accurate classification.
We propose a few-shot learning approach using the Prototypical Network algorithm to address this issue (a minimal sketch of the prototype-based classification step is given after this list).
Experimental results demonstrate classification accuracies of 98.93% for ResNet-18, 98.60% for ResNet-50, and 33.33% for VGG16.
arXiv Detail & Related papers (2024-09-18T02:15:01Z) - Attention-based Saliency Maps Improve Interpretability of Pneumothorax
Classification [52.77024349608834]
The study investigates chest radiograph (CXR) classification performance of vision transformers (ViTs) and the interpretability of attention-based saliency maps.
ViTs were fine-tuned for lung disease classification using four public data sets: CheXpert, Chest X-Ray 14, MIMIC CXR, and VinBigData.
ViTs had comparable CXR classification AUCs compared with state-of-the-art CNNs.
arXiv Detail & Related papers (2023-03-03T12:05:41Z) - EMT-NET: Efficient multitask network for computer-aided diagnosis of
breast cancer [58.720142291102135]
We propose an efficient and lightweight learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z) - Vision Transformers for femur fracture classification [59.99241204074268]
The Vision Transformer (ViT) was able to correctly predict 83% of the test images.
Good results were also obtained on sub-fractures, using the largest and richest dataset of its kind to date.
arXiv Detail & Related papers (2021-08-07T10:12:42Z) - Deep Learning Methods for Screening Pulmonary Tuberculosis Using Chest
X-rays [0.974672460306765]
The presented deep learning pipeline consists of three different state-of-the-art deep learning architectures to generate, segment, and classify lung X-rays.
We were able to achieve a classification accuracy of 97.1% (Youden's index of 0.941, sensitivity of 97.9%, and specificity of 96.2%), which is a considerable improvement over existing work in the literature.
arXiv Detail & Related papers (2020-12-25T14:21:35Z) - Sensitivity and Specificity Evaluation of Deep Learning Models for
Detection of Pneumoperitoneum on Chest Radiographs [0.8437813529429724]
State-of-the-art deep learning models (ResNet101, InceptionV3, DenseNet161, and ResNeXt101) were trained on a subset of this dataset.
The DenseNet161 model was able to accurately classify radiographs from different imaging systems.
arXiv Detail & Related papers (2020-10-17T21:41:53Z) - Improving performance of CNN to predict likelihood of COVID-19 using
chest X-ray images with preprocessing algorithms [0.3180570080674292]
The study demonstrates the feasibility of developing a computer-aided diagnosis scheme for chest X-ray images.
A dataset of 8,474 chest X-ray images is used to train and test the CNN-based CAD scheme.
The testing results achieve an overall accuracy of 94.0% in classifying three classes and 98.6% accuracy in detecting COVID-19 infected cases.
arXiv Detail & Related papers (2020-06-11T16:45:46Z) - Predicting COVID-19 Pneumonia Severity on Chest X-ray with Deep Learning [57.00601760750389]
We present a severity score prediction model for COVID-19 pneumonia based on frontal chest X-ray images.
Such a tool can gauge the severity of COVID-19 lung infections, which can be used for escalation or de-escalation of care.
arXiv Detail & Related papers (2020-05-24T23:13:16Z) - Y-Net for Chest X-Ray Preprocessing: Simultaneous Classification of
Geometry and Segmentation of Annotations [70.0118756144807]
This work introduces a general pre-processing step for chest x-ray input into machine learning algorithms.
A modified Y-Net architecture based on the VGG11 encoder is used to simultaneously learn geometric orientation and segmentation of radiographs.
Results were evaluated by expert clinicians, with acceptable geometry in 95.8% and annotation mask in 96.2%, compared to 27.0% and 34.9% respectively in control images.
arXiv Detail & Related papers (2020-05-08T02:16:17Z) - JCS: An Explainable COVID-19 Diagnosis System by Joint Classification
and Segmentation [95.57532063232198]
Coronavirus disease 2019 (COVID-19) has caused a pandemic affecting over 200 countries.
To control the infection, identifying and separating the infected people is the most crucial step.
This paper develops a novel Joint Classification and Segmentation (JCS) system to perform real-time and explainable COVID-19 chest CT diagnosis.
arXiv Detail & Related papers (2020-04-15T12:30:40Z) - Classification of COVID-19 in chest X-ray images using DeTraC deep
convolutional neural network [6.381149074212898]
Chest X-ray is the first imaging technique that plays an important role in the diagnosis of COVID-19 disease.
Due to the high availability of large-scale annotated image datasets, great success has been achieved using convolutional neural networks (CNNs) for image recognition and classification.
Transfer learning is an effective mechanism that can provide a promising solution by transferring knowledge from generic object recognition tasks to domain-specific tasks.
arXiv Detail & Related papers (2020-03-26T15:18:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.