Interpretation of Chest x-rays affected by bullets using deep transfer
learning
- URL: http://arxiv.org/abs/2203.13461v1
- Date: Fri, 25 Mar 2022 05:53:45 GMT
- Title: Interpretation of Chest x-rays affected by bullets using deep transfer
learning
- Authors: Shaheer Khan, Azib Farooq, Israr Khan, Muhammad Gulraiz Khan, Abdul
Razzaq
- Abstract summary: Deep learning in radiology provides the opportunity to classify, detect and segment different diseases automatically.
In the proposed study, we worked on a non-trivial aspect of medical imaging where we classified and localized the X-Rays affected by bullets.
This is the first study on the detection and classification of radiographs affected by bullets using deep learning.
- Score: 0.8189696720657246
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The potential of deep learning, especially in medical imaging, has produced
astonishing results, and the methodologies improve with every passing day.
Deep learning in radiology provides the opportunity to classify, detect, and
segment different diseases automatically. In the proposed study, we worked on a
non-trivial aspect of medical imaging: classifying and localizing X-rays
affected by bullets. We tested images on different classification and
localization models to obtain considerable accuracy. The dataset used in the
study was replicated across different chest X-ray images. The proposed model
worked not only on chest radiographs but also on X-rays of other body parts
such as the leg, abdomen, and head, even though the training dataset was based
on chest radiographs. Custom models were used for classification and
localization after parameter tuning. Finally, the results of our findings are
presented using different frameworks, which may help guide further research in
this field. To the best of our knowledge, this is the first study on the detection
and classification of radiographs affected by bullets using deep learning.
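As a rough illustration of the transfer-learning setup described in the abstract, the sketch below fine-tunes an ImageNet-pretrained ResNet-18 as a two-class "bullet" vs. "no bullet" radiograph classifier. The backbone, folder layout, class names, and hyperparameters are illustrative assumptions rather than the authors' actual models or data; localization could analogously be approached by fine-tuning a pretrained detector (e.g. torchvision's Faster R-CNN) on bullet bounding boxes.

```python
# Minimal transfer-learning sketch: fine-tune a pretrained ResNet-18 to
# classify radiographs as "bullet" vs. "no bullet". Dataset paths, class
# names, and hyperparameters are illustrative assumptions, not the
# authors' actual configuration.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Grayscale radiographs are replicated to 3 channels to match the
# ImageNet-pretrained backbone's expected input.
preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: xray_data/train/{bullet,no_bullet}/*.png
train_set = datasets.ImageFolder("xray_data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

# Load ImageNet weights and replace the final layer for 2 classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```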
Related papers
- LeDNet: Localization-enabled Deep Neural Network for Multi-Label Radiography Image Classification [0.1227734309612871]
Multi-label radiography image classification has long been a topic of interest in neural network research.
We will use the chest x-ray images to detect thoracic diseases for this purpose.
We propose a combination of localization and deep learning algorithms called LeDNet to predict thoracic diseases with higher accuracy.
arXiv Detail & Related papers (2024-07-04T13:46:30Z)
- Act Like a Radiologist: Radiology Report Generation across Anatomical Regions [50.13206214694885]
X-RGen is a radiologist-minded report generation framework across six anatomical regions.
In X-RGen, we seek to mimic the behaviour of human radiologists, breaking it down into four principal phases.
We enhance the recognition capacity of the image encoder by analysing images and reports across various regions.
arXiv Detail & Related papers (2023-05-26T07:12:35Z)
- Improving Chest X-Ray Classification by RNN-based Patient Monitoring [0.34998703934432673]
We analyze how information about diagnosis can improve CNN-based image classification models.
We show that a model trained on additional patient history information outperforms a model trained without the information by a significant margin.
arXiv Detail & Related papers (2022-10-28T11:47:15Z)
- Long-Tailed Classification of Thorax Diseases on Chest X-Ray: A New Benchmark Study [75.05049024176584]
We present a benchmark study of the long-tailed learning problem in the specific domain of thorax diseases on chest X-rays.
We focus on learning from naturally distributed chest X-ray data, optimizing classification accuracy over not only the common "head" classes, but also the rare yet critical "tail" classes.
The benchmark consists of two chest X-ray datasets for 19- and 20-way thorax disease classification, containing classes with as many as 53,000 and as few as 7 labeled training images.
arXiv Detail & Related papers (2022-08-29T04:34:15Z)
- A Deep Learning Technique using a Sequence of Follow Up X-Rays for Disease classification [3.3345134768053635]
The ability to predict lung- and heart-related diseases using deep learning techniques is a central goal for many researchers.
We hypothesize that models given the follow-up history of a patient's three most recent chest X-ray images would perform better at disease classification.
arXiv Detail & Related papers (2022-03-28T19:58:47Z)
- Generative Residual Attention Network for Disease Detection [51.60842580044539]
We present a novel approach for disease generation in X-rays using conditional generative adversarial learning.
We generate a corresponding radiology image in a target domain while preserving the identity of the patient.
We then use the generated X-ray image in the target domain to augment our training to improve the detection performance.
arXiv Detail & Related papers (2021-10-25T14:15:57Z)
- Cross-Modal Contrastive Learning for Abnormality Classification and Localization in Chest X-rays with Radiomics using a Feedback Loop [63.81818077092879]
We propose an end-to-end semi-supervised cross-modal contrastive learning framework for medical images.
We first apply an image encoder to classify the chest X-rays and to generate the image features.
The radiomic features are then passed through another dedicated encoder to act as the positive sample for the image features generated from the same chest X-ray (a minimal sketch of this pairing appears after this list).
arXiv Detail & Related papers (2021-04-11T09:16:29Z)
- Variational Knowledge Distillation for Disease Classification in Chest X-Rays [102.04931207504173]
We propose variational knowledge distillation (VKD), a new probabilistic inference framework for disease classification based on X-rays.
We demonstrate the effectiveness of our method on three public benchmark datasets with paired X-ray images and EHRs.
arXiv Detail & Related papers (2021-03-19T14:13:56Z)
- Joint Modeling of Chest Radiographs and Radiology Reports for Pulmonary Edema Assessment [39.60171837961607]
We develop a neural network model that is trained on both images and free-text to assess pulmonary edema severity from chest radiographs at inference time.
Our experimental results suggest that the joint image-text representation learning improves the performance of pulmonary edema assessment.
arXiv Detail & Related papers (2020-08-22T17:28:39Z)
- Learning Invariant Feature Representation to Improve Generalization across Chest X-ray Datasets [55.06983249986729]
We show that a deep learning model that performs well when tested on the same dataset it was trained on starts to perform poorly when tested on a dataset from a different source.
By employing an adversarial training strategy, we show that a network can be forced to learn a source-invariant representation.
arXiv Detail & Related papers (2020-08-04T07:41:15Z)
- Evaluation of Contemporary Convolutional Neural Network Architectures for Detecting COVID-19 from Chest Radiographs [0.0]
We train and evaluate three model architectures, proposed for chest radiograph analysis, under varying conditions.
We find issues that discount the impressive model performances reported by contemporary studies on this subject.
arXiv Detail & Related papers (2020-06-30T15:22:39Z)
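To make the cross-modal contrastive entry above more concrete, here is a minimal sketch in which features of an X-ray image and radiomic features from the same X-ray form a positive pair under an InfoNCE-style loss. The encoder sizes, projection heads, feature dimensions, and temperature are assumptions for illustration, not the paper's exact design.

```python
# Minimal sketch of the cross-modal contrastive idea: image features and
# radiomic features from the same chest X-ray are treated as a positive
# pair, all other pairs in the batch as negatives (InfoNCE-style loss).
# Dimensions and architectures are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossModalContrastive(nn.Module):
    def __init__(self, image_dim=512, radiomics_dim=107, embed_dim=128, temperature=0.07):
        super().__init__()
        self.image_proj = nn.Linear(image_dim, embed_dim)   # projects image-encoder features
        self.radiomics_proj = nn.Sequential(                # dedicated radiomics encoder
            nn.Linear(radiomics_dim, 256), nn.ReLU(), nn.Linear(256, embed_dim)
        )
        self.temperature = temperature

    def forward(self, image_feats, radiomic_feats):
        # Embed both modalities and L2-normalize.
        z_img = F.normalize(self.image_proj(image_feats), dim=1)
        z_rad = F.normalize(self.radiomics_proj(radiomic_feats), dim=1)
        # Pairwise similarities; diagonal entries are the positive pairs.
        logits = z_img @ z_rad.t() / self.temperature
        targets = torch.arange(logits.size(0), device=logits.device)
        # Symmetric InfoNCE: image-to-radiomics and radiomics-to-image.
        return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

# Usage with a dummy batch: 8 image feature vectors and 8 radiomic vectors.
loss = CrossModalContrastive()(torch.randn(8, 512), torch.randn(8, 107))
```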
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.