Visualizing chest X-ray dataset biases using GANs
- URL: http://arxiv.org/abs/2305.00147v2
- Date: Tue, 5 Sep 2023 19:12:44 GMT
- Title: Visualizing chest X-ray dataset biases using GANs
- Authors: Hao Liang, Kevin Ni, Guha Balakrishnan
- Abstract summary: Recent work demonstrates that images from various chest X-ray datasets contain visual features that are strongly correlated with protected demographic attributes like race and gender.
This finding raises issues of fairness, since some of these factors may be used by downstream algorithms for clinical predictions.
In this work, we propose a framework, using generative adversarial networks (GANs), to visualize what features are most different between X-rays belonging to two demographic subgroups.
- Score: 8.61315908717562
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent work demonstrates that images from various chest X-ray datasets
contain visual features that are strongly correlated with protected demographic
attributes like race and gender. This finding raises issues of fairness, since
some of these factors may be used by downstream algorithms for clinical
predictions. In this work, we propose a framework, using generative adversarial
networks (GANs), to visualize what features are most different between X-rays
belonging to two demographic subgroups.
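The abstract does not detail the mechanics of the framework, so the snippet below is only a minimal, hypothetical sketch of one way a class-conditional GAN could surface subgroup differences: generate paired images from the same latent code under the two subgroup labels and inspect per-pixel differences. The toy generator, its architecture, and the difference-map readout are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch only: a class-conditional GAN used to contrast two
# demographic subgroups. The generator below is an untrained toy; in practice a
# GAN trained on the chest X-ray dataset would be plugged in.
import torch
import torch.nn as nn

class CondGenerator(nn.Module):
    """Toy conditional generator: latent z + subgroup label -> 64x64 image."""
    def __init__(self, z_dim=128, n_groups=2):
        super().__init__()
        self.embed = nn.Embedding(n_groups, z_dim)
        self.net = nn.Sequential(
            nn.Linear(2 * z_dim, 256), nn.ReLU(),
            nn.Linear(256, 64 * 64), nn.Tanh(),
        )

    def forward(self, z, group):
        h = torch.cat([z, self.embed(group)], dim=1)
        return self.net(h).view(-1, 1, 64, 64)

@torch.no_grad()
def subgroup_difference_maps(gen, n_samples=16, z_dim=128):
    """Generate the same latent codes under each subgroup label and return
    per-pixel absolute differences as crude 'what differs' heatmaps."""
    z = torch.randn(n_samples, z_dim)
    imgs_a = gen(z, torch.zeros(n_samples, dtype=torch.long))  # subgroup A
    imgs_b = gen(z, torch.ones(n_samples, dtype=torch.long))   # subgroup B
    return (imgs_b - imgs_a).abs()

gen = CondGenerator()                      # stand-in for a trained conditional GAN
heatmaps = subgroup_difference_maps(gen)
print(heatmaps.shape)                      # torch.Size([16, 1, 64, 64])
```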
Related papers
- Fairness and Robustness of CLIP-Based Models for Chest X-rays [9.082174810187931]
We extensively evaluate six widely used CLIP-based models on chest X-ray classification using three publicly available datasets.
We assess the models' fairness across six conditions and patient subgroups based on age, sex, and race.
Our results indicate performance gaps between patients of different ages, but more equitable results for the other attributes.
arXiv Detail & Related papers (2025-07-28T19:25:16Z)
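For orientation only, here is a minimal zero-shot scoring sketch using the standard OpenAI CLIP checkpoint, two illustrative prompts, and a synthetic stand-in image; the six CLIP-based models, prompt sets, datasets, and fairness metrics actually evaluated in that paper are not reproduced here.

```python
# Generic zero-shot CLIP scoring sketch (illustrative checkpoint and prompts; the
# paper's evaluated models and datasets may differ).
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

prompts = ["a chest x-ray with pneumonia", "a chest x-ray with no finding"]
image = Image.new("RGB", (224, 224), color=128)   # stand-in for a real chest X-ray

inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    probs = model(**inputs).logits_per_image.softmax(dim=-1)
print(dict(zip(prompts, probs[0].tolist())))      # per-prompt probabilities
```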
- Causal Representation Learning with Observational Grouping for CXR Classification [17.11125452239702]
Identifiable causal representation learning seeks to uncover the true causal relationships underlying a data generation process.
This work introduces the concept of grouping observations to learn identifiable representations for disease classification in chest X-rays via an end-to-end framework.
arXiv Detail & Related papers (2025-06-25T16:17:36Z)
- Enhancing Prohibited Item Detection through X-ray-Specific Augmentation and Contextual Feature Integration [81.11400642272976]
X-ray prohibited item detection faces challenges due to the long-tail distribution and unique characteristics of X-ray imaging.
Traditional data augmentation strategies, such as copy-paste and mixup, are ineffective at improving the detection of rare items.
We propose the X-ray Imaging-driven Detection Network (XIDNet) to address these challenges.
arXiv Detail & Related papers (2024-11-27T06:13:56Z)
- Vision-Language Generative Model for View-Specific Chest X-ray Generation [18.347723213970696]
ViewXGen is designed to overcome the limitation of existing methods, which generate only frontal-view chest X-rays.
Our approach takes into consideration the diverse view positions found in the dataset, enabling the generation of chest X-rays with specific views.
arXiv Detail & Related papers (2023-02-23T17:13:25Z)
- Learning disentangled representations for explainable chest X-ray classification using Dirichlet VAEs [68.73427163074015]
This study explores the use of the Dirichlet Variational Autoencoder (DirVAE) for learning disentangled latent representations of chest X-ray (CXR) images.
The predictive capacity of multi-modal latent representations learned by DirVAE models is investigated through implementation of an auxiliary multi-label classification task.
arXiv Detail & Related papers (2023-02-06T18:10:08Z)
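As a rough sketch of the Dirichlet-VAE idea, the code below assumes an encoder that outputs Dirichlet concentration parameters and a reparameterized sample on the simplex; the paper's actual DirVAE architecture, auxiliary multi-label classification head, and training details are not reproduced.

```python
# Hedged Dirichlet-VAE sketch (not the paper's exact DirVAE).
import torch
import torch.nn as nn
from torch.distributions import Dirichlet, kl_divergence

class DirVAESketch(nn.Module):
    def __init__(self, in_dim=64 * 64, latent_dim=16, prior_conc=1.0):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                 nn.Linear(256, latent_dim), nn.Softplus())
        self.dec = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim), nn.Sigmoid())
        self.register_buffer("prior_conc", torch.full((latent_dim,), prior_conc))

    def forward(self, x):
        conc = self.enc(x) + 1e-3                 # Dirichlet concentration params
        q = Dirichlet(conc)
        z = q.rsample()                           # reparameterized simplex sample
        recon = self.dec(z)
        kl = kl_divergence(q, Dirichlet(self.prior_conc.expand_as(conc))).mean()
        return recon, kl, z                       # z could feed a multi-label head

x = torch.rand(8, 64 * 64)                        # toy batch of flattened CXRs
model = DirVAESketch()
recon, kl, z = model(x)
loss = nn.functional.binary_cross_entropy(recon, x) + kl
```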
- Computer-aided Tuberculosis Diagnosis with Attribute Reasoning Assistance [58.01014026139231]
We propose a new large-scale tuberculosis (TB) chest X-ray dataset (TBX-Att).
We establish an attribute-assisted weakly-supervised framework to classify and localize TB by leveraging the attribute information.
The proposed model is evaluated on the TBX-Att dataset and will serve as a solid baseline for future research.
arXiv Detail & Related papers (2022-07-01T07:50:35Z)
- Contrastive Attention for Automatic Chest X-ray Report Generation [124.60087367316531]
In most cases, the normal regions dominate the entire chest X-ray image, and the corresponding descriptions of these normal regions dominate the final report.
We propose the Contrastive Attention (CA) model, which compares the current input image with normal images to distill the contrastive information.
We achieve state-of-the-art results on two public datasets.
arXiv Detail & Related papers (2021-06-13T11:20:31Z)
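A hedged sketch of the contrastive idea described above: attend over a pool of normal-study features and subtract the attended summary from the current image's features, keeping what the normal pool does not explain. The real CA model's aggregation and gating may differ.

```python
# Contrastive-attention-style feature sketch (illustrative, not the paper's CA).
import torch
import torch.nn as nn

class ContrastiveAttention(nn.Module):
    def __init__(self, dim=512):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)

    def forward(self, img_feat, normal_feats):
        # img_feat: (B, D) current study; normal_feats: (N, D) pool of normal studies
        attn = torch.softmax(self.q(img_feat) @ self.k(normal_feats).t()
                             / img_feat.size(-1) ** 0.5, dim=-1)   # (B, N)
        common = attn @ normal_feats          # normal information shared with input
        return img_feat - common              # contrastive (abnormal-leaning) signal

ca = ContrastiveAttention()
contrastive_feat = ca(torch.randn(4, 512), torch.randn(100, 512))  # toy features
```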
- Cross-Modal Contrastive Learning for Abnormality Classification and Localization in Chest X-rays with Radiomics using a Feedback Loop [63.81818077092879]
We propose an end-to-end semi-supervised cross-modal contrastive learning framework for medical images.
We first apply an image encoder to classify the chest X-rays and to generate the image features.
The radiomic features are then passed through another dedicated encoder to act as the positive sample for the image features generated from the same chest X-ray.
arXiv Detail & Related papers (2021-04-11T09:16:29Z)
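A minimal sketch of a cross-modal contrastive objective consistent with the description above, assuming an InfoNCE-style loss in which each study's radiomic embedding is the positive for its image embedding and other studies in the batch are negatives; the paper's feedback loop and semi-supervised components are omitted.

```python
# Cross-modal InfoNCE sketch (assumed loss form; encoders not shown).
import torch
import torch.nn.functional as F

def cross_modal_infonce(img_emb, rad_emb, temperature=0.1):
    """img_emb, rad_emb: (B, D) embeddings of the same B chest X-rays."""
    img = F.normalize(img_emb, dim=-1)
    rad = F.normalize(rad_emb, dim=-1)
    logits = img @ rad.t() / temperature          # (B, B) similarity matrix
    targets = torch.arange(img.size(0))           # matching pairs on the diagonal
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

loss = cross_modal_infonce(torch.randn(8, 256), torch.randn(8, 256))
```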
- Learning Invariant Feature Representation to Improve Generalization across Chest X-ray Datasets [55.06983249986729]
We show that a deep learning model that performs well when tested on data from its training dataset starts to perform poorly when tested on a dataset from a different source.
By employing an adversarial training strategy, we show that a network can be forced to learn a source-invariant representation.
arXiv Detail & Related papers (2020-08-04T07:41:15Z)
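One standard way to implement such adversarial training is a gradient-reversal layer feeding a dataset (source) discriminator, sketched below; the paper's exact adversarial strategy may differ.

```python
# Gradient-reversal sketch for a source-invariant representation (illustrative).
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None      # flip gradients flowing into the encoder

encoder = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 256), nn.ReLU())
disease_head = nn.Linear(256, 14)         # diagnostic labels
source_head = nn.Linear(256, 3)           # which dataset the image came from

x = torch.rand(8, 1, 64, 64)                          # toy CXR batch
y_disease = torch.randint(0, 2, (8, 14)).float()
y_source = torch.randint(0, 3, (8,))

feat = encoder(x)
loss = (nn.functional.binary_cross_entropy_with_logits(disease_head(feat), y_disease)
        + nn.functional.cross_entropy(source_head(GradReverse.apply(feat, 1.0)),
                                      y_source))      # encoder learns to fool source head
loss.backward()
```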
- Weakly-Supervised Segmentation for Disease Localization in Chest X-Ray Images [0.0]
We propose a novel approach to the semantic segmentation of medical chest X-ray images with only image-level class labels as supervision.
We show that this approach is applicable to chest X-rays for detecting an anomalous volume of air between the lung and the chest wall.
arXiv Detail & Related papers (2020-07-01T20:48:35Z)
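As an illustration of localization from image-level labels only, the sketch below uses a class-activation-map-style head with global average pooling; the paper's weakly-supervised segmentation method may use a different mechanism.

```python
# CAM-style weak localization sketch (illustrative, not the paper's method).
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeakLocalizer(nn.Module):
    def __init__(self, n_classes=1):
        super().__init__()
        self.backbone = nn.Sequential(            # tiny stand-in for a CNN backbone
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.classifier = nn.Conv2d(64, n_classes, 1)   # 1x1 conv -> class maps

    def forward(self, x):
        maps = self.classifier(self.backbone(x))        # (B, C, H/4, W/4) activation maps
        logits = maps.mean(dim=(2, 3))                  # global average pooling
        return logits, maps

model = WeakLocalizer()
x = torch.rand(2, 1, 128, 128)                          # toy CXR batch
logits, maps = model(x)                                 # train with BCE on image labels
heatmap = F.interpolate(maps, size=x.shape[-2:], mode="bilinear",
                        align_corners=False).sigmoid()  # pseudo-segmentation at test time
```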
- Deep Mining External Imperfect Data for Chest X-ray Disease Screening [57.40329813850719]
We argue that incorporating an external CXR dataset leads to imperfect training data, which raises challenges.
We formulate the multi-label disease classification problem as weighted independent binary tasks according to the categories.
Our framework simultaneously models and tackles the domain and label discrepancies, enabling superior knowledge mining ability.
arXiv Detail & Related papers (2020-06-06T06:48:40Z)
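A minimal sketch of "weighted independent binary tasks": one binary cross-entropy term per disease category, combined with illustrative per-category weights; the paper's actual weighting for domain and label discrepancies is not modeled here.

```python
# Weighted per-category BCE sketch (weights are illustrative).
import torch
import torch.nn.functional as F

def weighted_binary_tasks_loss(logits, targets, class_weights):
    """logits, targets: (B, C); class_weights: (C,) importance per disease."""
    per_class = F.binary_cross_entropy_with_logits(
        logits, targets, reduction="none").mean(dim=0)   # (C,) one task per category
    return (class_weights * per_class).sum() / class_weights.sum()

logits = torch.randn(8, 14)
targets = torch.randint(0, 2, (8, 14)).float()
weights = torch.ones(14)          # e.g., down-weight noisier external categories
loss = weighted_binary_tasks_loss(logits, targets, weights)
```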
- CheXclusion: Fairness gaps in deep chest X-ray classifiers [4.656202572362684]
We examine the extent to which state-of-the-art deep learning classifiers are biased with respect to protected attributes.
We train convolutional neural networks to predict 14 diagnostic labels in 3 prominent public chest X-ray datasets.
We find that TPR (true positive rate) disparities are not significantly correlated with a subgroup's proportional disease burden.
arXiv Detail & Related papers (2020-02-14T22:08:12Z)
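As a concrete reading of the TPR analysis above, the sketch below computes per-subgroup true positive rates and their gaps to the overall TPR for one diagnostic label; the subgroup labels, disparity definition, and synthetic data are illustrative.

```python
# Per-subgroup TPR disparity sketch (illustrative definition and data).
import numpy as np

def tpr(y_true, y_pred):
    pos = y_true == 1
    return np.nan if pos.sum() == 0 else (y_pred[pos] == 1).mean()

def tpr_disparities(y_true, y_pred, groups):
    """y_true, y_pred: (N,) binary arrays; groups: (N,) subgroup labels."""
    overall = tpr(y_true, y_pred)
    return {g: tpr(y_true[groups == g], y_pred[groups == g]) - overall
            for g in np.unique(groups)}

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 1000)            # synthetic labels for one disease
y_pred = rng.integers(0, 2, 1000)            # synthetic classifier decisions
groups = rng.choice(["F", "M"], 1000)        # synthetic subgroup membership
print(tpr_disparities(y_true, y_pred, groups))   # per-group TPR gap vs overall
```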
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.