ECL: Class-Enhancement Contrastive Learning for Long-tailed Skin Lesion
Classification
- URL: http://arxiv.org/abs/2307.04136v1
- Date: Sun, 9 Jul 2023 09:29:15 GMT
- Authors: Yilan Zhang, Jianqi Chen, Ke Wang, Fengying Xie
- Abstract summary: Skin image datasets often suffer from imbalanced data distribution, exacerbating the difficulty of computer-aided skin disease diagnosis.
We propose Class-Enhancement Contrastive Learning (ECL), which enriches the information of minority classes and treats different classes equally.
- Score: 7.7379419801373475
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Skin image datasets often suffer from imbalanced data distribution,
exacerbating the difficulty of computer-aided skin disease diagnosis. Some
recent works exploit supervised contrastive learning (SCL) for this long-tailed
challenge. Despite achieving significant performance, these SCL-based methods
focus more on head classes, yet ignoring the utilization of information in tail
classes. In this paper, we propose class-Enhancement Contrastive Learning
(ECL), which enriches the information of minority classes and treats different
classes equally. For information enhancement, we design a hybrid-proxy model to
generate class-dependent proxies and propose a cycle update strategy for
parameter optimization. A balanced-hybrid-proxy loss is designed to exploit
relations between samples and proxies with different classes treated equally.
Taking both "imbalanced data" and "imbalanced diagnosis difficulty" into
account, we further present a balanced-weighted cross-entropy loss that
follows a curriculum learning schedule. Experimental results on the
classification of imbalanced skin lesion data demonstrate the superiority and
effectiveness of our method.
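As a rough illustration of the balanced-proxy idea described above, the following is a minimal NumPy sketch of a proxy-based contrastive loss with inverse-frequency class weights. The function name, temperature value, and exact weighting scheme are assumptions for illustration, not the paper's actual formulation:

```python
import numpy as np

def balanced_proxy_loss(features, labels, proxies, class_counts, temperature=0.1):
    """Hypothetical sketch: each class has a learnable proxy vector; samples
    are pulled toward their own class proxy, with inverse-frequency weights
    so tail classes contribute as much as head classes."""
    # cosine similarity between L2-normalised features and class proxies
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    p = proxies / np.linalg.norm(proxies, axis=1, keepdims=True)
    logits = f @ p.T / temperature                       # shape (N, C)
    # softmax cross-entropy against each sample's own class proxy
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    per_sample = -log_probs[np.arange(len(labels)), labels]
    # inverse-frequency weights rebalance head vs. tail classes
    weights = 1.0 / class_counts[labels]
    weights = weights / weights.sum()
    return float((weights * per_sample).sum())
```

In a real implementation the proxies would be trainable parameters updated jointly with the encoder; here they are fixed inputs to keep the sketch self-contained.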
Related papers
- What Makes CLIP More Robust to Long-Tailed Pre-Training Data? A Controlled Study for Transferable Insights [67.72413262980272]
Severe data imbalance naturally exists among web-scale vision-language datasets.
We find CLIP pre-trained thereupon exhibits notable robustness to the data imbalance compared to supervised learning.
The robustness and discriminability of CLIP improve with more descriptive language supervision, larger data scale, and broader open-world concepts.
arXiv Detail & Related papers (2024-05-31T17:57:24Z)
- Fairness Evolution in Continual Learning for Medical Imaging [47.52603262576663]
We study the behavior of Continual Learning (CL) strategies in medical imaging regarding classification performance.
We evaluate the Replay, Learning without Forgetting (LwF), and Pseudo-Label strategies.
LwF and Pseudo-Label exhibit optimal classification performance, but when including fairness metrics in the evaluation, it is clear that Pseudo-Label is less biased.
arXiv Detail & Related papers (2024-04-10T09:48:52Z)
- Iterative Online Image Synthesis via Diffusion Model for Imbalanced Classification [29.730360798234294]
We introduce an Iterative Online Image Synthesis framework to address the class imbalance problem in medical image classification.
Our framework incorporates two key modules, namely Online Image Synthesis (OIS) and Accuracy Adaptive Sampling (AAS).
To evaluate the effectiveness of our proposed method in addressing imbalanced classification, we conduct experiments on the HAM10000 and APTOS datasets.
arXiv Detail & Related papers (2024-03-13T10:51:18Z)
- Uncertainty-guided Boundary Learning for Imbalanced Social Event Detection [64.4350027428928]
We propose a novel uncertainty-guided class imbalance learning framework for imbalanced social event detection tasks.
Our model significantly improves social event representation and classification in almost all classes, especially the uncertain ones.
arXiv Detail & Related papers (2023-10-30T03:32:04Z)
- An Asymmetric Contrastive Loss for Handling Imbalanced Datasets [0.0]
We introduce an asymmetric version of CL, referred to as ACL, to address the problem of class imbalance.
In addition, we propose the asymmetric focal contrastive loss (AFCL) as a further generalization of both ACL and focal contrastive loss.
Results on the FMNIST and ISIC 2018 imbalanced datasets show that AFCL is capable of outperforming CL and FCL in terms of both weighted and unweighted classification accuracies.
arXiv Detail & Related papers (2022-07-14T17:30:13Z)
- Deep Reinforcement Learning for Multi-class Imbalanced Training [64.9100301614621]
We introduce an imbalanced classification framework, based on reinforcement learning, for training extremely imbalanced data sets.
We formulate a custom reward function and episode-training procedure, specifically with the added capability of handling multi-class imbalanced training.
Using real-world clinical case studies, we demonstrate that our proposed framework outperforms current state-of-the-art imbalanced learning methods.
arXiv Detail & Related papers (2022-05-24T13:39:59Z)
- SuperCon: Supervised Contrastive Learning for Imbalanced Skin Lesion Classification [9.265557367859637]
SuperCon is a two-stage training strategy to overcome the class imbalance problem on skin lesion classification.
Our two-stage training strategy effectively addresses the class imbalance classification problem, and significantly improves existing works in terms of F1-score and AUC score.
arXiv Detail & Related papers (2022-02-11T15:19:36Z)
- Semi-supervised learning for medical image classification using imbalanced training data [11.87832944550453]
We propose Adaptive Blended Consistency Loss (ABCL) as a drop-in replacement for consistency loss in perturbation-based SSL methods.
ABCL counteracts data skew by adaptively mixing the target class distribution of the consistency loss in accordance with class frequency.
Our experiments with ABCL reveal improvements to unweighted average recall on two different imbalanced medical image classification datasets.
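The adaptive blending described above can be sketched in a few lines. This is a hedged, pure-Python illustration of the general idea of mixing a teacher's prediction with a class-frequency-derived prior; the function name, the `alpha` parameter, and the inverse-frequency prior are assumptions for illustration and not ABCL's exact formulation:

```python
def blended_consistency_target(teacher_probs, class_counts, alpha=0.5):
    """Sketch: blend the teacher's predicted distribution with an
    inverse-frequency prior, so rare classes are boosted in the target
    and the consistency loss does not just reinforce the majority class."""
    # inverse-frequency prior, normalised to a probability distribution
    prior = [1.0 / c for c in class_counts]
    s = sum(prior)
    prior = [p / s for p in prior]
    # convex combination of teacher prediction and prior
    blended = [alpha * t + (1 - alpha) * p
               for t, p in zip(teacher_probs, prior)]
    z = sum(blended)
    return [b / z for b in blended]  # renormalise defensively
```

With a heavily skewed count vector, the blended target shifts probability mass from head classes toward tail classes relative to the raw teacher output.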
arXiv Detail & Related papers (2021-08-20T01:06:42Z)
- Bootstrapping Your Own Positive Sample: Contrastive Learning With Electronic Health Record Data [62.29031007761901]
This paper proposes a novel contrastive regularized clinical classification model.
We introduce two unique positive sampling strategies specifically tailored for EHR data.
Our framework yields highly competitive experimental results in predicting the mortality risk on real-world COVID-19 EHR data.
arXiv Detail & Related papers (2021-04-07T06:02:04Z)
- Alleviating the Incompatibility between Cross Entropy Loss and Episode Training for Few-shot Skin Disease Classification [76.89093364969253]
We propose to apply Few-Shot Learning to skin disease identification to address the extreme scarcity of training sample problem.
Based on a detailed analysis, we propose the Query-Relative (QR) loss, which proves superior to Cross Entropy (CE) under episode training.
We further strengthen the proposed QR loss with a novel adaptive hard margin strategy.
arXiv Detail & Related papers (2020-04-21T00:57:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.