CoDiM: Learning with Noisy Labels via Contrastive Semi-Supervised
Learning
- URL: http://arxiv.org/abs/2111.11652v1
- Date: Tue, 23 Nov 2021 04:56:40 GMT
- Title: CoDiM: Learning with Noisy Labels via Contrastive Semi-Supervised
Learning
- Authors: Xin Zhang, Zixuan Liu, Kaiwen Xiao, Tian Shen, Junzhou Huang, Wei
Yang, Dimitris Samaras, Xiao Han
- Abstract summary: Noisy label learning, semi-supervised learning, and contrastive learning are three different strategies for designing learning processes requiring less annotation cost.
We propose CSSL, a unified Contrastive Semi-Supervised Learning algorithm, and CoDiM, a novel algorithm for learning with noisy labels.
- Score: 58.107679606345165
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Labels are costly and sometimes unreliable. Noisy label learning,
semi-supervised learning, and contrastive learning are three different
strategies for designing learning processes requiring less annotation cost.
Semi-supervised learning and contrastive learning have been recently
demonstrated to improve learning strategies that address datasets with noisy
labels. Still, the inner connections between these fields, as well as the
potential to combine their strengths, have only begun to emerge. In this
paper, we explore further ways to fuse them and the advantages of doing so.
Specifically,
we propose CSSL, a unified Contrastive Semi-Supervised Learning algorithm, and
CoDiM (Contrastive DivideMix), a novel algorithm for learning with noisy
labels. CSSL leverages the power of classical semi-supervised learning and
contrastive learning technologies and is further adapted to CoDiM, which learns
robustly from multiple types and levels of label noise. We show that CoDiM
brings consistent improvements and achieves state-of-the-art results on
multiple benchmarks.
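To make the combination concrete, below is a minimal sketch of how a contrastive term can be added to a semi-supervised classification objective. It is an illustration only, assuming a SimCLR-style NT-Xent contrastive loss and a FixMatch-style pseudo-label term; the function names, weighting, and threshold are assumptions, not the paper's exact CSSL/CoDiM formulation.

```python
# Illustrative sketch only: a semi-supervised classification loss combined with
# an unsupervised contrastive loss, in the spirit of CSSL (not the exact method).
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss between projections of two augmented views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                      # (2N, d)
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    n = z1.size(0)
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool, device=z.device), float('-inf'))
    # positives: view i in the first half pairs with view i + n in the second half
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def cssl_style_loss(logits_x, targets_x, logits_u_weak, logits_u_strong,
                    z1, z2, lambda_u=1.0, lambda_c=1.0, threshold=0.95):
    """Supervised CE + pseudo-label consistency on unlabeled data + contrastive term."""
    sup = F.cross_entropy(logits_x, targets_x)
    probs = F.softmax(logits_u_weak.detach(), dim=1)     # pseudo-label the weak view
    conf, pseudo = probs.max(dim=1)
    mask = (conf >= threshold).float()                   # keep only confident pseudo-labels
    unsup = (F.cross_entropy(logits_u_strong, pseudo, reduction='none') * mask).mean()
    return sup + lambda_u * unsup + lambda_c * nt_xent(z1, z2)
```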
Related papers
- Channel-Wise Contrastive Learning for Learning with Noisy Labels [60.46434734808148] (arXiv: 2023-08-14)
We introduce channel-wise contrastive learning (CWCL) to distinguish authentic label information from noise.
Unlike conventional instance-wise contrastive learning (IWCL), CWCL tends to yield more nuanced and resilient features aligned with the authentic labels.
Our strategy is twofold: first, use CWCL to extract pertinent features and identify cleanly labeled samples; second, progressively fine-tune on these samples.
- Multi-Label Knowledge Distillation [86.03990467785312] (arXiv: 2023-08-12)
We propose a novel multi-label knowledge distillation method.
On one hand, it exploits the informative semantic knowledge from the logits by dividing the multi-label learning problem into a set of binary classification problems.
On the other hand, it enhances the distinctiveness of the learned feature representations by leveraging the structural information of label-wise embeddings.
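The per-label decomposition can be illustrated with a generic one-vs-rest distillation term; the sigmoid/temperature formulation below is an assumption made for illustration, not the paper's definition.

```python
import torch
import torch.nn.functional as F

def binary_decomposed_kd(student_logits, teacher_logits, temperature=2.0):
    """Generic sketch: treat each label as an independent binary problem and
    distill the teacher's per-label probabilities into the student."""
    soft_targets = torch.sigmoid(teacher_logits / temperature).detach()
    return F.binary_cross_entropy_with_logits(
        student_logits / temperature, soft_targets) * (temperature ** 2)
```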
- Unleashing the Potential of Regularization Strategies in Learning with Noisy Labels [65.92994348757743] (arXiv: 2023-07-11)
We demonstrate that a simple baseline using cross-entropy loss, combined with widely used regularization strategies, can outperform state-of-the-art methods.
Our findings suggest that employing a combination of regularization strategies can be more effective than intricate algorithms in tackling the challenges of learning with noisy labels.
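Two of the widely used regularizers such a baseline typically relies on are label smoothing and mixup; the sketch below shows both, with coefficients chosen for illustration rather than taken from the paper.

```python
import torch
import torch.nn.functional as F

def mixup(x, y, num_classes, alpha=4.0):
    """Mixup: convex combinations of inputs and one-hot targets."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    y_onehot = F.one_hot(y, num_classes).float()
    return lam * x + (1 - lam) * x[perm], lam * y_onehot + (1 - lam) * y_onehot[perm]

def smoothed_ce(logits, soft_targets, smoothing=0.1):
    """Cross-entropy against (possibly mixed) soft targets with label smoothing."""
    num_classes = logits.size(1)
    targets = soft_targets * (1 - smoothing) + smoothing / num_classes
    return -(targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```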
- Learning with Neighbor Consistency for Noisy Labels [69.83857578836769] (arXiv: 2022-02-04)
We present a method for learning from noisy labels that leverages similarities between training examples in feature space.
We evaluate our method on datasets with both synthetic (CIFAR-10, CIFAR-100) and realistic (mini-WebVision, Clothing1M, mini-ImageNet-Red) noise.
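The neighbor-based idea above can be sketched as a regularizer that pulls each example's prediction toward a similarity-weighted average of its nearest neighbors' predictions; the choice of k, the cosine similarity, and the KL form below are illustrative assumptions, not the paper's exact regularizer.

```python
import torch
import torch.nn.functional as F

def neighbor_consistency(features, logits, k=5):
    """Generic sketch: encourage each sample's prediction to agree with a
    similarity-weighted average of its k nearest neighbors' predictions."""
    f = F.normalize(features, dim=1)
    sim = f @ f.t()                                   # cosine similarities
    sim.fill_diagonal_(float('-inf'))                 # exclude self-matches
    topk_sim, topk_idx = sim.topk(k, dim=1)
    weights = F.softmax(topk_sim, dim=1)              # (N, k) neighbor weights
    probs = F.softmax(logits, dim=1)
    target = (weights.unsqueeze(-1) * probs[topk_idx]).sum(dim=1)   # (N, C)
    return F.kl_div(F.log_softmax(logits, dim=1), target.detach(),
                    reduction='batchmean')
```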
- Co-learning: Learning from Noisy Labels with Self-supervision [28.266156561454327] (arXiv: 2021-08-05)
Self-supervised learning works in the absence of labels and thus eliminates the negative impact of noisy labels.
Motivated by co-training with both a supervised learning view and a self-supervised learning view, we propose a simple yet effective method called Co-learning for learning with noisy labels.
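The co-training shape described above can be sketched as a shared encoder with two heads: a supervised classifier and a self-supervised projection head trained with a contrastive objective. The architecture and dimensions below are generic assumptions, not the paper's model.

```python
import torch.nn as nn
import torch.nn.functional as F

class TwoHeadNet(nn.Module):
    """Generic sketch: shared encoder with a supervised classification head and
    a self-supervised projection head for a contrastive objective."""
    def __init__(self, encoder, feat_dim, num_classes, proj_dim=128):
        super().__init__()
        self.encoder = encoder
        self.classifier = nn.Linear(feat_dim, num_classes)
        self.projector = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(inplace=True),
            nn.Linear(feat_dim, proj_dim))

    def forward(self, x):
        h = self.encoder(x)
        # classification logits (supervised view) and normalized projection
        # (self-supervised view), both computed from the same features
        return self.classifier(h), F.normalize(self.projector(h), dim=1)
```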
- Learning active learning at the crossroads? evaluation and discussion [0.03807314298073299] (arXiv: 2020-12-16)
Active learning aims to reduce annotation cost by predicting which samples are useful for a human expert to label.
There is no best active learning strategy that consistently outperforms all others in all applications.
We present the results of a benchmark performed on 20 datasets that compares a strategy learned using a recent meta-learning algorithm with margin sampling.
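Margin sampling, the baseline strategy in the benchmark above, queries the pool examples whose top two predicted class probabilities are closest. A short sketch; the function name and NumPy interface are illustrative.

```python
import numpy as np

def margin_sampling(probs, budget):
    """Select the `budget` examples with the smallest gap between the two
    highest predicted class probabilities (the most ambiguous ones)."""
    sorted_probs = np.sort(probs, axis=1)             # ascending per example
    margins = sorted_probs[:, -1] - sorted_probs[:, -2]
    return np.argsort(margins)[:budget]               # indices to send for labeling
```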
- Combating noisy labels by agreement: A joint training method with co-regularization [27.578738673827658] (arXiv: 2020-03-05)
We propose a robust learning paradigm called JoCoR, which aims to reduce the diversity of two networks during training.
We show that JoCoR is superior to many state-of-the-art approaches for learning with noisy labels.
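One way to express that idea is a per-example joint loss of supervised cross-entropy for both networks plus a symmetric KL agreement term, updated only on the small-loss fraction of the batch; the weighting and selection ratio below are illustrative assumptions rather than JoCoR's published hyperparameters.

```python
import torch
import torch.nn.functional as F

def jocor_style_loss(logits1, logits2, targets, lam=0.5, keep_ratio=0.7):
    """Sketch: supervised CE for two networks + symmetric KL co-regularization,
    averaged over the small-loss (likely clean) portion of the batch."""
    ce = (F.cross_entropy(logits1, targets, reduction='none')
          + F.cross_entropy(logits2, targets, reduction='none'))
    logp1, logp2 = F.log_softmax(logits1, dim=1), F.log_softmax(logits2, dim=1)
    agree = (F.kl_div(logp1, logp2.exp(), reduction='none').sum(dim=1)
             + F.kl_div(logp2, logp1.exp(), reduction='none').sum(dim=1))
    per_example = (1 - lam) * ce + lam * agree
    k = max(1, int(keep_ratio * per_example.numel()))
    small_loss, _ = per_example.topk(k, largest=False)   # small-loss selection
    return small_loss.mean()
```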
- DivideMix: Learning with Noisy Labels as Semi-supervised Learning [111.03364864022261] (arXiv: 2020-02-18)
We propose DivideMix, a framework for learning with noisy labels.
Experiments on multiple benchmark datasets demonstrate substantial improvements over state-of-the-art methods.
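DivideMix, which CoDiM builds on, is known for fitting a two-component Gaussian mixture to per-sample training losses and treating the low-loss component as probably clean labeled data, with the rest used as unlabeled data for semi-supervised training. A brief sketch of that division step with scikit-learn; the normalization and threshold are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def divide_by_loss(per_sample_losses, threshold=0.5):
    """Sketch: fit a 2-component GMM to normalized per-sample losses and mark
    samples whose posterior under the low-loss component exceeds `threshold`
    as probably clean; the remainder is treated as unlabeled."""
    losses = np.asarray(per_sample_losses, dtype=np.float64).reshape(-1, 1)
    losses = (losses - losses.min()) / (losses.max() - losses.min() + 1e-8)
    gmm = GaussianMixture(n_components=2, max_iter=100, reg_covar=5e-4)
    gmm.fit(losses)
    clean_component = int(np.argmin(gmm.means_.ravel()))   # low-mean = clean
    prob_clean = gmm.predict_proba(losses)[:, clean_component]
    return prob_clean > threshold, prob_clean
```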