Roll With the Punches: Expansion and Shrinkage of Soft Label Selection
for Semi-supervised Fine-Grained Learning
- URL: http://arxiv.org/abs/2312.12237v1
- Date: Tue, 19 Dec 2023 15:22:37 GMT
- Title: Roll With the Punches: Expansion and Shrinkage of Soft Label Selection
for Semi-supervised Fine-Grained Learning
- Authors: Yue Duan, Zhen Zhao, Lei Qi, Luping Zhou, Lei Wang, Yinghuan Shi
- Abstract summary: We propose Soft Label Selection with Confidence-Aware Clustering based on Class Transition Tracking (SoC). Our approach demonstrates superior performance in SS-FGVC.
- Score: 42.71454054383897
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While semi-supervised learning (SSL) has yielded promising results, a more
realistic SSL scenario remains underexplored: one in which the unlabeled data
exhibits extremely high recognition difficulty, e.g., fine-grained visual
classification in the context of SSL (SS-FGVC). The increased recognition
difficulty of fine-grained unlabeled data severely degrades pseudo-labeling
accuracy, resulting in poor performance of the SSL model. To tackle this
challenge, we propose Soft Label Selection with Confidence-Aware Clustering
based on Class Transition Tracking (SoC), which reconstructs the pseudo-label
selection process as the joint optimization of an Expansion Objective and a
Shrinkage Objective over soft labels. The former encourages soft labels to
absorb more candidate classes so that the ground-truth class is included,
while the latter encourages soft labels to reject noisy classes, which is
theoretically proven to be equivalent to entropy minimization. In comparison
with various state-of-the-art methods, our approach demonstrates superior
performance on SS-FGVC. Checkpoints and source code are available at
https://github.com/NJUyued/SoC4SS-FGVC.
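As a rough illustration of the two objectives, the following minimal PyTorch sketch expands a soft label to a top-k candidate set and then shrinks it via an entropy penalty. It is not the authors' implementation: SoC selects candidate classes by confidence-aware clustering with class transition tracking, whereas the fixed k and the loss weighting below are simplifying assumptions.

    import torch
    import torch.nn.functional as F

    def expand_soft_label(logits, k=5):
        """Expansion: keep the top-k classes as the candidate set so the
        ground-truth class is likely included (fixed k is a simplification
        of SoC's clustering-based selection)."""
        probs = F.softmax(logits, dim=-1)                # (B, C)
        topk_vals, topk_idx = probs.topk(k, dim=-1)      # (B, k)
        soft = torch.zeros_like(probs)
        soft.scatter_(1, topk_idx, topk_vals)
        return soft / soft.sum(dim=-1, keepdim=True)     # renormalize

    def shrinkage_objective(soft, eps=1e-8):
        """Shrinkage: penalizing the entropy of the candidate distribution
        concentrates mass on fewer classes, i.e., rejects noisy ones."""
        return -(soft * (soft + eps).log()).sum(dim=-1).mean()

    def soc_like_loss(student_logits, teacher_logits, k=5, beta=0.5):
        """Train the student on the expanded soft label, and penalize the
        entropy of the student's distribution restricted to the candidate
        set (beta is an assumed trade-off weight)."""
        soft = expand_soft_label(teacher_logits, k)
        ce = -(soft * F.log_softmax(student_logits, dim=-1)).sum(dim=-1).mean()
        mask = (soft > 0).float()
        s_probs = F.softmax(student_logits, dim=-1) * mask
        s_probs = s_probs / s_probs.sum(dim=-1, keepdim=True)
        return ce + beta * shrinkage_objective(s_probs)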
Related papers
- Learning Label Refinement and Threshold Adjustment for Imbalanced Semi-Supervised Learning [6.904448748214652]
Semi-supervised learning algorithms struggle to perform well when exposed to imbalanced training data.
We introduce SEmi-supervised learning with pseudo-label optimization based on VALidation data (SEVAL).
SEVAL adapts to specific tasks with improved pseudo-label accuracy and ensures pseudo-label correctness on a per-class basis, as sketched below.
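A minimal sketch of how per-class thresholds could be fitted on a validation split; the precision-targeting rule below is an assumption for illustration, not necessarily SEVAL's exact objective.

    import torch

    def per_class_thresholds(val_probs, val_labels, num_classes,
                             target_precision=0.9):
        """For each class, pick the smallest confidence threshold whose
        accepted pseudo-labels reach the target precision on validation
        data (hypothetical selection rule)."""
        thresholds = torch.ones(num_classes)   # 1.0 effectively accepts nothing
        conf, pred = val_probs.max(dim=-1)
        for c in range(num_classes):
            for t in torch.linspace(0.5, 0.99, 50):
                sel = (pred == c) & (conf >= t)
                if sel.any() and (val_labels[sel] == c).float().mean() >= target_precision:
                    thresholds[c] = t
                    break
        return thresholds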
arXiv Detail & Related papers (2024-07-07T13:46:22Z)
- SoftMatch: Addressing the Quantity-Quality Trade-off in Semi-supervised
Learning [101.86916775218403]
This paper revisits the popular pseudo-labeling methods via a unified sample weighting formulation.
We propose SoftMatch to overcome the trade-off by maintaining both high quantity and high quality of pseudo-labels during training.
In experiments, SoftMatch shows substantial improvements across a wide variety of benchmarks, including image, text, and imbalanced classification.
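The weighting can be pictured as a truncated Gaussian over prediction confidence; a minimal sketch under that assumption (mu and sigma would be estimated, e.g., by an exponential moving average over recent batches, which is omitted here).

    import torch

    def gaussian_sample_weights(probs, mu, sigma):
        """Soft weighting of unlabeled samples: full weight above the
        running mean confidence mu, smooth Gaussian decay below it, so
        low-confidence pseudo-labels contribute less instead of being
        discarded outright."""
        conf = probs.max(dim=-1).values
        w = torch.exp(-(conf - mu) ** 2 / (2 * sigma ** 2))
        return torch.where(conf >= mu, torch.ones_like(w), w)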
arXiv Detail & Related papers (2023-01-26T03:53:25Z)
- PercentMatch: Percentile-based Dynamic Thresholding for Multi-Label
Semi-Supervised Classification [64.39761523935613]
We propose a percentile-based threshold adjusting scheme to dynamically alter the score thresholds of positive and negative pseudo-labels for each class during training.
We achieve strong performance on the Pascal VOC2007 and MS-COCO datasets compared to recent SSL methods.
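A minimal sketch of percentile-based thresholding for multi-label pseudo-labels; the particular percentiles below are assumptions.

    import torch

    def percentile_thresholds(scores, q_pos=0.90, q_neg=0.10):
        """Per-class dynamic thresholds from score percentiles: scores
        above the q_pos quantile of their class become positive
        pseudo-labels, scores below the q_neg quantile become negative
        ones. scores: (N, C) sigmoid outputs over the unlabeled pool."""
        pos_thr = torch.quantile(scores, q_pos, dim=0)   # (C,)
        neg_thr = torch.quantile(scores, q_neg, dim=0)
        return scores >= pos_thr, scores <= neg_thr      # (N, C) masks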
arXiv Detail & Related papers (2022-08-30T01:27:48Z)
- Complementing Semi-Supervised Learning with Uncertainty Quantification [6.612035830987296]
We propose a novel unsupervised uncertainty-aware objective that relies on aleatoric and epistemic uncertainty quantification.
Our results outperform the state of the art on complex datasets such as CIFAR-100 and Mini-ImageNet.
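One standard way to separate the two kinds of uncertainty is Monte Carlo dropout; a sketch of that decomposition follows (the paper's exact estimator may differ).

    import torch

    def mc_dropout_uncertainty(model, x, passes=10, eps=1e-8):
        """Predictive entropy = expected entropy (aleatoric, data noise)
        + mutual information (epistemic, model uncertainty)."""
        model.train()                  # keep dropout active at inference
        with torch.no_grad():
            probs = torch.stack([model(x).softmax(dim=-1) for _ in range(passes)])
        mean = probs.mean(dim=0)                                 # (B, C)
        total = -(mean * (mean + eps).log()).sum(dim=-1)         # predictive entropy
        aleatoric = -(probs * (probs + eps).log()).sum(dim=-1).mean(dim=0)
        epistemic = total - aleatoric                            # mutual information
        return aleatoric, epistemic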
arXiv Detail & Related papers (2022-07-22T00:15:02Z)
- Transductive CLIP with Class-Conditional Contrastive Learning [68.51078382124331]
We propose Transductive CLIP, a novel framework for learning a classification network with noisy labels from scratch.
A class-conditional contrastive learning mechanism is proposed to mitigate the reliance on pseudo labels.
An ensemble-label strategy is adopted to update pseudo-labels and stabilize the training of deep neural networks with noisy labels; a generic form is sketched below.
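A generic form of such an ensemble-label update is a running average of per-sample predictions; a minimal sketch (the momentum value and normalization are assumptions).

    import torch

    def update_ensemble_labels(ensemble, probs, momentum=0.9):
        """Temporal-ensembling-style update: averaging predictions across
        training steps keeps one noisy epoch from flipping a pseudo-label."""
        ensemble = momentum * ensemble + (1 - momentum) * probs
        return ensemble / ensemble.sum(dim=-1, keepdim=True)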
arXiv Detail & Related papers (2022-06-13T14:04:57Z)
- In Defense of Pseudo-Labeling: An Uncertainty-Aware Pseudo-label
Selection Framework for Semi-Supervised Learning [53.1047775185362]
Pseudo-labeling (PL) is a general SSL approach that does not rely on domain-specific data augmentations, but it performs relatively poorly in its original formulation.
We argue that PL underperforms due to the erroneous high confidence predictions from poorly calibrated models.
We propose an uncertainty-aware pseudo-label selection (UPS) framework which improves pseudo labeling accuracy by drastically reducing the amount of noise encountered in the training process.
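The selection rule can be sketched as a joint gate on confidence and uncertainty; thresholds below are illustrative, and UPS additionally selects confident negative labels, which this sketch omits.

    import torch

    def ups_select(probs, uncertainty, conf_thr=0.7, unc_thr=0.05):
        """Keep a pseudo-label only when the prediction is both confident
        and low-uncertainty, cutting the noise that poorly calibrated
        confidences alone would admit."""
        conf, pseudo = probs.max(dim=-1)
        mask = (conf >= conf_thr) & (uncertainty <= unc_thr)
        return pseudo[mask], mask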
arXiv Detail & Related papers (2021-01-15T23:29:57Z)
- PseudoSeg: Designing Pseudo Labels for Semantic Segmentation [78.35515004654553]
We present a re-design of pseudo-labeling to generate structured pseudo labels for training with unlabeled or weakly-labeled data.
We demonstrate the effectiveness of the proposed pseudo-labeling strategy in both low-data and high-data regimes.
arXiv Detail & Related papers (2020-10-19T17:59:30Z)