Active Self-Semi-Supervised Learning for Few Labeled Samples Fast Training
- URL: http://arxiv.org/abs/2203.04560v1
- Date: Wed, 9 Mar 2022 07:45:05 GMT
- Title: Active Self-Semi-Supervised Learning for Few Labeled Samples Fast Training
- Authors: Ziting Wen, Oscar Pizarro, Stefan Williams
- Abstract summary: Semi-supervised learning has achieved great success in training with few annotations.
Low-quality labeled samples produced by random sampling make it difficult to continue to reduce the number of annotations.
We propose an active self-semi-supervised training framework that bootstraps semi-supervised models with good prior pseudo-labels.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Faster training and fewer annotations are two key issues for applying deep
models to various practical domains. Semi-supervised learning has achieved
great success in training with few annotations. However, low-quality labeled
samples produced by random sampling make it difficult to continue to reduce the
number of annotations. In this paper we propose an active self-semi-supervised
training framework that bootstraps semi-supervised models with good prior
pseudo-labels, where the priors are obtained by label propagation over
self-supervised features. Because the accuracy of the prior is affected not
only by the quality of the features but also by the selection of the labeled
samples, we develop active learning and label propagation strategies to obtain
better prior pseudo-labels. Consequently, our framework can greatly improve the
performance of models with few annotations and greatly reduce the training
time. Experiments on three semi-supervised learning benchmarks demonstrate its
effectiveness. Our method achieves accuracy similar to standard semi-supervised
approaches in about 1/3 of the training time, and even outperforms them when
fewer annotations are available (84.10% on CIFAR-10 with 10 labels).
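The core mechanism described in the abstract, propagating a handful of known labels over a similarity graph built from self-supervised features, can be sketched as follows. This is a minimal illustration of graph-based label propagation (in the style of the classic diffusion scheme), not the paper's exact implementation; the function name, hyperparameters, and k-NN graph construction here are assumptions for the sketch.

```python
import numpy as np

def propagate_labels(features, labels, n_classes, k=5, alpha=0.99, n_iter=50):
    """Spread the few known labels over a k-NN similarity graph.

    features: (n, d) array, e.g. self-supervised embeddings.
    labels:   (n,) int array; class index for labeled points, -1 for unlabeled.
    Returns a prior pseudo-label for every point.
    """
    n = features.shape[0]
    # Cosine similarity between L2-normalised features.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, 0.0)

    # Keep only the k strongest edges per node (sparse affinity), then symmetrise.
    w = np.zeros_like(sim)
    idx = np.argsort(-sim, axis=1)[:, :k]
    rows = np.arange(n)[:, None]
    w[rows, idx] = sim[rows, idx]
    w = (w + w.T) / 2

    # Symmetrically normalised diffusion operator S = D^{-1/2} W D^{-1/2}.
    d = w.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    s = w * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # One-hot seed matrix; unlabeled rows start at zero.
    y = np.zeros((n, n_classes))
    labeled = labels >= 0
    y[labeled, labels[labeled]] = 1.0

    # Iterative diffusion: F <- alpha * S @ F + (1 - alpha) * Y.
    f_scores = y.copy()
    for _ in range(n_iter):
        f_scores = alpha * (s @ f_scores) + (1 - alpha) * y
    return f_scores.argmax(axis=1)
```

If the self-supervised features separate the classes well, a single labeled point per cluster is enough for the diffusion to assign the correct prior pseudo-label to every unlabeled point in that cluster, which is what makes the choice of which points to label (the active learning component) so influential.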
Related papers
- Incremental Self-training for Semi-supervised Learning [56.57057576885672]
IST is simple yet effective and fits existing self-training-based semi-supervised learning methods.
We verify the proposed IST on five datasets and two types of backbone, effectively improving the recognition accuracy and learning speed.
arXiv Detail & Related papers (2024-04-14T05:02:00Z)
- Semi-Supervised Class-Agnostic Motion Prediction with Pseudo Label Regeneration and BEVMix [59.55173022987071]
We study the potential of semi-supervised learning for class-agnostic motion prediction.
Our framework adopts a consistency-based self-training paradigm, enabling the model to learn from unlabeled data.
Our method exhibits comparable performance to weakly and some fully supervised methods.
arXiv Detail & Related papers (2023-12-13T09:32:50Z)
- All Points Matter: Entropy-Regularized Distribution Alignment for Weakly-supervised 3D Segmentation [67.30502812804271]
Pseudo-labels are widely employed in weakly supervised 3D segmentation tasks where only sparse ground-truth labels are available for learning.
We propose a novel learning strategy to regularize the generated pseudo-labels and effectively narrow the gaps between pseudo-labels and model predictions.
arXiv Detail & Related papers (2023-05-25T08:19:31Z)
- Active Self-Training for Weakly Supervised 3D Scene Semantic Segmentation [17.27850877649498]
We introduce a method for weakly supervised segmentation of 3D scenes that combines self-training and active learning.
We demonstrate that our approach leads to an effective method that provides improvements in scene segmentation over previous works and baselines.
arXiv Detail & Related papers (2022-09-15T06:00:25Z)
- Semi-supervised 3D Object Detection with Proficient Teachers [114.54835359657707]
Dominant point cloud-based 3D object detectors in autonomous driving scenarios rely heavily on huge amounts of accurately labeled samples.
Pseudo-labeling is commonly used in SSL frameworks; however, low-quality predictions from the teacher model have seriously limited its performance.
We propose a new pseudo-labeling framework for semi-supervised 3D object detection that enhances the teacher model into a proficient one with several necessary designs.
arXiv Detail & Related papers (2022-07-26T04:54:03Z)
- Open-Set Semi-Supervised Learning for 3D Point Cloud Understanding [62.17020485045456]
It is commonly assumed in semi-supervised learning (SSL) that the unlabeled data are drawn from the same distribution as that of the labeled ones.
We propose to selectively utilize unlabeled data through sample weighting, so that only conducive unlabeled data would be prioritized.
arXiv Detail & Related papers (2022-05-02T16:09:17Z)
- Uncertainty-aware Self-training for Text Classification with Few Labels [54.13279574908808]
We study self-training as one of the earliest semi-supervised learning approaches to reduce the annotation bottleneck.
We propose an approach to improve self-training by incorporating uncertainty estimates of the underlying neural network.
We show that our method, using only 20-30 labeled samples per class per task for training and validation, can perform within 3% of fully supervised pre-trained language models.
arXiv Detail & Related papers (2020-06-27T08:13:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.