Couple Learning: Mean Teacher method with pseudo-labels improves
semi-supervised deep learning results
- URL: http://arxiv.org/abs/2110.05809v1
- Date: Tue, 12 Oct 2021 08:11:39 GMT
- Title: Couple Learning: Mean Teacher method with pseudo-labels improves
semi-supervised deep learning results
- Authors: Rui Tao, Long Yan, Kazushige Ouchi, Xiangdong Wang
- Abstract summary: Mean Teacher has achieved state-of-the-art results in several semi-supervised learning benchmarks.
The proposed pseudo-label generation model (PLG) can augment both the strongly-labeled and weakly-labeled data.
The Couple Learning method can extract more information from the compound training data.
- Score: 5.218882272051637
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The recently proposed Mean Teacher method has achieved state-of-the-art
results on several semi-supervised learning benchmarks by exploiting large-scale
unlabeled data in a self-ensembling manner. In this paper, an effective Couple
Learning method based on a well-trained model and a Mean Teacher model is
proposed. The proposed pseudo-label generation model (PLG) augments both the
strongly-labeled and weakly-labeled data to improve the performance of the Mean
Teacher method, while the Mean Teacher method suppresses noise in the
pseudo-labeled data. The Couple Learning method can therefore extract more
information from the compound training data. Experimental results on Task 4 of
the DCASE2020 challenge demonstrate the superiority of the proposed method,
which achieves about a 39.18% F1-score on the public evaluation set,
outperforming the baseline system's 37.12% by a significant margin.
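The two mechanisms the abstract describes, a teacher that tracks a smoothed copy of the student and a filter that admits only trustworthy pseudo-labels, can be illustrated with a minimal sketch. The confidence-threshold rule below is a hypothetical stand-in, since the abstract does not specify how the PLG model selects pseudo-labels; the exponential-moving-average update is the standard Mean Teacher weight rule.

```python
import numpy as np

def ema_update(teacher_w, student_w, alpha=0.999):
    """Mean Teacher weight update: the teacher's parameters are an
    exponential moving average of the student's parameters."""
    return [alpha * t + (1 - alpha) * s for t, s in zip(teacher_w, student_w)]

def filter_pseudo_labels(probs, threshold=0.9):
    """Keep only confident predictions as pseudo-labels (a hypothetical
    filtering rule; the paper's PLG internals are not given here)."""
    labels = probs.argmax(axis=1)
    keep = probs.max(axis=1) >= threshold
    return labels[keep], keep

# Three unlabeled clips scored by a well-trained model; the middle
# prediction is too uncertain and is dropped from the pseudo-label set.
probs = np.array([[0.95, 0.05], [0.60, 0.40], [0.10, 0.90]])
labels, keep = filter_pseudo_labels(probs)
print(labels.tolist(), keep.tolist())  # [0, 1] [True, False, True]
```

Raising the threshold trades pseudo-label quantity for quality; the Mean Teacher consistency loss then further damps whatever label noise survives the filter.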
Related papers
- Dual-Decoupling Learning and Metric-Adaptive Thresholding for Semi-Supervised Multi-Label Learning [81.83013974171364]
Semi-supervised multi-label learning (SSMLL) is a powerful framework for leveraging unlabeled data to reduce the expensive cost of collecting precise multi-label annotations.
Unlike in standard semi-supervised learning, one cannot simply select the most probable label as the pseudo-label in SSMLL, because an instance may contain multiple semantics.
We propose a dual-perspective method to generate high-quality pseudo-labels.
arXiv Detail & Related papers (2024-07-26T09:33:53Z)
- Collaboration of Teachers for Semi-supervised Object Detection [20.991741476731967]
We propose the Collaboration of Teachers Framework (CTF), which consists of multiple pairs of teacher and student models for training.
This framework greatly improves the utilization of unlabeled data and prevents the positive feedback cycle of unreliable pseudo-labels.
arXiv Detail & Related papers (2024-05-22T06:17:50Z)
- Noisy Node Classification by Bi-level Optimization based Multi-teacher Distillation [17.50773984154023]
We propose a new multi-teacher distillation method based on bi-level optimization (namely BO-NNC) to conduct noisy node classification on the graph data.
Specifically, we first employ multiple self-supervised learning methods to train diverse teacher models, and then aggregate their predictions through a teacher weight matrix.
Furthermore, we design a new bi-level optimization strategy to dynamically adjust the teacher weight matrix based on the training progress of the student model.
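As a rough illustration of the aggregation step described above, here is a simplified stand-in where a single normalized weight vector (rather than BO-NNC's dynamically learned teacher weight matrix) combines per-teacher class probabilities:

```python
import numpy as np

def aggregate_teachers(teacher_probs, weights):
    """Weighted average of per-teacher class probabilities.
    teacher_probs has shape (num_teachers, num_nodes, num_classes)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the output remains a distribution
    return np.tensordot(w, teacher_probs, axes=1)

# Two teachers disagree on one node; the 3:1 weighting pulls the
# aggregated prediction toward the first teacher.
t = np.array([[[0.8, 0.2]],
              [[0.4, 0.6]]])
print(aggregate_teachers(t, [3, 1]))
```

In the paper the weights are adjusted during training via bi-level optimization against the student's progress; the fixed vector here only shows the shape of the aggregation.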
arXiv Detail & Related papers (2024-04-27T12:19:08Z)
- ESimCSE Unsupervised Contrastive Learning Jointly with UDA Semi-Supervised Learning for Large Label System Text Classification Model [4.708633772366381]
The ESimCSE model efficiently learns text vector representations using unlabeled data to achieve better classification results.
UDA is trained on unlabeled data through semi-supervised learning methods to improve the prediction performance and stability of the models.
The adversarial training techniques FGM and PGD are used during model training to improve the robustness and reliability of the model.
arXiv Detail & Related papers (2023-04-19T03:44:23Z)
- Active Teacher for Semi-Supervised Object Detection [80.10937030195228]
We propose a novel algorithm called Active Teacher for semi-supervised object detection (SSOD).
Active Teacher extends the teacher-student framework to an iterative version, where the label set is partially and gradually augmented by evaluating three key factors of unlabeled examples.
With this design, Active Teacher can maximize the effect of limited label information while improving the quality of pseudo-labels.
arXiv Detail & Related papers (2023-03-15T03:59:27Z)
- Boosting Facial Expression Recognition by A Semi-Supervised Progressive Teacher [54.50747989860957]
We propose a semi-supervised learning algorithm named Progressive Teacher (PT) to utilize reliable FER datasets as well as large-scale unlabeled expression images for effective training.
Experiments on widely-used databases RAF-DB and FERPlus validate the effectiveness of our method, which achieves state-of-the-art performance with accuracy of 89.57% on RAF-DB.
arXiv Detail & Related papers (2022-05-28T07:47:53Z)
- Investigating a Baseline Of Self Supervised Learning Towards Reducing Labeling Costs For Image Classification [0.0]
The study uses the Kaggle cats-vs-dogs dataset, MNIST, and Fashion-MNIST to investigate the self-supervised learning task.
Results show that the pretext process in self-supervised learning improves accuracy by around 15% on the downstream classification task.
arXiv Detail & Related papers (2021-08-17T06:43:05Z)
- A Novel Perspective for Positive-Unlabeled Learning via Noisy Labels [49.990938653249415]
This research presents a methodology that assigns initial pseudo-labels to unlabeled data, treats them as noisy labels, and trains a deep neural network on the resulting noisy-labeled data.
Experimental results demonstrate that the proposed method significantly outperforms the state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-03-08T11:46:02Z)
- SLADE: A Self-Training Framework For Distance Metric Learning [75.54078592084217]
We present a self-training framework, SLADE, to improve retrieval performance by leveraging additional unlabeled data.
We first train a teacher model on the labeled data and use it to generate pseudo labels for the unlabeled data.
We then train a student model on both labels and pseudo labels to generate final feature embeddings.
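The three-step recipe in this summary (fit a teacher on labeled data, pseudo-label the unlabeled pool, fit a student on the union) can be sketched as follows. The nearest-centroid "model" is a hypothetical stand-in for the embedding networks used in the paper, chosen only to keep the loop self-contained:

```python
import numpy as np

def fit_centroids(X, y):
    """Toy 'model': one centroid per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, X):
    """Assign each point to its nearest class centroid."""
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

X_lab = np.array([[0.0, 0.0], [1.0, 1.0]])
y_lab = np.array([0, 1])
X_unl = np.array([[0.1, 0.1], [0.9, 0.8]])

teacher = fit_centroids(X_lab, y_lab)               # step 1: teacher on labels
pseudo = predict(teacher, X_unl)                    # step 2: pseudo-label pool
student = fit_centroids(np.vstack([X_lab, X_unl]),  # step 3: student on union
                        np.concatenate([y_lab, pseudo]))
print(pseudo.tolist())  # [0, 1]
```

The student sees twice the training data of the teacher, which is the point of the self-training loop; SLADE additionally refines the pseudo-labels before the student step.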
arXiv Detail & Related papers (2020-11-20T08:26:10Z)
- Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation [54.49894381464853]
We propose to leverage both labeled and unlabeled data for instance segmentation with improved accuracy by knowledge distillation.
We propose a novel Mask-guided Mean Teacher framework with Perturbation-sensitive Sample Mining.
Experiments show that the proposed method improves the performance significantly compared with the supervised method learned from labeled data only.
arXiv Detail & Related papers (2020-07-21T13:27:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.