FedTriNet: A Pseudo Labeling Method with Three Players for Federated
Semi-supervised Learning
- URL: http://arxiv.org/abs/2109.05612v1
- Date: Sun, 12 Sep 2021 21:00:25 GMT
- Title: FedTriNet: A Pseudo Labeling Method with Three Players for Federated
Semi-supervised Learning
- Authors: Liwei Che and Zewei Long and Jiaqi Wang and Yaqing Wang and Houping
Xiao and Fenglong Ma
- Abstract summary: In this paper, we propose a novel federated semi-supervised learning method named FedTriNet.
In particular, we propose to use three networks and a dynamic quality control mechanism to generate high-quality pseudo labels for unlabeled data.
Experimental results on three publicly available datasets show that the proposed FedTriNet outperforms state-of-the-art baselines.
- Score: 24.720014822365684
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning has shown great potential for distributed data
utilization and privacy protection. Most existing federated learning approaches
focus on the supervised setting, in which all the data stored on each client
is labeled. However, in real-world applications, client data can rarely be
fully labeled, so exploiting unlabeled data becomes a new challenge for
federated learning. Although a few studies attempt to overcome this challenge,
they may suffer from information leakage or from the use of misleading
information. To tackle these issues, in this paper, we propose a
novel federated semi-supervised learning method named FedTriNet, which consists
of two learning phases. In the first phase, we pre-train FedTriNet using
labeled data with FedAvg. In the second phase, we aim to make the most of the
unlabeled data to help model learning. In particular, we propose to use three
networks and a dynamic quality control mechanism to generate high-quality
pseudo labels for unlabeled data, which are added to the training set. Finally,
FedTriNet uses the new training set to retrain the model. Experimental results
on three publicly available datasets show that the proposed FedTriNet
outperforms state-of-the-art baselines under both IID and Non-IID settings.
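For intuition, the sketch below outlines the two-phase recipe from the abstract under stated assumptions: FedAvg-style weight averaging for the labeled pre-training phase, and a simple agreement-plus-confidence filter standing in for the paper's three-network dynamic quality control. All function and variable names are illustrative, not the authors' implementation.

```python
# Hypothetical sketch of a FedTriNet-style pipeline (illustrative names only).
# Phase 1: FedAvg pre-training on labeled client data.
# Phase 2: three networks vote on unlabeled data; predictions that are
#          confident and agreed upon become pseudo labels for retraining.
import copy
import torch
import torch.nn.functional as F

def fedavg(global_model, client_models, client_sizes):
    """Average client weights, weighted by local dataset size (FedAvg)."""
    total = sum(client_sizes)
    avg_state = copy.deepcopy(client_models[0].state_dict())
    for key in avg_state:
        avg_state[key] = sum(
            m.state_dict()[key].float() * (n / total)
            for m, n in zip(client_models, client_sizes)
        )
    global_model.load_state_dict(avg_state)
    return global_model

@torch.no_grad()
def pseudo_label(nets, unlabeled_x, threshold):
    """Keep unlabeled samples on which all three networks agree with high
    confidence; `threshold` stands in for the dynamic quality control rule."""
    probs = [F.softmax(net(unlabeled_x), dim=1) for net in nets]
    preds = [p.argmax(dim=1) for p in probs]
    confident = torch.stack(
        [p.max(dim=1).values >= threshold for p in probs]
    ).all(dim=0)
    agree = (preds[0] == preds[1]) & (preds[1] == preds[2])
    keep = confident & agree
    return unlabeled_x[keep], preds[0][keep]
```

Under these assumptions, the pseudo-labeled pairs returned by pseudo_label would be merged into the training set before each client retrains and the server aggregates again.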
Related papers
- FedAnchor: Enhancing Federated Semi-Supervised Learning with Label Contrastive Loss for Unlabeled Clients [19.3885479917635]
Federated learning (FL) is a distributed learning paradigm that facilitates collaborative training of a shared global model across devices.
We propose FedAnchor, an innovative FSSL method that introduces a unique double-head structure, called anchor head, paired with the classification head trained exclusively on labeled anchor data on the server.
Our approach mitigates the confirmation bias and overfitting issues associated with pseudo-labeling techniques based on high-confidence model prediction samples.
arXiv Detail & Related papers (2024-02-15T18:48:21Z)
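As a rough illustration of the double-head idea summarized above, the hypothetical sketch below pairs a classification head with an anchor embedding head and assigns pseudo labels to unlabeled client samples by cosine similarity to class-averaged anchor embeddings. FedAnchor's actual architecture and label contrastive loss are defined in that paper; every name and detail here is an assumption.

```python
# Hypothetical double-head sketch inspired by the FedAnchor summary above
# (illustrative only; not the paper's actual architecture or loss).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoubleHeadNet(nn.Module):
    def __init__(self, backbone, feat_dim, num_classes, anchor_dim=128):
        super().__init__()
        self.backbone = backbone                            # shared feature extractor
        self.cls_head = nn.Linear(feat_dim, num_classes)    # classification head
        self.anchor_head = nn.Linear(feat_dim, anchor_dim)  # embedding ("anchor") head

    def forward(self, x):
        h = self.backbone(x)
        return self.cls_head(h), F.normalize(self.anchor_head(h), dim=1)

@torch.no_grad()
def similarity_pseudo_labels(model, unlabeled_x, class_anchors):
    """Assign pseudo labels by cosine similarity to per-class anchor embeddings.

    `class_anchors` is assumed to be a [num_classes, anchor_dim] tensor of
    class-averaged anchor-head embeddings of the server's labeled anchor data.
    """
    _, z = model(unlabeled_x)                        # [batch, anchor_dim], L2-normalized
    sims = z @ F.normalize(class_anchors, dim=1).T   # cosine similarities
    return sims.argmax(dim=1)
```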
- Dual-Perspective Knowledge Enrichment for Semi-Supervised 3D Object Detection [55.210991151015534]
We present a novel Dual-Perspective Knowledge Enrichment approach named DPKE for semi-supervised 3D object detection.
Our DPKE enriches the knowledge of limited training data, particularly unlabeled data, from two perspectives: data-perspective and feature-perspective.
arXiv Detail & Related papers (2024-01-10T08:56:07Z)
- Distribution Shift Matters for Knowledge Distillation with Webly Collected Images [91.66661969598755]
We propose a novel method dubbed "Knowledge Distillation between Different Distributions" (KD$^3$).
We first dynamically select useful training instances from the webly collected data according to the combined predictions of the teacher network and the student network.
We also build a new contrastive learning block called MixDistribution to generate perturbed data with a new distribution for instance alignment.
arXiv Detail & Related papers (2023-07-21T10:08:58Z)
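The selection step described above might look roughly like the following sketch, which keeps webly collected samples whose combined teacher-student prediction is confident and consistent with the noisy web label. The actual KD$^3$ criterion is given in that paper; the threshold and names here are assumptions.

```python
# Hypothetical sketch of the instance-selection step summarized above
# (the actual KD^3 criterion is defined in that paper; this is an assumption).
import torch
import torch.nn.functional as F

@torch.no_grad()
def select_web_instances(teacher, student, web_x, web_y, threshold=0.8):
    """Keep webly collected samples whose combined teacher+student prediction
    is confident and consistent with the (noisy) web label."""
    p_t = F.softmax(teacher(web_x), dim=1)
    p_s = F.softmax(student(web_x), dim=1)
    p = 0.5 * (p_t + p_s)                # combined prediction
    conf, pred = p.max(dim=1)
    keep = (conf >= threshold) & (pred == web_y)
    return web_x[keep], web_y[keep]
```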
- Hierarchical Supervision and Shuffle Data Augmentation for 3D Semi-Supervised Object Detection [90.32180043449263]
State-of-the-art 3D object detectors are usually trained on large-scale datasets with high-quality 3D annotations.
A natural remedy is to adopt semi-supervised learning (SSL) by leveraging a limited amount of labeled samples and abundant unlabeled samples.
This paper introduces a novel approach of Hierarchical Supervision and Shuffle Data Augmentation (HSSDA), which is a simple yet effective teacher-student framework.
arXiv Detail & Related papers (2023-04-04T02:09:32Z)
- FedIL: Federated Incremental Learning from Decentralized Unlabeled Data with Convergence Analysis [23.70951896315126]
This work considers a server that holds a small labeled dataset and aims to use the unlabeled data on multiple clients for semi-supervised learning.
We propose a new framework with a generalized model, Federated Incremental Learning (FedIL), to address the problem of how to utilize labeled data in the server and unlabeled data in clients separately.
arXiv Detail & Related papers (2023-02-23T07:12:12Z)
- Open-Set Semi-Supervised Learning for 3D Point Cloud Understanding [62.17020485045456]
It is commonly assumed in semi-supervised learning (SSL) that the unlabeled data are drawn from the same distribution as the labeled data.
We propose to selectively utilize unlabeled data through sample weighting, so that only conducive unlabeled data is prioritized.
arXiv Detail & Related papers (2022-05-02T16:09:17Z)
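A minimal sketch of the sample-weighting idea follows, assuming max-softmax confidence as a stand-in weight so that presumably out-of-distribution unlabeled samples contribute little to the unsupervised loss. The paper's actual weighting rule differs, and all names are illustrative.

```python
# Hypothetical per-sample weighting of unlabeled data (confidence as a proxy
# weight; the paper's actual weighting rule is defined there).
import torch
import torch.nn.functional as F

def weighted_unlabeled_loss(model, unlabeled_x):
    logits = model(unlabeled_x)
    probs = F.softmax(logits, dim=1)
    with torch.no_grad():
        weights, pseudo = probs.max(dim=1)       # confidence as a proxy weight
    per_sample = F.cross_entropy(logits, pseudo, reduction="none")
    return (weights * per_sample).sum() / weights.sum().clamp(min=1e-8)
```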
- PointMatch: A Consistency Training Framework for Weakly Supervised Semantic Segmentation of 3D Point Clouds [117.77841399002666]
We propose a novel framework, PointMatch, that leverages both data and labels by applying consistency regularization to sufficiently probe information from the data itself.
The proposed PointMatch achieves state-of-the-art performance under various weakly-supervised schemes on both the ScanNet-v2 and S3DIS datasets.
arXiv Detail & Related papers (2022-02-22T07:26:31Z)
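The consistency-regularization idea could be sketched with a FixMatch-style stand-in as below: pseudo-label a weakly augmented view and train a strongly augmented view to match it. PointMatch's actual point-cloud formulation is defined in that paper; the threshold and names here are assumptions.

```python
# Hypothetical consistency-regularization sketch (FixMatch-style stand-in;
# PointMatch's actual formulation for point clouds is defined in that paper).
import torch
import torch.nn.functional as F

def consistency_loss(model, weak_view, strong_view, threshold=0.95):
    """Pseudo-label the weakly augmented view; make the strongly augmented
    view match it, counting only confident predictions."""
    with torch.no_grad():
        probs = F.softmax(model(weak_view), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = (conf >= threshold).float()
    loss = F.cross_entropy(model(strong_view), pseudo, reduction="none")
    return (mask * loss).sum() / mask.sum().clamp(min=1.0)
```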
- FedCon: A Contrastive Framework for Federated Semi-Supervised Learning [26.520767887801142]
Federated Semi-Supervised Learning (FedSSL) has gained increasing attention from both academic and industrial researchers.
FedCon introduces a new learning paradigm, i.e., contrastive learning, to FedSSL.
arXiv Detail & Related papers (2021-09-09T19:47:21Z)
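As a generic stand-in for the contrastive objective mentioned above, the sketch below computes a SimCLR-style NT-Xent loss over two augmented views of unlabeled client samples; FedCon's actual objective and training procedure are defined in that paper.

```python
# Hypothetical SimCLR-style contrastive loss on unlabeled client data
# (a generic stand-in; FedCon's actual objective is defined in that paper).
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss over two batches of embeddings of the same unlabeled
    samples under different augmentations."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # [2N, d]
    sim = z @ z.T / temperature                          # scaled cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))                # drop self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```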
- Federated Semi-Supervised Learning with Inter-Client Consistency & Disjoint Learning [78.88007892742438]
We study two essential scenarios of Federated Semi-Supervised Learning (FSSL) based on the location of the labeled data.
We propose a novel method to tackle these problems, which we refer to as Federated Matching (FedMatch).
arXiv Detail & Related papers (2020-06-22T09:43:41Z)
- Exploiting Unlabeled Data in Smart Cities using Federated Learning [2.362412515574206]
Federated learning is an effective technique to avoid privacy infringement as well as to increase the utilization of data.
We propose a semi-supervised federated learning method called FedSem that exploits unlabeled data.
We show that FedSem can improve accuracy by up to 8% by utilizing the unlabeled data in the learning process.
arXiv Detail & Related papers (2020-01-10T13:25:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides (including all content) and is not responsible for any consequences of its use.