Dual Class-Aware Contrastive Federated Semi-Supervised Learning
- URL: http://arxiv.org/abs/2211.08914v2
- Date: Sun, 7 May 2023 15:01:36 GMT
- Title: Dual Class-Aware Contrastive Federated Semi-Supervised Learning
- Authors: Qi Guo, Yong Qi, Saiyu Qi, Di Wu
- Abstract summary: We present a novel Federated Semi-Supervised Learning (FSSL) method called Dual Class-aware Contrastive Federated Semi-Supervised Learning (DCCFSSL)
By implementing a dual class-aware contrastive module, DCCFSSL establishes a unified training objective for different clients to tackle large deviations.
Our experiments show that DCCFSSL outperforms current state-of-the-art methods on three benchmark datasets.
- Score: 9.742389743497045
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated semi-supervised learning (FSSL) enables labeled clients and
unlabeled clients to jointly train a global model without sharing private data.
Existing FSSL methods predominantly employ pseudo-labeling and consistency
regularization to exploit the knowledge of unlabeled data, achieving notable
success in raw data utilization. However, these training processes are hindered
by large deviations between uploaded local models of labeled and unlabeled
clients, as well as confirmation bias introduced by noisy pseudo-labels, both
of which negatively affect the global model's performance. In this paper, we
present a novel FSSL method called Dual Class-aware Contrastive Federated
Semi-Supervised Learning (DCCFSSL). This method accounts for both the local
class-aware distribution of each client's data and the global class-aware
distribution of all clients' data within the feature space. By implementing a
dual class-aware contrastive module, DCCFSSL establishes a unified training
objective for different clients to tackle large deviations and incorporates
contrastive information in the feature space to mitigate confirmation bias.
Moreover, DCCFSSL introduces an authentication-reweighted aggregation technique
to improve the server's aggregation robustness. Our comprehensive experiments
show that DCCFSSL outperforms current state-of-the-art methods on three
benchmark datasets and surpasses FedAvg with relabeled unlabeled clients on the
CIFAR-10, CIFAR-100, and STL-10 datasets. To our knowledge, we are the first to
present an FSSL method that uses only 10% labeled clients while still
achieving superior performance compared to standard federated supervised
learning, which uses labeled data from all clients.
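Since no code accompanies this summary, the dual objective can be illustrated with a minimal PyTorch sketch. It assumes the module contrasts each (pseudo-)labeled feature against both local, client-side class prototypes and global, server-aggregated class prototypes; the names, prototype construction, and exact loss form are assumptions, not DCCFSSL's actual implementation.

```python
# Minimal sketch of a dual class-aware contrastive objective (assumed form).
import torch
import torch.nn.functional as F

def class_aware_contrastive(feats, labels, prototypes, tau=0.5):
    """InfoNCE-style loss: pull each L2-normalized feature toward the
    prototype of its (pseudo-)label, push it away from other prototypes."""
    feats = F.normalize(feats, dim=1)          # (B, D)
    protos = F.normalize(prototypes, dim=1)    # (C, D)
    logits = feats @ protos.t() / tau          # (B, C) scaled cosine sims
    return F.cross_entropy(logits, labels)

def dual_class_aware_loss(feats, pseudo_labels,
                          local_protos, global_protos, lam=1.0):
    # Local term: align with this client's class-aware feature distribution.
    l_local = class_aware_contrastive(feats, pseudo_labels, local_protos)
    # Global term: align with the server-shared distribution, giving all
    # clients one unified target and shrinking inter-client model deviation.
    l_global = class_aware_contrastive(feats, pseudo_labels, global_protos)
    return l_local + lam * l_global
```

The global term is what gives heterogeneous clients a common training objective, while contrasting against class prototypes in feature space, rather than trusting raw pseudo-labels alone, is the part aimed at confirmation bias.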
Related papers
- (FL)$^2$: Overcoming Few Labels in Federated Semi-Supervised Learning [4.803231218533992]
Federated Learning (FL) is a distributed machine learning framework that trains accurate global models while preserving clients' privacy-sensitive data.
Most FL approaches assume that clients possess labeled data, which is often not the case in practice.
We propose $(FL)^2$, a robust training method for unlabeled clients using sharpness-aware consistency regularization.
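As a rough illustration of what sharpness-aware consistency regularization could look like on an unlabeled client, the sketch below combines a FixMatch-style weak/strong consistency loss with a SAM weight perturbation; the threshold, rho, and overall procedure are assumptions, not the published $(FL)^2$ algorithm.

```python
# Sketch: SAM-perturbed consistency step on an unlabeled batch (assumed form).
import torch
import torch.nn.functional as F

def sharpness_aware_consistency_step(model, opt, x_weak, x_strong,
                                     rho=0.05, thresh=0.95):
    def consistency_loss():
        with torch.no_grad():                       # pseudo-label weak view
            probs = F.softmax(model(x_weak), dim=1)
            conf, pseudo = probs.max(dim=1)
            mask = (conf >= thresh).float()         # keep confident labels
        loss = F.cross_entropy(model(x_strong), pseudo, reduction="none")
        return (loss * mask).mean()

    opt.zero_grad()
    consistency_loss().backward()                   # grad at current weights
    grads = [None if p.grad is None else p.grad.detach().clone()
             for p in model.parameters()]
    norm = torch.sqrt(sum((g ** 2).sum() for g in grads if g is not None))
    with torch.no_grad():                           # ascend to a sharp point
        for p, g in zip(model.parameters(), grads):
            if g is not None:
                p.add_(rho * g / (norm + 1e-12))
    opt.zero_grad()
    consistency_loss().backward()                   # grad at perturbed point
    with torch.no_grad():                           # restore original weights
        for p, g in zip(model.parameters(), grads):
            if g is not None:
                p.sub_(rho * g / (norm + 1e-12))
    opt.step()                                      # SAM-style update
```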
arXiv Detail & Related papers (2024-10-30T17:15:02Z)
- Combating Data Imbalances in Federated Semi-supervised Learning with Dual Regulators [40.12377870379059]
Federated semi-supervised learning (FSSL) has emerged to train models from a small fraction of labeled data.
We propose a novel FSSL framework with dual regulators, FedDure.
We show that FedDure is superior to the existing methods across a wide range of settings.
arXiv Detail & Related papers (2023-07-11T15:45:03Z)
- Towards Unbiased Training in Federated Open-world Semi-supervised Learning [15.08153616709326]
We propose a novel Federated open-world Semi-Supervised Learning (FedoSSL) framework, which can solve the key challenge in distributed and open-world settings.
We adopt an uncertainty-aware suppressed loss to alleviate the biased training between locally unseen and globally unseen classes.
The proposed FedoSSL can be easily adapted to state-of-the-art FL methods, which is also validated via extensive experiments on benchmarks and real-world datasets.
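The abstract does not define the uncertainty-aware suppressed loss; one plausible reading, sketched below purely as an assumption, is a cross-entropy down-weighted by the normalized prediction entropy so that uncertain (e.g., unseen-class) samples contribute less.

```python
# Hypothetical uncertainty-suppressed loss (assumed form, not FedoSSL's).
import math
import torch
import torch.nn.functional as F

def uncertainty_suppressed_ce(logits, pseudo_labels):
    probs = F.softmax(logits, dim=1)
    # Normalized entropy in [0, 1]; 0 means a fully confident prediction.
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
    weight = 1.0 - entropy / math.log(logits.size(1))
    ce = F.cross_entropy(logits, pseudo_labels, reduction="none")
    return (weight.detach() * ce).mean()        # suppress uncertain samples
```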
arXiv Detail & Related papers (2023-05-01T11:12:37Z)
- Federated Semi-Supervised Learning with Annotation Heterogeneity [57.12560313403097]
We propose a novel framework called Heterogeneously Annotated Semi-Supervised LEarning (HASSLE)
It is a dual-model framework with two models trained separately on labeled and unlabeled data.
The dual models can implicitly learn from both types of data across different clients, even though each model is trained locally on only a single type of data.
arXiv Detail & Related papers (2023-03-04T16:04:49Z)
- Knowledge-Aware Federated Active Learning with Non-IID Data [75.98707107158175]
We propose a federated active learning paradigm to efficiently learn a global model with a limited annotation budget.
The main challenge faced by federated active learning is the mismatch between the active sampling goal of the global model on the server and that of the local clients.
We propose Knowledge-Aware Federated Active Learning (KAFAL), which consists of Knowledge-Specialized Active Sampling (KSAS) and Knowledge-Compensatory Federated Update (KCFU)
arXiv Detail & Related papers (2022-11-24T13:08:43Z)
- Federated Semi-Supervised Learning with Prototypical Networks [18.82809442813657]
We propose ProtoFSSL, a novel FSSL approach based on prototypical networks.
In ProtoFSSL, clients share knowledge with each other via lightweight prototypes.
Compared to an FSSL approach based on weight sharing, prototype-based inter-client knowledge sharing significantly reduces both communication and computation costs.
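A common realization of such lightweight prototypes, sketched here under the assumption that a prototype is the per-class mean embedding of a client's labeled samples, shows why they are cheap to communicate: each client uploads only a C x D matrix instead of full model weights.

```python
# Sketch: per-class prototype computation for one client (assumed form).
import torch

def local_prototypes(encoder, xs, ys, num_classes):
    """Return a (C, D) tensor of class-mean embeddings."""
    with torch.no_grad():
        z = encoder(xs)                            # (N, D) embeddings
    protos = torch.zeros(num_classes, z.size(1))
    for c in range(num_classes):
        mask = ys == c
        if mask.any():
            protos[c] = z[mask].mean(dim=0)        # class centroid
    return protos
```

An unlabeled client can then pseudo-label a sample by its nearest aggregated prototype, with no gradient exchange involved.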
arXiv Detail & Related papers (2022-05-27T11:55:29Z)
- Class-Aware Contrastive Semi-Supervised Learning [51.205844705156046]
We propose a general method named Class-aware Contrastive Semi-Supervised Learning (CCSSL) to improve pseudo-label quality and enhance the model's robustness in the real-world setting.
Our proposed CCSSL achieves significant performance improvements over state-of-the-art SSL methods on the standard datasets CIFAR100 and STL10.
arXiv Detail & Related papers (2022-03-04T12:18:23Z)
- Towards Fair Federated Learning with Zero-Shot Data Augmentation [123.37082242750866]
Federated learning has emerged as an important distributed learning paradigm, where a server aggregates a global model from many client-trained models while having no access to the client data.
We propose a novel federated learning system that employs zero-shot data augmentation on under-represented data to mitigate statistical heterogeneity and encourage more uniform accuracy performance across clients in federated networks.
We study two variants of this scheme, Fed-ZDAC (federated learning with zero-shot data augmentation at the clients) and Fed-ZDAS (federated learning with zero-shot data augmentation at the server).
arXiv Detail & Related papers (2021-04-27T18:23:54Z)
- Federated Unsupervised Representation Learning [56.715917111878106]
We formulate a new problem in federated learning called Federated Unsupervised Representation Learning (FURL) to learn a common representation model without supervision.
FedCA is composed of two key modules: a dictionary module, which aggregates sample representations from each client and shares them with all clients to keep the representation space consistent, and an alignment module, which aligns each client's representations with a base model trained on public data.
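Based only on that description, the two modules might be sketched as follows; the buffer size, losses, and update rules are assumptions rather than FedCA's published design.

```python
# Sketch: dictionary and alignment modules as described above (assumed form).
import torch
import torch.nn.functional as F

class RepresentationDictionary:
    """FIFO bank of normalized sample representations shared across clients."""
    def __init__(self, dim, size=4096):
        self.bank = F.normalize(torch.randn(size, dim), dim=1)
        self.ptr = 0

    def update(self, z):
        z = F.normalize(z.detach(), dim=1)
        idx = torch.arange(self.ptr, self.ptr + z.size(0)) % self.bank.size(0)
        self.bank[idx] = z
        self.ptr = int(idx[-1]) + 1            # next write position

def contrastive_with_dictionary(q, k, bank, tau=0.07):
    """InfoNCE: two views form the positive pair; bank entries are negatives."""
    q, k = F.normalize(q, dim=1), F.normalize(k, dim=1)
    pos = (q * k).sum(dim=1, keepdim=True)          # (B, 1)
    neg = q @ bank.t()                              # (B, K)
    logits = torch.cat([pos, neg], dim=1) / tau
    targets = torch.zeros(q.size(0), dtype=torch.long, device=q.device)
    return F.cross_entropy(logits, targets)

def alignment_loss(client_z, base_z):
    """Pull client representations toward a public-data base model's."""
    return F.mse_loss(F.normalize(client_z, dim=1),
                      F.normalize(base_z, dim=1).detach())
```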
arXiv Detail & Related papers (2020-10-18T13:28:30Z)
- Federated Semi-Supervised Learning with Inter-Client Consistency & Disjoint Learning [78.88007892742438]
We study two essential scenarios of Federated Semi-Supervised Learning (FSSL) based on the location of the labeled data.
We propose a novel method, Federated Matching (FedMatch), to tackle these problems.
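One of FedMatch's ingredients, inter-client consistency, regularizes a client's predictions on unlabeled data toward those of a few helper clients' models received from the server; the sketch below assumes a simple KL formulation with frozen helpers, which simplifies the published method.

```python
# Sketch: inter-client consistency on unlabeled data (simplified form).
import torch
import torch.nn.functional as F

def inter_client_consistency(local_model, helper_models, x_unlabeled):
    log_p_local = F.log_softmax(local_model(x_unlabeled), dim=1)
    loss = 0.0
    for helper in helper_models:
        with torch.no_grad():                  # helpers are fixed references
            p_help = F.softmax(helper(x_unlabeled), dim=1)
        loss = loss + F.kl_div(log_p_local, p_help, reduction="batchmean")
    return loss / max(len(helper_models), 1)
```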
arXiv Detail & Related papers (2020-06-22T09:43:41Z)