Knowledge-Enhanced Semi-Supervised Federated Learning for Aggregating
Heterogeneous Lightweight Clients in IoT
- URL: http://arxiv.org/abs/2303.02668v1
- Date: Sun, 5 Mar 2023 13:19:10 GMT
- Title: Knowledge-Enhanced Semi-Supervised Federated Learning for Aggregating
Heterogeneous Lightweight Clients in IoT
- Authors: Jiaqi Wang, Shenglai Zeng, Zewei Long, Yaqing Wang, Houping Xiao,
Fenglong Ma
- Abstract summary: Federated learning (FL) enables multiple clients to train models collaboratively without sharing local data.
We propose pFedKnow, which generates lightweight personalized client models via neural network pruning techniques to reduce communication cost.
Experiment results on both image and text datasets show that the proposed pFedKnow outperforms state-of-the-art baselines.
- Score: 34.128674870180596
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) enables multiple clients to train models
collaboratively without sharing local data, which has achieved promising
results in different areas, including the Internet of Things (IoT). However,
end IoT devices cannot automatically annotate the data they collect, which
leads to a label shortage on the client side. To collaboratively train an FL
model, we can only use the small amount of labeled data stored on the server.
This is a new yet practical scenario in federated
learning, i.e., labels-at-server semi-supervised federated learning (SemiFL).
Although several SemiFL approaches have been proposed recently, none of them
addresses the personalization issue in its model design. IoT environments
make SemiFL more challenging, as device computational constraints and
communication cost must be considered simultaneously. To tackle
these new challenges together, we propose a novel SemiFL framework named
pFedKnow. pFedKnow generates lightweight personalized client models via neural
network pruning techniques to reduce communication cost. Moreover, it
incorporates pretrained large models as prior knowledge to guide the
aggregation of personalized client models and further enhance the framework
performance. Experiment results on both image and text datasets show that the
proposed pFedKnow outperforms state-of-the-art baselines while considerably
reducing communication cost. The source code of the proposed pFedKnow is
available at https://github.com/JackqqWang/pfedknow/tree/master.
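The abstract does not spell out the pruning step, so the following is a minimal sketch of magnitude-based pruning, the generic mechanism by which pruning reduces communication, not pFedKnow's exact procedure; the function name and the toy two-layer model are illustrative.

    import numpy as np

    def magnitude_prune(weights: dict, sparsity: float = 0.8) -> dict:
        """Zero out the smallest-magnitude entries of each weight tensor.
        Pruned entries need not be transmitted, which is where the
        communication saving comes from."""
        pruned = {}
        for name, w in weights.items():
            k = int(sparsity * w.size)
            if k == 0:
                pruned[name] = w.copy()
                continue
            threshold = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
            pruned[name] = w * (np.abs(w) > threshold)
        return pruned

    # Toy usage: a client model with two layers.
    rng = np.random.default_rng(0)
    client_model = {"layer1": rng.normal(size=(64, 32)),
                    "layer2": rng.normal(size=(32, 10))}
    sparse_model = magnitude_prune(client_model, sparsity=0.9)
    kept = sum(int((v != 0).sum()) for v in sparse_model.values())
    total = sum(v.size for v in sparse_model.values())
    print(f"non-zero fraction: {kept / total:.2f}")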
Related papers
- Communication Efficient ConFederated Learning: An Event-Triggered SAGA
Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that targets model training without gathering the local data over various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as ConFederated Learning (CFL), in order to accommodate a larger number of users.
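As a rough illustration of event-triggered communication (not the paper's SAGA-specific trigger), a client might upload only when its local update has drifted sufficiently from the last transmitted one; should_upload and its threshold rule below are assumptions for illustration.

    import numpy as np

    def should_upload(new_update, last_sent, threshold=0.01):
        """Event trigger: communicate only when the local update has
        drifted sufficiently from the last transmitted one."""
        drift = np.linalg.norm(new_update - last_sent)
        return drift > threshold * np.linalg.norm(last_sent)

    w_prev = np.zeros(3)
    w_new = np.array([0.1, -0.2, 0.0])
    if should_upload(w_new, w_prev):
        w_prev = w_new.copy()  # transmit, and remember what was sent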
arXiv Detail & Related papers (2024-02-28T03:27:10Z) - HierSFL: Local Differential Privacy-aided Split Federated Learning in
Mobile Edge Computing [7.180235086275924]
Federated Learning is a promising approach for learning from user data while preserving data privacy.
Split Federated Learning is utilized, where clients upload their intermediate model training outcomes to a cloud server for collaborative server-client model training.
This methodology facilitates resource-constrained clients' participation in model training but also increases the training time and communication overhead.
We propose a novel algorithm, called Hierarchical Split Federated Learning (HierSFL), that amalgamates models at the edge and cloud phases.
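A hedged sketch of the two-level idea: client updates are averaged at edge servers first, and the cloud then averages the edge models. The fedavg helper and toy updates are illustrative; HierSFL's actual aggregation and its local-DP noise are not reproduced here.

    import numpy as np

    def fedavg(updates, weights=None):
        """Weighted average of parameter vectors."""
        updates = np.stack(updates)
        if weights is None:
            weights = np.ones(len(updates))
        weights = np.asarray(weights, dtype=float)
        return (weights[:, None] * updates).sum(0) / weights.sum()

    # Two levels: clients -> edge servers -> cloud.
    clients_per_edge = {
        "edge_0": [np.array([1.0, 2.0]), np.array([3.0, 4.0])],
        "edge_1": [np.array([5.0, 6.0])],
    }
    edge_models = {e: fedavg(u) for e, u in clients_per_edge.items()}
    # The cloud averages edge models, weighted by their client counts.
    cloud_model = fedavg(list(edge_models.values()),
                         weights=[len(u) for u in clients_per_edge.values()])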
arXiv Detail & Related papers (2024-01-16T09:34:10Z) - Tunable Soft Prompts are Messengers in Federated Learning [55.924749085481544]
Federated learning (FL) enables multiple participants to collaboratively train machine learning models using decentralized data sources.
The lack of model privacy protection in FL has become a non-negligible challenge.
We propose a novel FL training approach that accomplishes information exchange among participants via tunable soft prompts.
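A minimal sketch of the soft-prompt idea, assuming a standard prompt-tuning setup: each participant keeps its model frozen and only a small prompt matrix travels between parties. The shapes and names below are illustrative, not the paper's implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    d_model, prompt_len, seq_len = 16, 4, 10

    # The model weights stay local; only this small matrix is exchanged.
    soft_prompt = rng.normal(scale=0.02, size=(prompt_len, d_model))

    def prepend_prompt(token_embeddings, prompt):
        """Concatenate tunable prompt vectors in front of the input."""
        return np.concatenate([prompt, token_embeddings], axis=0)

    tokens = rng.normal(size=(seq_len, d_model))
    model_input = prepend_prompt(tokens, soft_prompt)
    print(model_input.shape, "prompt payload:", soft_prompt.nbytes, "bytes")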
arXiv Detail & Related papers (2023-11-12T11:01:10Z) - SemiSFL: Split Federated Learning on Unlabeled and Non-IID Data [34.49090830845118]
Federated Learning (FL) has emerged to allow multiple clients to collaboratively train machine learning models on their private data at the network edge.
We propose a novel Semi-supervised SFL system, termed SemiSFL, which incorporates clustering regularization to perform SFL with unlabeled and non-IID client data.
Our system provides a 3.8x speed-up in training time, reduces the communication cost by about 70.3% while reaching the target accuracy, and achieves up to 5.8% improvement in accuracy under non-IID scenarios.
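A generic form of a clustering regularizer on unlabeled features, shown as a hedged sketch; the exact regularizer used by SemiSFL may differ.

    import numpy as np

    def clustering_regularizer(features, centroids):
        """Penalize each unlabeled feature's squared distance to its
        nearest cluster centroid (one generic formulation)."""
        d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=-1)
        return (d.min(axis=1) ** 2).mean()

    feats = np.random.default_rng(0).normal(size=(32, 8))
    cents = np.random.default_rng(1).normal(size=(4, 8))
    print(clustering_regularizer(feats, cents))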
arXiv Detail & Related papers (2023-07-29T02:35:37Z) - Federated Learning of Shareable Bases for Personalization-Friendly Image
Classification [54.72892987840267]
FedBasis learns a small set of shareable "basis" models, which can be linearly combined to form personalized models for clients.
Specifically for a new client, only a small set of combination coefficients, not the model weights, needs to be learned.
To demonstrate the effectiveness and applicability of FedBasis, we also present a more practical PFL testbed for image classification.
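The combination rule itself is simple enough to sketch: assuming each basis is a flattened parameter vector, a personalized model is just a weighted sum, so a new client only learns a few coefficients. The shapes below are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    n_bases, n_params = 3, 100

    # Shared basis models (flattened parameter vectors).
    bases = rng.normal(size=(n_bases, n_params))

    def personalize(coeffs, bases):
        """A client model is a linear combination of shared bases; only
        `coeffs` (a handful of scalars) is client-specific."""
        return np.asarray(coeffs) @ bases

    client_coeffs = np.array([0.6, 0.3, 0.1])
    client_model = personalize(client_coeffs, bases)
    print(client_model.shape)  # (100,) — same size as one basis model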
arXiv Detail & Related papers (2023-04-16T20:19:18Z) - Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and backpropagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
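A hedged sketch of an InfoNCE-style contrastive loss over shared representations, assuming paired anchor/positive representations; the paper's exact loss and pairing scheme may differ.

    import numpy as np

    def contrastive_loss(anchors, positives, temperature=0.1):
        """InfoNCE-style loss: each anchor representation should match
        its positive and repel the other samples in the batch."""
        a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
        p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
        logits = a @ p.T / temperature               # (n, n) similarities
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
        return -np.diag(log_probs).mean()

    rng = np.random.default_rng(0)
    a = rng.normal(size=(8, 16))
    p = a + 0.01 * rng.normal(size=(8, 16))  # near-duplicates as positives
    print(contrastive_loss(a, p))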
arXiv Detail & Related papers (2022-11-20T10:49:22Z) - Federated Select: A Primitive for Communication- and Memory-Efficient
Federated Learning [4.873569522869751]
Federated learning (FL) is a framework for machine learning across heterogeneous client devices.
We propose a more general procedure in which clients "select" what values are sent to them.
This allows clients to operate on smaller, data-dependent slices.
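A toy sketch of the select primitive, assuming the values live in a server-side table such as an embedding matrix; federated_select below is an illustrative name, not the library's API.

    import numpy as np

    # Server-side table, e.g. an embedding matrix too large to ship whole.
    server_table = np.arange(50.0).reshape(10, 5)

    def federated_select(table, requested_ids):
        """Send a client only the rows it selects, not the full table."""
        return {i: table[i] for i in requested_ids}

    # A client that needs only rows 2 and 7 receives a small slice.
    client_slice = federated_select(server_table, [2, 7])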
arXiv Detail & Related papers (2022-08-19T16:26:03Z) - No One Left Behind: Inclusive Federated Learning over Heterogeneous
Devices [79.16481453598266]
We propose InclusiveFL, a client-inclusive federated learning method to handle device heterogeneity.
The core idea of InclusiveFL is to assign models of different sizes to clients with different computing capabilities.
We also propose an effective method to share the knowledge among multiple local models with different sizes.
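One simple way to realize size-heterogeneous aggregation (a sketch, not necessarily InclusiveFL's exact rule) is to average each layer only over the clients that actually have it:

    import numpy as np

    def inclusive_aggregate(client_models):
        """Average each layer over the clients that have it, so small
        models with fewer layers still contribute to the shared ones."""
        all_layers = set().union(*(m.keys() for m in client_models))
        return {layer: np.mean([m[layer] for m in client_models if layer in m],
                               axis=0)
                for layer in all_layers}

    big = {"l1": np.ones(4), "l2": np.ones(4), "l3": np.ones(4)}
    small = {"l1": np.zeros(4)}  # a weak device trains fewer layers
    global_model = inclusive_aggregate([big, small])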
arXiv Detail & Related papers (2022-02-16T13:03:27Z) - Inference-Time Personalized Federated Learning [17.60724466773559]
Inference-Time PFL (IT-PFL) is a setting in which a model trained on a set of clients must later be evaluated on novel unlabeled clients at inference time.
We propose a novel approach to this problem, IT-PFL-HN, based on a hypernetwork module and an encoder module.
We find that IT-PFL-HN generalizes better than current FL and PFL methods, especially when the novel client has a large domain shift.
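A deliberately tiny stand-in for the hypernetwork idea: a client embedding (which the paper's encoder would produce from the novel client's unlabeled data) is mapped to personalized model weights. The linear map H is an illustrative simplification of a real hypernetwork.

    import numpy as np

    rng = np.random.default_rng(0)
    embed_dim, n_target_params = 8, 50

    # Hypernetwork: maps a client embedding to that client's weights.
    H = rng.normal(scale=0.1, size=(embed_dim, n_target_params))

    def hypernet(client_embedding):
        return client_embedding @ H

    # At inference time the encoder would supply this; here it is random.
    novel_client_embedding = rng.normal(size=embed_dim)
    personalized_weights = hypernet(novel_client_embedding)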
arXiv Detail & Related papers (2021-11-16T10:57:20Z) - SemiFL: Communication Efficient Semi-Supervised Federated Learning with
Unlabeled Clients [34.24028216079336]
We propose a new Federated Learning framework referred to as SemiFL.
In SemiFL, clients have completely unlabeled data, while the server has a small amount of labeled data.
We demonstrate various efficient strategies of SemiFL that enhance learning performance.
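SemiFL's concrete strategies are not given here; the sketch below shows the standard confidence-thresholded pseudo-labeling step that such unlabeled-client settings typically rely on, with illustrative names.

    import numpy as np

    def pseudo_label(probs, threshold=0.95):
        """Keep only the unlabeled samples the global model predicts
        with high confidence, and use those predictions as labels."""
        keep = probs.max(axis=1) >= threshold
        return np.flatnonzero(keep), probs.argmax(axis=1)[keep]

    probs = np.array([[0.97, 0.03], [0.60, 0.40], [0.02, 0.98]])
    idx, labels = pseudo_label(probs)
    print(idx, labels)  # [0 2] [0 1]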
arXiv Detail & Related papers (2021-06-02T19:22:26Z) - LotteryFL: Personalized and Communication-Efficient Federated Learning
with Lottery Ticket Hypothesis on Non-IID Datasets [52.60094373289771]
Federated learning is a popular distributed machine learning paradigm with enhanced privacy.
We propose LotteryFL -- a personalized and communication-efficient federated learning framework.
We show that LotteryFL significantly outperforms existing solutions in terms of personalization and communication cost.
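A minimal sketch of one lottery-ticket pruning round under standard assumptions (magnitude criterion, rewind to initialization); LotteryFL's client-side specifics are not reproduced. Only the mask and the surviving parameters would travel between client and server.

    import numpy as np

    def lottery_mask(initial_w, trained_w, keep_ratio=0.2):
        """Keep the largest-magnitude trained weights and rewind the
        survivors to their initial values (one pruning round)."""
        k = int(keep_ratio * trained_w.size)
        thresh = np.partition(np.abs(trained_w).ravel(), -k)[-k]
        mask = np.abs(trained_w) >= thresh
        return mask, initial_w * mask  # personalized subnetwork to retrain

    rng = np.random.default_rng(0)
    w0 = rng.normal(size=(8, 8))
    wt = rng.normal(size=(8, 8))  # stand-in for locally trained weights
    mask, ticket = lottery_mask(w0, wt)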
arXiv Detail & Related papers (2020-08-07T20:45:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.