Federated Short-Term Load Forecasting with Personalization Layers for
Heterogeneous Clients
- URL: http://arxiv.org/abs/2309.13194v1
- Date: Fri, 22 Sep 2023 21:57:52 GMT
- Authors: Shourya Bose and Kibaek Kim
- Abstract summary: We propose a personalized FL algorithm (PL-FL) enabling FL to handle personalization layers.
PL-FL is implemented by using the Argonne Privacy-Preserving Federated Learning package.
We test the forecast performance of models trained on the NREL ComStock dataset.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The advent of smart meters has enabled pervasive collection of energy
consumption data for training short-term load forecasting (STLF) models. In
response to privacy concerns, federated learning (FL) has been proposed as a
privacy-preserving approach for training, but the quality of trained models
degrades as client data becomes heterogeneous. In this paper we alleviate this
drawback using personalization layers, wherein certain layers of an STLF model
in an FL framework are trained exclusively on the clients' own data. To that
end, we propose a personalized FL algorithm (PL-FL) enabling FL to handle
personalization layers. The PL-FL algorithm is implemented using the Argonne
Privacy-Preserving Federated Learning package. We test the forecast performance
of models trained on the NREL ComStock dataset, which contains heterogeneous
energy consumption data of multiple commercial buildings. Superior performance
of models trained with PL-FL demonstrates that personalization layers enable
classical FL algorithms to handle clients with heterogeneous data.
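The core idea above, training certain layers only on each client's own data while federating the rest, can be shown with a minimal sketch. This is a toy illustration, not the authors' PL-FL implementation: the layer names and the dict-of-lists parameter representation are invented for brevity, and the paper's full pipeline is built on the Argonne Privacy-Preserving Federated Learning package.

```python
# Toy model: parameters are a dict mapping layer name -> list of floats.
# Shared layers are averaged by the server; personalization layers stay local.

SHARED = {"encoder"}  # hypothetical layer aggregated across clients
PERSONAL = {"head"}   # hypothetical layer trained only on a client's own data

def fedavg_shared(client_models):
    """Average only the shared layers; each client keeps its personal layers."""
    n = len(client_models)
    avg = {}
    for name in SHARED:
        dim = len(client_models[0][name])
        avg[name] = [sum(m[name][i] for m in client_models) / n
                     for i in range(dim)]
    # New model per client: averaged shared layers + that client's own head.
    return [{**m, **avg} for m in client_models]

clients = [
    {"encoder": [1.0, 2.0], "head": [0.1]},
    {"encoder": [3.0, 4.0], "head": [0.9]},
]
updated = fedavg_shared(clients)
print(updated[0]["encoder"])               # [2.0, 3.0] -- averaged
print(updated[0]["head"], updated[1]["head"])  # [0.1] [0.9] -- untouched
```

In a real round each client would also run local gradient steps between aggregations; the sketch only shows which parameters the server touches, which is the part that personalization layers change relative to classical FedAvg.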
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- Addressing Heterogeneity in Federated Load Forecasting with Personalization Layers [3.933147844455233]
We propose the use of personalization layers for load forecasting in a general framework called PL-FL.
We show that PL-FL outperforms FL and purely local training, while requiring lower communication bandwidth than FL.
arXiv Detail & Related papers (2024-04-01T22:53:09Z)
- Personalized Federated Learning of Probabilistic Models: A PAC-Bayesian Approach [42.59649764999974]
Federated learning aims to infer a shared model from private and decentralized data stored locally by multiple clients.
We propose a PFL algorithm named PAC-PFL for learning probabilistic models within a PAC-Bayesian framework.
Our algorithm collaboratively learns a shared hyper-posterior and treats each client's posterior inference as the personalization step.
arXiv Detail & Related papers (2024-01-16T13:30:37Z)
- Contrastive encoder pre-training-based clustered federated learning for heterogeneous data [17.580390632874046]
Federated learning (FL) enables distributed clients to collaboratively train a global model while preserving their data privacy.
We propose contrastive pre-training-based clustered federated learning (CP-CFL) to improve the model convergence and overall performance of FL systems.
arXiv Detail & Related papers (2023-11-28T05:44:26Z)
- Privacy-Preserving Load Forecasting via Personalized Model Obfuscation [4.420464017266168]
This paper addresses the performance challenges of short-term load forecasting models trained with federated learning on heterogeneous data.
Our proposed algorithm, Privacy-Preserving Federated Learning (PPFL), incorporates personalization layers for localized training at each smart meter.
arXiv Detail & Related papers (2023-11-21T03:03:10Z)
- PRIOR: Personalized Prior for Reactivating the Information Overlooked in Federated Learning [16.344719695572586]
We propose a novel scheme to inject personalized prior knowledge into a global model in each client.
At the heart of our approach is a framework, PFL with Bregman Divergence (pFedBreD).
Our method achieves state-of-the-art performance on 5 datasets and outperforms other methods by up to 3.5% across 8 benchmarks.
arXiv Detail & Related papers (2023-10-13T15:21:25Z)
- ZooPFL: Exploring Black-box Foundation Models for Personalized Federated Learning [95.64041188351393]
This paper endeavors to solve both the challenges of limited resources and personalization.
We propose a method named ZOOPFL that uses Zeroth-Order Optimization for Personalized Federated Learning.
To reduce the computation costs and enhance personalization, we propose input surgery to incorporate an auto-encoder with low-dimensional and client-specific embeddings.
arXiv Detail & Related papers (2023-10-08T12:26:13Z)
- PFL-GAN: When Client Heterogeneity Meets Generative Models in Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation.
Rigorous experiments on several well-known datasets demonstrate the effectiveness of PFL-GAN.
arXiv Detail & Related papers (2023-08-23T22:38:35Z)
- Personalized Federated Learning under Mixture of Distributions [98.25444470990107]
We propose a novel approach to Personalized Federated Learning (PFL), which utilizes Gaussian mixture models (GMM) to fit the input data distributions across diverse clients.
FedGMM possesses an additional advantage of adapting to new clients with minimal overhead, and it also enables uncertainty quantification.
Empirical evaluations on synthetic and benchmark datasets demonstrate the superior performance of our method in both PFL classification and novel sample detection.
arXiv Detail & Related papers (2023-05-01T20:04:46Z)
- Federated Learning of Shareable Bases for Personalization-Friendly Image Classification [54.72892987840267]
FedBasis learns a small set of shareable "basis" models, which can be linearly combined to form personalized models for clients.
Specifically for a new client, only a small set of combination coefficients, not the model weights, needs to be learned.
To demonstrate the effectiveness and applicability of FedBasis, we also present a more practical PFL testbed for image classification.
arXiv Detail & Related papers (2023-04-16T20:19:18Z)
- Visual Prompt Based Personalized Federated Learning [83.04104655903846]
We propose a novel PFL framework for image classification tasks, dubbed pFedPT, that leverages personalized visual prompts to implicitly represent local data distribution information of clients.
Experiments on the CIFAR10 and CIFAR100 datasets show that pFedPT outperforms several state-of-the-art (SOTA) PFL algorithms by a large margin in various settings.
arXiv Detail & Related papers (2023-03-15T15:02:15Z)
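The prompt idea in the pFedPT entry can be reduced to a toy sketch: a shared global model is identical for every client, and personalization comes only from a small client-specific prompt applied to the input. This is purely illustrative, not the authors' method: pFedPT learns visual prompts on images, while here the prompt is a vector offset on a linear model, and all names and numbers are invented.

```python
# Toy sketch: shared weights + per-client input prompt (illustrative only).

def predict(shared_weights, prompt, x):
    """Apply the shared linear model to the prompt-augmented input."""
    x_prompted = [xi + pi for xi, pi in zip(x, prompt)]  # add client prompt
    return sum(w * v for w, v in zip(shared_weights, x_prompted))

shared = [0.5, 0.25]    # global model weights, identical for all clients
prompt_a = [0.0, 0.0]   # client A's personalized prompt
prompt_b = [1.0, 1.0]   # client B's personalized prompt
x = [2.0, 4.0]
print(predict(shared, prompt_a, x))  # 2.0
print(predict(shared, prompt_b, x))  # 2.75
```

The same input yields different predictions per client even though the model weights are fully shared, which is what makes prompts a lightweight personalization mechanism: only the small prompt, not the model, is client-specific.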
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.