Addressing Heterogeneity in Federated Load Forecasting with Personalization Layers
- URL: http://arxiv.org/abs/2404.01517v1
- Date: Mon, 1 Apr 2024 22:53:09 GMT
- Title: Addressing Heterogeneity in Federated Load Forecasting with Personalization Layers
- Authors: Shourya Bose, Yu Zhang, Kibaek Kim
- Abstract summary: We propose the use of personalization layers for load forecasting in a general framework called PL-FL.
We show that PL-FL outperforms FL and purely local training, while requiring lower communication bandwidth than FL.
- Score: 3.933147844455233
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The advent of smart meters has enabled pervasive collection of energy consumption data for training short-term load forecasting models. In response to privacy concerns, federated learning (FL) has been proposed as a privacy-preserving approach for training, but the quality of trained models degrades as client data becomes heterogeneous. In this paper we propose the use of personalization layers for load forecasting in a general framework called PL-FL. We show that PL-FL outperforms FL and purely local training, while requiring lower communication bandwidth than FL. This is done through extensive simulations on three different datasets from the NREL ComStock repository.
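The core idea behind personalization layers can be sketched in a few lines: each client's model is split into shared parameters, which the server averages across clients, and personal parameters, which never leave the client. The sketch below is a hedged illustration of that general idea, not the authors' PL-FL implementation; the parameter names and values are hypothetical.

```python
def aggregate_shared(client_models, shared_keys):
    """FedAvg-style mean over the shared parameters only."""
    n = len(client_models)
    return {
        key: sum(model[key] for model in client_models) / n
        for key in shared_keys
    }

def broadcast(client_models, shared_avg):
    """Clients overwrite their shared layers; personal layers stay local."""
    for model in client_models:
        model.update(shared_avg)

# Two clients; "encoder_w" is a shared layer, "head_w" a personalization layer.
clients = [
    {"encoder_w": 1.0, "head_w": 0.2},
    {"encoder_w": 3.0, "head_w": 0.8},
]
avg = aggregate_shared(clients, shared_keys=["encoder_w"])
broadcast(clients, avg)
print(clients[0])  # {'encoder_w': 2.0, 'head_w': 0.2}
print(clients[1])  # {'encoder_w': 2.0, 'head_w': 0.8}
```

Because only the shared parameters are communicated each round, such a scheme also uses less bandwidth than full-model FL, consistent with the abstract's claim.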
Related papers
- Privacy-Preserving Load Forecasting via Personalized Model Obfuscation [4.420464017266168]
This paper addresses the performance challenges of short-term load forecasting models trained with federated learning on heterogeneous data.
Our proposed algorithm, Privacy Preserving Federated Learning (PPFL), incorporates personalization layers for localized training at each smart meter.
arXiv Detail & Related papers (2023-11-21T03:03:10Z)
- Tunable Soft Prompts are Messengers in Federated Learning [55.924749085481544]
Federated learning (FL) enables multiple participants to collaboratively train machine learning models using decentralized data sources.
The lack of model privacy protection in FL has become a challenge that cannot be neglected.
We propose a novel FL training approach that accomplishes information exchange among participants via tunable soft prompts.
arXiv Detail & Related papers (2023-11-12T11:01:10Z) - Secure short-term load forecasting for smart grids with
transformer-based federated learning [0.0]
Electricity load forecasting is an essential task within smart grids to assist demand and supply balance.
Fine-grained load profiles can expose users' electricity consumption behaviors, which raises privacy and security concerns.
This paper presents a novel transformer-based deep learning approach with federated learning for short-term electricity load prediction.
arXiv Detail & Related papers (2023-10-26T15:27:55Z) - PRIOR: Personalized Prior for Reactivating the Information Overlooked in
Federated Learning [16.344719695572586]
We propose a novel scheme to inject personalized prior knowledge into a global model in each client.
At the heart of our proposed approach is a framework, PFL with Bregman Divergence (pFedBreD).
Our method reaches the state-of-the-art performances on 5 datasets and outperforms other methods by up to 3.5% across 8 benchmarks.
arXiv Detail & Related papers (2023-10-13T15:21:25Z) - Semi-Federated Learning: Convergence Analysis and Optimization of A
Hybrid Learning Framework [70.83511997272457]
We propose a semi-federated learning (SemiFL) paradigm to leverage both the base station (BS) and devices for a hybrid implementation of centralized learning (CL) and FL.
We propose a two-stage algorithm to solve this intractable problem, in which we provide the closed-form solutions to the beamformers.
arXiv Detail & Related papers (2023-10-04T03:32:39Z) - Federated Short-Term Load Forecasting with Personalization Layers for
Heterogeneous Clients [0.7252027234425334]
We propose a personalized FL algorithm (PL-FL) enabling FL to handle personalization layers.
PL-FL is implemented by using the Argonne Privacy-Preserving Federated Learning package.
We test the forecast performance of models trained on the NREL ComStock dataset.
arXiv Detail & Related papers (2023-09-22T21:57:52Z) - PFL-GAN: When Client Heterogeneity Meets Generative Models in
Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation scheme.
Empirical results from rigorous experiments on several well-known datasets demonstrate the effectiveness of PFL-GAN.
arXiv Detail & Related papers (2023-08-23T22:38:35Z) - Personalized Federated Learning under Mixture of Distributions [98.25444470990107]
We propose a novel approach to Personalized Federated Learning (PFL), which utilizes Gaussian mixture models (GMM) to fit the input data distributions across diverse clients.
FedGMM possesses an additional advantage of adapting to new clients with minimal overhead, and it also enables uncertainty quantification.
Empirical evaluations on synthetic and benchmark datasets demonstrate the superior performance of our method in both PFL classification and novel sample detection.
arXiv Detail & Related papers (2023-05-01T20:04:46Z) - FedDM: Iterative Distribution Matching for Communication-Efficient
Federated Learning [87.08902493524556]
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape from original data.
arXiv Detail & Related papers (2022-07-20T04:55:18Z) - Acceleration of Federated Learning with Alleviated Forgetting in Local
Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z) - Federated Learning for Short-term Residential Energy Demand Forecasting [4.769747792846004]
Energy demand forecasting is an essential task performed within the energy industry to help balance supply with demand and maintain a stable load on the electricity grid.
As supply transitions towards less reliable renewable energy generation, smart meters will prove a vital component to aid these forecasting tasks.
However, smart meter take-up is low among privacy-conscious consumers who fear intrusion upon their fine-grained consumption data.
arXiv Detail & Related papers (2021-05-27T17:33:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.