Federated Learning with Hyperparameter-based Clustering for Electrical
Load Forecasting
- URL: http://arxiv.org/abs/2111.07462v1
- Date: Sun, 14 Nov 2021 22:29:01 GMT
- Authors: Nastaran Gholizadeh, Petr Musilek
- Abstract summary: This paper evaluates the performance of federated learning for short-term forecasting of individual house loads as well as the aggregate load.
It discusses the advantages and disadvantages of this method by comparing it to centralized and local learning schemes.
The results show that federated learning has a good performance with a minimum root mean squared error (RMSE) of 0.117 kWh for individual load forecasting.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Electrical load prediction has become an integral part of power system
operation. Deep learning models have found popularity for this purpose.
However, to achieve a desired prediction accuracy, they require huge amounts of
data for training. Sharing electricity consumption data of individual
households for load prediction may compromise user privacy and can be expensive
in terms of communication resources. Therefore, edge computing methods, such as
federated learning, are gaining more importance for this purpose. These methods
can take advantage of the data without centrally storing it. This paper
evaluates the performance of federated learning for short-term forecasting of
individual house loads as well as the aggregate load. It discusses the
advantages and disadvantages of this method by comparing it to centralized and
local learning schemes. Moreover, a new client clustering method is proposed to
reduce the convergence time of federated learning. The results show that
federated learning has a good performance with a minimum root mean squared
error (RMSE) of 0.117 kWh for individual load forecasting.
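The federated scheme evaluated in the abstract can be sketched in a few lines. This is a minimal illustrative sketch only: it assumes a linear model, plain FedAvg-style weight averaging, and clients already grouped into one cluster; the paper's actual deep models and its hyperparameter-based clustering criterion are not reproduced here.

```python
import numpy as np

def local_update(weights, X, y, lr=0.05, epochs=5):
    """One client's local training: a few gradient steps on its own data.
    A linear model stands in for the deep models used in the paper."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(weights, clients):
    """Server side: average locally updated weights across a cluster's clients,
    without ever collecting the clients' raw consumption data."""
    updates = [local_update(weights, X, y) for X, y in clients]
    return np.mean(updates, axis=0)

def rmse(y_true, y_pred):
    """Root mean squared error, the metric reported in the paper."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.2])

# Two clients in one cluster, each holding its own (synthetic) load data
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.05, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(20):          # federated communication rounds
    w = federated_round(w, clients)

X_test = rng.normal(size=(20, 2))
print(rmse(X_test @ true_w, X_test @ w))  # expected to be small after training
```

The key point the sketch illustrates is that only model parameters travel between clients and server; clustering clients before averaging (as the paper proposes) simply means running this loop separately per cluster so that dissimilar households do not slow each other's convergence.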
Related papers
- SpaFL: Communication-Efficient Federated Learning with Sparse Models and Low computational Overhead [75.87007729801304]
SpaFL, a communication-efficient FL framework, is proposed to optimize sparse model structures with low computational overhead.
Experiments show that SpaFL improves accuracy while requiring much less communication and computing resources compared to sparse baselines.
arXiv Detail & Related papers (2024-06-01T13:10:35Z)
- Secure short-term load forecasting for smart grids with transformer-based federated learning [0.0]
Electricity load forecasting is an essential task within smart grids to assist demand and supply balance.
Fine-grained load profiles can expose users' electricity consumption behaviors, which raises privacy and security concerns.
This paper presents a novel transformer-based deep learning approach with federated learning for short-term electricity load prediction.
arXiv Detail & Related papers (2023-10-26T15:27:55Z)
- Adaptive Model Pruning and Personalization for Federated Learning over Wireless Networks [72.59891661768177]
Federated learning (FL) enables distributed learning across edge devices while protecting data privacy.
We consider a FL framework with partial model pruning and personalization to overcome these challenges.
This framework splits the learning model into a global part with model pruning shared with all devices to learn data representations and a personalized part to be fine-tuned for a specific device.
arXiv Detail & Related papers (2023-09-04T21:10:45Z)
- Tackling Computational Heterogeneity in FL: A Few Theoretical Insights [68.8204255655161]
We introduce and analyse a novel aggregation framework that allows for formalizing and tackling computationally heterogeneous data.
The proposed aggregation algorithms are extensively analyzed from both a theoretical and an experimental perspective.
arXiv Detail & Related papers (2023-07-12T16:28:21Z)
- A Meta-Learning Approach to Predicting Performance and Data Requirements [163.4412093478316]
We propose an approach to estimate the number of samples required for a model to reach a target performance.
We find that the power law, the de facto principle used to estimate model performance, leads to large errors when using a small dataset.
We introduce a novel piecewise power law (PPL) that handles the two data regimes differently.
arXiv Detail & Related papers (2023-03-02T21:48:22Z)
- Federated Learning for 5G Base Station Traffic Forecasting [0.0]
We investigate the efficacy of distributed learning applied to raw base station LTE data for time-series forecasting.
Our results show that the learning architectures adapted to the federated setting yield equivalent prediction error to the centralized setting.
In addition, preprocessing techniques on base stations enhance forecasting accuracy, while advanced federated aggregators do not surpass simpler approaches.
arXiv Detail & Related papers (2022-11-28T11:03:29Z)
- Privacy-preserving household load forecasting based on non-intrusive load monitoring: A federated deep learning approach [3.0584272247900577]
We first propose a household load forecasting method based on federated deep learning and non-intrusive load monitoring (NILM).
The integrated power is decomposed into individual device power by non-intrusive load monitoring, and the power of individual appliances is predicted separately using a federated deep learning model.
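The decompose-then-forecast idea in this entry can be sketched briefly. This is a toy illustration only: it assumes NILM has already split each household's aggregate load into per-appliance series (real NILM and the paper's deep models are out of scope), uses a simple AR(1) forecaster per appliance, and averages the clients' fitted coefficients in a FedAvg-like step.

```python
import numpy as np

def fit_local_ar1(series):
    """Client side: fit a one-step AR(1) coefficient a with y_t ≈ a * y_{t-1}
    on one household's series for a single appliance."""
    x, y = series[:-1], series[1:]
    return float(x @ y / (x @ x))

def federated_ar1(client_series):
    """Server side: average the clients' AR(1) coefficients, FedAvg-style,
    to obtain one shared model per appliance."""
    return float(np.mean([fit_local_ar1(s) for s in client_series]))

rng = np.random.default_rng(1)

def make_series(a, n=200):
    """Synthetic per-appliance power series (stand-in for NILM output)."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = a * y[t - 1] + rng.normal(scale=0.1)
    return y

# Two appliances, three households each; one federated model per appliance
appliance_coeffs = {}
for name, a in [("fridge", 0.9), ("heater", 0.6)]:
    series = [make_series(a) for _ in range(3)]
    appliance_coeffs[name] = federated_ar1(series)

# Household forecast = sum of the appliances' one-step forecasts
last = {"fridge": 0.12, "heater": 0.30}  # hypothetical latest readings (kW)
forecast = sum(appliance_coeffs[k] * last[k] for k in last)
print(appliance_coeffs, forecast)
```

The design point is that each appliance gets its own federated model trained across households, and the household-level forecast is recomposed as the sum of the per-appliance forecasts.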
arXiv Detail & Related papers (2022-06-30T11:13:26Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- Federated Stochastic Gradient Descent Begets Self-Induced Momentum [151.4322255230084]
Federated learning (FL) is an emerging machine learning method that can be applied in mobile edge systems.
We show that running stochastic gradient descent (SGD) in such a setting can be viewed as adding a momentum-like term to the global aggregation process.
arXiv Detail & Related papers (2022-02-17T02:01:37Z)
- Appliance Level Short-term Load Forecasting via Recurrent Neural Network [6.351541960369854]
We present a short-term load forecasting (STLF) algorithm for efficiently predicting the power consumption of individual electrical appliances.
The proposed method builds upon a powerful recurrent neural network (RNN) architecture in deep learning.
arXiv Detail & Related papers (2021-11-23T16:56:37Z)
- Federated Learning for Short-term Residential Energy Demand Forecasting [4.769747792846004]
Energy demand forecasting is an essential task performed within the energy industry to help balance supply with demand and maintain a stable load on the electricity grid.
As supply transitions towards less reliable renewable energy generation, smart meters will prove a vital component to aid these forecasting tasks.
However, smart meter take-up is low among privacy-conscious consumers that fear intrusion upon their fine-grained consumption data.
arXiv Detail & Related papers (2021-05-27T17:33:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.