Secure short-term load forecasting for smart grids with
transformer-based federated learning
- URL: http://arxiv.org/abs/2310.17477v1
- Date: Thu, 26 Oct 2023 15:27:55 GMT
- Title: Secure short-term load forecasting for smart grids with
transformer-based federated learning
- Authors: Jonas Sievers, Thomas Blank
- Abstract summary: Electricity load forecasting is an essential task within smart grids to help balance demand and supply.
Fine-grained load profiles can expose users' electricity consumption behaviors, which raises privacy and security concerns.
This paper presents a novel transformer-based deep learning approach with federated learning for short-term electricity load prediction.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Electricity load forecasting is an essential task within smart grids to
help balance demand and supply. While advanced deep learning models require
large amounts of high-resolution data for accurate short-term load predictions,
fine-grained load profiles can expose users' electricity consumption behaviors,
which raises privacy and security concerns. One solution to improve data
privacy is federated learning, where models are trained locally on private
data, and only the trained model parameters are merged and updated on a global
server. Therefore, this paper presents a novel transformer-based deep learning
approach with federated learning for short-term electricity load prediction. To
evaluate our results, we benchmark our federated learning architecture against
central and local learning and compare the performance of our model to long
short-term memory models and convolutional neural networks. Our simulations are
based on a dataset from a German university campus and show that
transformer-based forecasting is a promising alternative to state-of-the-art
models within federated learning.
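The federated workflow described in the abstract (models trained locally on private data, with only the trained parameters merged on a global server) can be sketched as a minimal FedAvg-style loop. This is an illustrative sketch, not the paper's implementation: the linear model, synthetic client data, and data-size weighting are all assumptions made for the example.

```python
import numpy as np

def local_train(global_weights, X, y, lr=0.1, epochs=5):
    """Train a linear load-forecasting model locally via gradient descent.

    Only the updated weights leave the client; the raw load profiles
    (X, y) never do, which is the privacy benefit of federated learning.
    """
    w = global_weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of the MSE loss
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    """One communication round: collect local updates, average by data size."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_train(global_weights, X, y))
        sizes.append(len(y))
    # FedAvg: weighted average of the client parameter vectors
    return np.average(np.stack(updates), axis=0, weights=np.array(sizes, float))

# Three hypothetical buildings, each holding private (features, load) data
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.2, 0.1])
clients = []
for _ in range(3):
    X = rng.normal(size=(200, 3))
    clients.append((X, X @ true_w + rng.normal(scale=0.01, size=200)))

w = np.zeros(3)
for _ in range(20):  # communication rounds between server and clients
    w = federated_round(w, clients)
# w now approximates true_w without any client sharing raw data
```

The same loop structure carries over when the linear model is replaced by a transformer, LSTM, or CNN as benchmarked in the paper; only `local_train` and the parameter containers change.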
Related papers
- Addressing Heterogeneity in Federated Load Forecasting with Personalization Layers [3.933147844455233]
We propose the use of personalization layers for load forecasting in a general framework called PL-FL.
We show that PL-FL outperforms FL and purely local training, while requiring lower communication bandwidth than FL.
arXiv Detail & Related papers (2024-04-01T22:53:09Z)
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is handling non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) to tackle this data issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- Grid Frequency Forecasting in University Campuses using Convolutional LSTM [0.0]
This paper harnesses Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks to establish robust time-series forecasting models for grid frequency.
Individual ConvLSTM models are trained on power consumption data for each campus building and forecast the grid frequency based on historical trends.
An ensemble model is formulated to aggregate insights from the building-specific models, delivering comprehensive forecasts for the entire campus.
arXiv Detail & Related papers (2023-10-24T13:53:51Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- Transfer Learning in Deep Learning Models for Building Load Forecasting: Case of Limited Data [0.0]
This paper proposes a Building-to-Building Transfer Learning framework to overcome the limited-data problem and enhance the performance of deep learning models.
The proposed approach improved forecasting accuracy by 56.8% compared to conventional deep learning trained from scratch.
arXiv Detail & Related papers (2023-01-25T16:05:47Z)
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
- Augmented Bilinear Network for Incremental Multi-Stock Time-Series Classification [83.23129279407271]
We propose a method to efficiently retain the knowledge available in a neural network pre-trained on a set of securities.
In our method, the prior knowledge encoded in a pre-trained neural network is maintained by keeping existing connections fixed.
This knowledge is adjusted for the new securities by a set of augmented connections, which are optimized using the new data.
arXiv Detail & Related papers (2022-07-23T18:54:10Z)
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs on prediction of output node voltages encourages representations that can be adapted to new unseen topologies or to prediction of new circuit-level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm that accelerates FL by alleviating knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- Federated Learning with Hyperparameter-based Clustering for Electrical Load Forecasting [0.0]
This paper evaluates the performance of federated learning for short-term forecasting of individual house loads as well as the aggregate load.
It discusses the advantages and disadvantages of this method by comparing it to centralized and local learning schemes.
The results show that federated learning performs well, with a minimum root mean squared error (RMSE) of 0.117 kWh for individual load forecasting.
arXiv Detail & Related papers (2021-11-14T22:29:01Z)
- Random vector functional link neural network based ensemble deep learning for short-term load forecasting [14.184042046855884]
This paper proposes a novel ensemble deep Random Vector Functional Link (edRVFL) network for electricity load forecasting.
The hidden layers are stacked to enforce deep representation learning.
The model generates forecasts by ensembling the outputs of each layer.
arXiv Detail & Related papers (2021-07-30T01:20:48Z)
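The edRVFL idea in the last entry (stacked hidden layers with random, untrained weights, a closed-form readout per layer, and a forecast formed by ensembling every layer's output) can be sketched as follows. The layer sizes, ridge penalty, and toy load series are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_edrvfl(X, y, n_layers=3, hidden=32, ridge=1e-2):
    """Fit an ensemble deep RVFL: per layer, random fixed hidden weights,
    then a closed-form ridge-regression readout over [features, inputs]."""
    layers, H = [], X
    for _ in range(n_layers):
        W = rng.normal(scale=0.5, size=(H.shape[1], hidden))  # random, never trained
        H = np.tanh(H @ W)                 # stacked: next layer feeds on this H
        D = np.hstack([H, X])              # direct link: raw inputs concatenated
        beta = np.linalg.solve(D.T @ D + ridge * np.eye(D.shape[1]), D.T @ y)
        layers.append((W, beta))
    return layers

def predict_edrvfl(layers, X):
    preds, H = [], X
    for W, beta in layers:
        H = np.tanh(H @ W)
        preds.append(np.hstack([H, X]) @ beta)
    return np.mean(preds, axis=0)          # ensemble: average per-layer forecasts

# Toy hourly load series: predict the next value from the previous 24 hours
t = np.arange(1000)
series = np.sin(2 * np.pi * t / 24) + rng.normal(scale=0.05, size=t.size)
X = np.stack([series[i : i + 24] for i in range(len(series) - 24)])
y = series[24:]
model = fit_edrvfl(X[:800], y[:800])
rmse = np.sqrt(np.mean((predict_edrvfl(model, X[800:]) - y[800:]) ** 2))
```

Because the hidden weights stay random, training reduces to one linear solve per layer, which is what makes the ensemble cheap compared with backpropagated deep models.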
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.