Federated Learning based Energy Demand Prediction with Clustered Aggregation
- URL: http://arxiv.org/abs/2210.15850v1
- Date: Fri, 28 Oct 2022 02:46:59 GMT
- Title: Federated Learning based Energy Demand Prediction with Clustered Aggregation
- Authors: Ye Lin Tun, Kyi Thar, Chu Myaet Thwal, Choong Seon Hong
- Abstract summary: Energy usage information collected by the clients' smart homes can be used to train a deep neural network to predict the future energy demand.
In this paper, we propose a recurrent neural network based energy demand predictor trained with federated learning on clustered clients to take advantage of distributed data.
- Score: 14.631304404306778
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To reduce negative environmental impacts, power stations and energy grids
need to optimize the resources required for power production. Thus, predicting
the energy consumption of clients is becoming an important part of every energy
management system. Energy usage information collected by the clients' smart
homes can be used to train a deep neural network to predict the future energy
demand. Collecting data from a large number of distributed clients for
centralized model training is expensive in terms of communication resources. To
take advantage of distributed data in edge systems, centralized training can be
replaced by federated learning where each client only needs to upload model
updates produced by training on its local data. These model updates are
aggregated into a single global model by the server. However, since different
clients can have different attributes, their model updates can have diverse
weights, and as a result the aggregated global model can take a long time to
converge. To speed up convergence, clustering can be applied to group clients
by their properties and aggregate model updates from the same cluster together,
producing a cluster-specific global model. In this paper, we
propose a recurrent neural network based energy demand predictor, trained with
federated learning on clustered clients to take advantage of distributed data
and speed up the convergence process.
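The clustered-aggregation scheme the abstract describes can be made concrete with a short sketch. The following is a minimal illustration, assuming PyTorch and scikit-learn; the LSTM architecture, the KMeans clustering of client features, and names such as EnergyLSTM, local_update, and clustered_round are hypothetical choices for illustration, not the authors' implementation.

```python
# Minimal sketch of federated learning with clustered aggregation.
# Assumes PyTorch and scikit-learn; all names are illustrative, not the paper's code.
import copy

import torch
import torch.nn as nn
from sklearn.cluster import KMeans


class EnergyLSTM(nn.Module):
    """Recurrent predictor: a window of past consumption -> next-step demand."""

    def __init__(self, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                  # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # predict the next consumption value


def local_update(global_model, loader, epochs=1, lr=1e-3):
    """One client's local training; only the resulting weights leave the device."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()


def fedavg(states):
    """Uniform FedAvg over a list of client state_dicts."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in states]).mean(dim=0)
    return avg


def clustered_round(cluster_models, client_loaders, client_features):
    """One communication round: group clients, then aggregate per cluster."""
    n_clusters = len(cluster_models)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(client_features)
    for c, model in enumerate(cluster_models):
        members = [i for i, lbl in enumerate(labels) if lbl == c]
        if not members:
            continue                       # no clients fell into this cluster
        states = [local_update(model, client_loaders[i]) for i in members]
        model.load_state_dict(fedavg(states))
    return cluster_models
```

In a deployment, client_features might be summary statistics of each home's consumption profile; updates are then averaged only with peers from the same cluster, so each cluster converges to its own global model.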
Related papers
- FedSPD: A Soft-clustering Approach for Personalized Decentralized Federated Learning [18.38030098837294]
Federated learning is a framework for distributed clients to collaboratively train a machine learning model using local data.
We propose FedSPD, an efficient personalized federated learning algorithm for the decentralized setting.
We show that FedSPD learns accurate models even in low-connectivity networks.
arXiv Detail & Related papers (2024-10-24T15:48:34Z)
- Towards Client Driven Federated Learning [7.528642177161784]
We introduce Client-Driven Federated Learning (CDFL), a novel FL framework that puts clients in the driving role.
In CDFL, each client independently and asynchronously updates its model by uploading the locally trained model to the server and receiving a customized model tailored to its local task.
arXiv Detail & Related papers (2024-05-24T10:17:49Z)
- Clustering-based Multitasking Deep Neural Network for Solar Photovoltaics Power Generation Prediction [16.263501526929975]
We propose a clustering-based multitasking deep neural network (CM-DNN) framework for PV power generation prediction.
Customers are first clustered into types; for each type, a deep neural network (DNN) is trained until its accuracy can no longer be improved.
For a specified customer type, inter-model knowledge transfer is conducted to enhance its training accuracy.
The proposed CM-DNN is tested on a real-world PV power generation dataset.
arXiv Detail & Related papers (2024-05-09T00:08:21Z)
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling this data heterogeneity issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- Heterogeneous Federated Learning via Personalized Generative Networks [7.629157720712401]
Federated Learning (FL) allows several clients to construct a common global machine-learning model without having to share their data.
We propose a method for knowledge transfer between clients where the server trains client-specific generators.
arXiv Detail & Related papers (2023-08-25T09:37:02Z)
- Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated Learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and backpropagation); a sketch of this exchange appears after this list.
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
arXiv Detail & Related papers (2022-11-20T10:49:22Z)
- DYNAFED: Tackling Client Data Heterogeneity with Global Dynamics [60.60173139258481]
Local training on non-IID distributed data results in a deflected local optimum.
A natural solution is to gather all client data onto the server, such that the server has a global view of the entire data distribution.
In this paper, we put forth an idea to collect and leverage global knowledge on the server without hindering data privacy.
arXiv Detail & Related papers (2022-11-20T06:13:06Z)
- Data Selection for Efficient Model Update in Federated Learning [0.07614628596146598]
We propose to reduce the amount of local data that is needed to train a global model.
We do this by splitting the model into a lower part for generic feature extraction and an upper part that is more sensitive to the characteristics of the local data.
Our experiments show that less than 1% of the local data can transfer the characteristics of the client data to the global model.
arXiv Detail & Related papers (2021-11-05T14:07:06Z)
- FedKD: Communication Efficient Federated Learning via Knowledge Distillation [56.886414139084216]
Federated learning is widely used to learn intelligent models from decentralized data.
In federated learning, clients need to communicate their local model updates in each iteration of model learning.
We propose a communication-efficient federated learning method based on knowledge distillation (a generic distillation sketch appears after this list).
arXiv Detail & Related papers (2021-08-30T15:39:54Z)
- Personalized Federated Learning with First Order Model Optimization [76.81546598985159]
We propose an alternative to federated learning, where each client federates with other relevant clients to obtain a stronger model per client-specific objectives.
We do not assume knowledge of underlying data distributions or client similarities, and allow each client to optimize for arbitrary target distributions of interest.
Our method outperforms existing alternatives, while also enabling new features for personalized FL such as transfer outside of local data distributions.
arXiv Detail & Related papers (2020-12-15T19:30:29Z)
- Information-Theoretic Bounds on the Generalization Error and Privacy Leakage in Federated Learning [96.38757904624208]
Machine learning algorithms deployed over mobile networks can be grouped into three different categories.
The main objective of this work is to provide an information-theoretic framework for all of the aforementioned learning paradigms.
arXiv Detail & Related papers (2020-05-05T21:23:45Z)
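For the Scalable Collaborative Learning entry above, the Split Learning exchange it describes, where clients send cut-layer activations (smashed data) and wait for the server's response, can be sketched as follows. This is a minimal PyTorch illustration under assumed toy models (client_net, server_net are hypothetical), not that paper's code.

```python
# Minimal sketch of one Split Learning training step (assumed PyTorch toy models).
import torch
import torch.nn as nn
import torch.nn.functional as F

client_net = nn.Sequential(nn.Linear(16, 32), nn.ReLU())   # layers before the cut
server_net = nn.Sequential(nn.Linear(32, 1))               # layers after the cut
c_opt = torch.optim.SGD(client_net.parameters(), lr=0.01)
s_opt = torch.optim.SGD(server_net.parameters(), lr=0.01)

x, y = torch.randn(8, 16), torch.randn(8, 1)               # one local batch

# Client forward pass: the cut-layer activations ("smashed data") are sent out.
smashed = client_net(x)
sent = smashed.detach().requires_grad_()                   # what crosses the wire

# Server forward/backward on the smashed data; it returns the cut-layer gradient.
loss = F.mse_loss(server_net(sent), y)
s_opt.zero_grad()
loss.backward()                                            # populates sent.grad
s_opt.step()

# Client receives the cut-layer gradient and finishes backpropagation.
c_opt.zero_grad()
smashed.backward(sent.grad)
c_opt.step()
```

The client never shares raw data, but it must stay in the loop for every batch, which is the communication pattern the entry contrasts with FL's one-upload-per-round model exchange.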
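For the FedKD entry, the summary only names knowledge distillation as the tool for communication efficiency. The following is a generic, assumption-heavy sketch of one way distillation can cut communication: keep a large teacher local and communicate only a small student. The names teacher, student, and distill_step are illustrative, and the actual FedKD algorithm differs in its details.

```python
# Generic sketch: keep a large teacher local and communicate only a small student,
# training the student with knowledge distillation (illustrative, not FedKD itself).
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(16, 128), nn.ReLU(), nn.Linear(128, 10))  # stays local
student = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 10))    # is uploaded
opt = torch.optim.SGD(student.parameters(), lr=0.01)


def distill_step(x, y, temperature=2.0, alpha=0.5):
    """Mix the task loss with a soft-label loss from the local teacher."""
    with torch.no_grad():
        soft_targets = F.softmax(teacher(x) / temperature, dim=-1)
    logits = student(x)
    task_loss = F.cross_entropy(logits, y)
    kd_loss = F.kl_div(F.log_softmax(logits / temperature, dim=-1),
                       soft_targets, reduction="batchmean") * temperature ** 2
    opt.zero_grad()
    (alpha * task_loss + (1 - alpha) * kd_loss).backward()
    opt.step()


distill_step(torch.randn(8, 16), torch.randint(0, 10, (8,)))
```

Since only the small student's weights cross the network each round, the upload is a fraction of the full model's size.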
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences arising from its use.