FedTrees: A Novel Computation-Communication Efficient Federated Learning
Framework Investigated in Smart Grids
- URL: http://arxiv.org/abs/2210.00060v1
- Date: Fri, 30 Sep 2022 19:47:46 GMT
- Title: FedTrees: A Novel Computation-Communication Efficient Federated Learning
Framework Investigated in Smart Grids
- Authors: Mohammad Al-Quraan, Ahsan Khan, Anthony Centeno, Ahmed Zoha, Muhammad
Ali Imran, Lina Mohjazi
- Abstract summary: Next-generation smart meters can be used to measure, record, and report energy consumption data.
FedTrees is a new, lightweight FL framework that benefits from the outstanding features of ensemble learning.
- Score: 8.437758224218648
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Smart energy performance monitoring and optimisation at the supplier and
consumer levels is essential to realising smart cities. In order to implement a
more sustainable energy management plan, it is crucial to conduct a better
energy forecast. The next-generation smart meters can also be used to measure,
record, and report energy consumption data, which can be used to train machine
learning (ML) models for predicting energy needs. However, sharing fine-grained
energy data and performing centralised learning may compromise users' privacy
and leave them vulnerable to several attacks. This study addresses this issue
by utilising federated learning (FL), an emerging technique that performs ML
model training at the user level, where data resides. We introduce FedTrees, a
new, lightweight FL framework that benefits from the outstanding features of
ensemble learning. Furthermore, we developed a delta-based early stopping
algorithm that monitors FL training and halts it once further rounds yield no meaningful improvement.
The simulation results demonstrate that FedTrees outperforms the most popular
federated averaging (FedAvg) framework and the baseline Persistence model in
providing accurate energy forecasting patterns, while taking only 2% of the
computation time and 13% of the communication rounds compared to FedAvg, saving
considerable amounts of computation and communication resources.
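The abstract does not spell out the delta-based early-stopping rule, so the following is a minimal Python sketch under the assumption that "delta" means the round-over-round improvement of a monitored validation metric (e.g. forecasting MAE). All function names, thresholds, and metric values are illustrative, not FedTrees' actual API.

```python
def should_stop(metric_history, delta_threshold=1e-3, patience=3):
    """Stop when the monitored metric improved by less than
    `delta_threshold` for `patience` consecutive global rounds.
    (Assumed interpretation of "delta-based" early stopping.)"""
    if len(metric_history) < patience + 1:
        return False
    window = metric_history[-(patience + 1):]
    improvements = [window[i] - window[i + 1] for i in range(patience)]
    return all(d < delta_threshold for d in improvements)

# Simulated validation MAE per global round (made-up values for illustration):
mae_per_round = [0.42, 0.31, 0.25, 0.221, 0.2205, 0.2204, 0.2204]
history = []
for round_id, mae in enumerate(mae_per_round):
    history.append(mae)
    if should_stop(history):
        print(f"early stop after round {round_id}")
        break
```

In a federated setting, such a check would run on the server after each aggregation round, so training stops as soon as additional communication rounds stop paying off.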
Related papers
- Exploring Lightweight Federated Learning for Distributed Load Forecasting [0.864902991835914]
Federated Learning (FL) is a distributed learning scheme that enables deep learning to be applied to sensitive data streams and applications in a privacy-preserving manner.
We show that with a lightweight fully connected deep neural network, we are able to achieve forecasting accuracy comparable to existing schemes.
arXiv Detail & Related papers (2024-04-04T09:35:48Z) - FLrce: Resource-Efficient Federated Learning with Early-Stopping Strategy [7.963276533979389]
Federated Learning (FL) has achieved great popularity in the Internet of Things (IoT).
We present FLrce, an efficient FL framework with a relationship-based client selection and early-stopping strategy.
Experiment results show that, compared with existing efficient FL frameworks, FLrce improves the computation and communication efficiency by at least 30% and 43% respectively.
arXiv Detail & Related papers (2023-10-15T10:13:44Z) - Federated Fine-Tuning of LLMs on the Very Edge: The Good, the Bad, the Ugly [62.473245910234304]
This paper takes a hardware-centric approach to explore how Large Language Models can be brought to modern edge computing systems.
We provide a micro-level hardware benchmark, compare the model FLOP utilization to a state-of-the-art data center GPU, and study the network utilization in realistic conditions.
arXiv Detail & Related papers (2023-10-04T20:27:20Z) - FedWOA: A Federated Learning Model that uses the Whale Optimization
Algorithm for Renewable Energy Prediction [0.0]
This paper introduces FedWOA, a novel federated learning model that aggregates global prediction models from the weights of local neural network models trained on prosumer energy data.
Evaluation results on prosumer energy data show that FedWOA improves the accuracy of energy prediction models by 25% in MSE and 16% in MAE compared to FedAVG.
arXiv Detail & Related papers (2023-09-19T05:44:18Z) - Exploring Deep Reinforcement Learning-Assisted Federated Learning for
Online Resource Allocation in EdgeIoT [53.68792408315411]
Federated learning (FL) has been increasingly considered to preserve data training privacy from eavesdropping attacks in mobile edge computing-based Internet of Things (EdgeIoT).
We propose a new federated learning-enabled twin-delayed deep deterministic policy gradient (FL-DLT3) framework to achieve the optimal accuracy and energy balance in a continuous domain.
Numerical results demonstrate that the proposed FL-DLT3 achieves fast convergence (fewer than 100 iterations) while improving the FL accuracy-to-energy consumption ratio by 51.8% compared to existing state-of-the-art benchmarks.
arXiv Detail & Related papers (2022-02-15T13:36:15Z) - Federated Learning for Short-term Residential Energy Demand Forecasting [4.769747792846004]
Energy demand forecasting is an essential task performed within the energy industry to help balance supply with demand and maintain a stable load on the electricity grid.
As supply transitions towards less reliable renewable energy generation, smart meters will prove a vital component to aid these forecasting tasks.
However, smart meter take-up is low among privacy-conscious consumers that fear intrusion upon their fine-grained consumption data.
arXiv Detail & Related papers (2021-05-27T17:33:09Z) - A Framework for Energy and Carbon Footprint Analysis of Distributed and
Federated Edge Learning [48.63610479916003]
This article breaks down and analyzes the main factors that influence the environmental footprint of distributed learning policies.
It models both vanilla and decentralized FL policies driven by consensus.
Results show that FL allows remarkable end-to-end energy savings (30%-40%) for wireless systems characterized by low bit/Joule efficiency.
arXiv Detail & Related papers (2021-03-18T16:04:42Z) - To Talk or to Work: Flexible Communication Compression for Energy
Efficient Federated Learning over Heterogeneous Mobile Edge Devices [78.38046945665538]
Federated learning (FL) over massive mobile edge devices opens new horizons for numerous intelligent mobile applications.
However, FL imposes huge communication and computation burdens on participating devices due to periodic global synchronization and continuous local training.
We develop a convergence-guaranteed FL algorithm enabling flexible communication compression.
arXiv Detail & Related papers (2020-12-22T02:54:18Z) - Resource Management for Blockchain-enabled Federated Learning: A Deep
Reinforcement Learning Approach [54.29213445674221]
Blockchain-enabled Federated Learning (BFL) enables mobile devices to collaboratively train neural network models required by a Machine Learning Model Owner (MLMO).
The issue of BFL is that the mobile devices have energy and CPU constraints that may reduce the system lifetime and training efficiency.
We propose to use Deep Reinforcement Learning (DRL) to derive the optimal decisions for the MLMO.
arXiv Detail & Related papers (2020-04-08T16:29:19Z) - Multi-Agent Meta-Reinforcement Learning for Self-Powered and Sustainable
Edge Computing Systems [87.4519172058185]
An effective energy dispatch mechanism for self-powered wireless networks with edge computing capabilities is studied.
A novel multi-agent meta-reinforcement learning (MAMRL) framework is proposed to solve the formulated problem.
Experimental results show that the proposed MAMRL model can reduce non-renewable energy usage by up to 11% and the energy cost by 22.4%.
arXiv Detail & Related papers (2020-02-20T04:58:07Z)