A Framework for Energy and Carbon Footprint Analysis of Distributed and
Federated Edge Learning
- URL: http://arxiv.org/abs/2103.10346v1
- Date: Thu, 18 Mar 2021 16:04:42 GMT
- Title: A Framework for Energy and Carbon Footprint Analysis of Distributed and
Federated Edge Learning
- Authors: Stefano Savazzi, Sanaz Kianoush, Vittorio Rampa, Mehdi Bennis
- Abstract summary: This article breaks down and analyzes the main factors that influence the environmental footprint of distributed learning policies.
It models both vanilla and decentralized FL policies driven by consensus.
Results show that FL allows remarkable end-to-end energy savings (30%-40%) for wireless systems characterized by low bit/Joule efficiency.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advances in distributed learning raise environmental concerns due to
the large energy needed to train and move data to/from data centers. Novel
paradigms, such as federated learning (FL), are suitable for decentralized
model training across devices or silos that simultaneously act as both data
producers and learners. Unlike centralized learning (CL) techniques, which rely
on big-data fusion and analytics hosted in energy-hungry data centers, FL lets
devices collaboratively train their models without sharing their
private data. This article breaks down and analyzes the main factors that
influence the environmental footprint of FL policies compared with classical
CL/Big-Data algorithms running in data centers. The proposed analytical
framework takes into account both learning and communication energy costs, as
well as the carbon equivalent emissions; in addition, it models both vanilla
and decentralized FL policies driven by consensus. The framework is evaluated
in an industrial setting assuming a real-world robotized workplace. Results
show that FL allows remarkable end-to-end energy savings (30%-40%) for wireless
systems characterized by low bit/Joule efficiency (50 kbit/Joule or lower).
Consensus-driven FL does not require the parameter server and further reduces
emissions in mesh networks (200 kbit/Joule). On the other hand, all FL policies
are slower to converge when local data are unevenly distributed (often 2x
slower than CL). Energy footprint and learning loss can be traded off to
optimize efficiency.
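As a rough illustration of the trade-off the framework quantifies, the sketch below compares the end-to-end energy of CL (ship raw data to a data center, train there) against vanilla FL (local training plus per-round model exchange). All function names and numeric parameters (data volume, round count, per-device training energy) are illustrative assumptions, not values from the paper; only the 50 kbit/Joule efficiency figure comes from the abstract.

```python
# Hedged sketch: end-to-end energy of centralized learning (CL) vs. vanilla
# federated learning (FL). Parameters are made-up assumptions for illustration.

def comm_energy_joules(bits: float, efficiency_bit_per_joule: float) -> float:
    """Energy to move `bits` over a link with the given bit/Joule efficiency."""
    return bits / efficiency_bit_per_joule

def cl_energy(raw_data_bits: float, efficiency: float,
              datacenter_train_j: float) -> float:
    # CL: ship all raw data to the data center, then train centrally.
    return comm_energy_joules(raw_data_bits, efficiency) + datacenter_train_j

def fl_energy(model_bits: float, rounds: int, devices: int,
              efficiency: float, local_train_j_per_round: float) -> float:
    # Vanilla FL: each round, every device uploads and downloads the model
    # parameters and trains locally; raw data never leaves the device.
    comm = rounds * devices * comm_energy_joules(2 * model_bits, efficiency)
    compute = rounds * devices * local_train_j_per_round
    return comm + compute

if __name__ == "__main__":
    eff = 50e3  # 50 kbit/Joule: the low-efficiency regime cited in the abstract
    cl = cl_energy(raw_data_bits=10e9, efficiency=eff, datacenter_train_j=50e3)
    fl = fl_energy(model_bits=1e6, rounds=100, devices=10,
                   efficiency=eff, local_train_j_per_round=120.0)
    print(f"CL: {cl:.0f} J, FL: {fl:.0f} J, saving: {1 - fl / cl:.0%}")
```

With these assumed parameters the sketch lands in the 30%-40% savings range the abstract reports for low-efficiency wireless links; higher bit/Joule efficiency shrinks the communication term and erodes FL's advantage.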
Related papers
- A Framework for testing Federated Learning algorithms using an edge-like environment [0.0]
Federated Learning (FL) is a machine learning paradigm in which many clients cooperatively train a single centralized model while keeping their data private and decentralized.
It is non-trivial to accurately evaluate the contributions of local models in global centralized model aggregation.
This is an example of a major challenge in FL, commonly known as data imbalance or class imbalance.
In this work, a framework is proposed and implemented to assess FL algorithms in an easier and more scalable way.
arXiv Detail & Related papers (2024-07-17T19:52:53Z) - FLrce: Resource-Efficient Federated Learning with Early-Stopping Strategy [7.963276533979389]
Federated Learning (FL) has gained great popularity in the Internet of Things (IoT).
We present FLrce, an efficient FL framework with a relationship-based client selection and early-stopping strategy.
Experiment results show that, compared with existing efficient FL frameworks, FLrce improves the computation and communication efficiency by at least 30% and 43% respectively.
arXiv Detail & Related papers (2023-10-15T10:13:44Z) - A Carbon Tracking Model for Federated Learning: Impact of Quantization and Sparsification [5.341266334051207]
Federated Learning (FL) methods adopt efficient communication technologies to distribute machine learning tasks across edge devices.
This paper proposes a framework for real-time monitoring of the energy and carbon footprint impacts of FL systems.
arXiv Detail & Related papers (2023-10-12T07:20:03Z) - Semi-Federated Learning: Convergence Analysis and Optimization of A
Hybrid Learning Framework [70.83511997272457]
We propose a semi-federated learning (SemiFL) paradigm to leverage both the base station (BS) and devices for a hybrid implementation of centralized learning (CL) and FL.
We propose a two-stage algorithm to solve this intractable problem, in which we provide the closed-form solutions to the beamformers.
arXiv Detail & Related papers (2023-10-04T03:32:39Z) - Green Federated Learning [7.003870178055125]
Federated Learning (FL) is a machine learning technique for training a centralized model using data from decentralized entities.
FL may leverage as many as hundreds of millions of globally distributed end-user devices with diverse energy sources.
We propose the concept of Green FL, which involves optimizing FL parameters and making design choices to minimize carbon emissions.
arXiv Detail & Related papers (2023-03-26T02:23:38Z) - Online Data Selection for Federated Learning with Limited Storage [53.46789303416799]
Federated Learning (FL) has been proposed to achieve distributed machine learning among networked devices.
The impact of on-device storage on the performance of FL remains unexplored.
In this work, we take the first step to consider the online data selection for FL with limited on-device storage.
arXiv Detail & Related papers (2022-09-01T03:27:33Z) - An Energy and Carbon Footprint Analysis of Distributed and Federated
Learning [42.37180749113699]
Classical and centralized Artificial Intelligence (AI) methods require moving data from producers (sensors, machines) to energy-hungry data centers.
Emerging alternatives to mitigate such high energy costs propose to efficiently distribute, or federate, the learning tasks across devices.
This paper proposes a novel framework for the analysis of energy and carbon footprints in distributed and federated learning.
arXiv Detail & Related papers (2022-06-21T13:28:49Z) - Federated Learning over Wireless IoT Networks with Optimized
Communication and Resources [98.18365881575805]
Federated learning (FL), as a paradigm of collaborative learning techniques, has attracted increasing research attention.
It is of interest to investigate fast responding and accurate FL schemes over wireless systems.
We show that the proposed communication-efficient federated learning framework converges at a strong linear rate.
arXiv Detail & Related papers (2021-10-22T13:25:57Z) - Blockchain Assisted Decentralized Federated Learning (BLADE-FL):
Performance Analysis and Resource Allocation [119.19061102064497]
We propose a decentralized FL framework by integrating blockchain into FL, namely, blockchain assisted decentralized federated learning (BLADE-FL).
In a round of the proposed BLADE-FL, each client broadcasts its trained model to other clients, competes to generate a block based on the received models, and then aggregates the models from the generated block before its local training of the next round.
We explore the impact of lazy clients on the learning performance of BLADE-FL, and characterize the relationship among the optimal K, the learning parameters, and the proportion of lazy clients.
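The BLADE-FL round described above can be sketched as follows. This is a simplified simulation under stated assumptions: the block-generation competition is modeled as a random draw, and aggregation as an equal-weight coordinate-wise average (FedAvg-style); neither detail is taken from the paper.

```python
# Hedged sketch of one BLADE-FL round: every client broadcasts its locally
# trained model, one client wins the competition to generate a block holding
# the received models, and each client aggregates the block's models before
# its next round of local training. The random "winner" and the simple
# averaging rule are illustrative assumptions, not details from the paper.
import random

def blade_fl_round(local_models: list[list[float]], seed: int = 0) -> list[float]:
    rng = random.Random(seed)
    # 1. Broadcast: every client receives every other client's model.
    received = list(local_models)
    # 2. Block generation: one client wins (modeled as a random draw)
    #    and packages the received models into a block.
    winner = rng.randrange(len(received))
    block = {"generator": winner, "models": received}
    # 3. Aggregation: each client averages the models in the block
    #    (coordinate-wise mean with equal weights).
    n = len(block["models"])
    return [sum(model[i] for model in block["models"]) / n
            for i in range(len(block["models"][0]))]
```

For example, `blade_fl_round([[1.0, 2.0], [3.0, 4.0]])` returns `[2.0, 3.0]`; note that which client generates the block does not change the aggregate under equal weighting, which is what makes lazy clients (who skip local training but still broadcast) hard to detect.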
arXiv Detail & Related papers (2021-01-18T07:19:08Z) - Wireless Communications for Collaborative Federated Learning [160.82696473996566]
Internet of Things (IoT) devices may not be able to transmit their collected data to a central controller for training machine learning models.
Google's seminal FL algorithm requires all devices to be directly connected with a central controller.
This paper introduces a novel FL framework, called collaborative FL (CFL), which enables edge devices to implement FL with less reliance on a central controller.
arXiv Detail & Related papers (2020-06-03T20:00:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.