A first look into the carbon footprint of federated learning
- URL: http://arxiv.org/abs/2102.07627v6
- Date: Mon, 22 May 2023 15:07:08 GMT
- Title: A first look into the carbon footprint of federated learning
- Authors: Xinchi Qiu, Titouan Parcollet, Javier Fernandez-Marques, Pedro Porto
Buarque de Gusmao, Yan Gao, Daniel J. Beutel, Taner Topal, Akhil Mathur,
Nicholas D. Lane
- Abstract summary: This paper offers the first-ever systematic study of the carbon footprint of Federated Learning.
Depending on the configuration, FL can emit up to two orders of magnitude more carbon than centralized machine learning.
- Score: 19.733846321425975
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Despite impressive results, deep learning-based technologies also raise
severe privacy and environmental concerns induced by the training procedure
often conducted in data centers. In response, alternatives to centralized
training such as Federated Learning (FL) have emerged. Perhaps unexpectedly, FL
is starting to be deployed at a global scale by companies that must adhere to
new legal demands and policies originating from governments and social groups
advocating for privacy protection. However, the potential environmental
impact related to FL remains unclear and unexplored. This paper offers the
first-ever systematic study of the carbon footprint of FL. First, we propose a
rigorous model to quantify the carbon footprint, hence facilitating the
investigation of the relationship between FL design and carbon emissions. Then,
we compare the carbon footprint of FL to traditional centralized learning. Our
findings show that, depending on the configuration, FL can emit up to two orders
of magnitude more carbon than centralized machine learning. However, in certain
settings, it can be comparable to centralized learning due to the reduced
energy consumption of embedded devices. We performed extensive experiments
across different datasets, settings, and deep learning models
with FL. Finally, we highlight and connect the reported results to the future
challenges and trends in FL to reduce its environmental impact, including
algorithm efficiency, hardware capabilities, and stronger industry
transparency.
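The paper's quantification model is not reproduced here. As a rough, hedged illustration of the kind of accounting such a model performs, the sketch below estimates FL emissions as on-device training energy plus communication energy multiplied by grid carbon intensity, and compares that to a data-center estimate scaled by PUE. All function names, constants, and the energy breakdown itself are assumptions of this note, not the paper's formulation.

```python
# Illustrative sketch only: a back-of-the-envelope FL vs. centralized carbon
# estimate. The energy breakdown and every constant below are assumptions for
# demonstration purposes; this is NOT the quantification model from the paper.

def fl_carbon_kg(rounds, clients_per_round,
                 train_kwh_per_client,        # assumed on-device training energy per round
                 upload_gb_per_client,        # assumed model-update size per round (GB)
                 network_kwh_per_gb=0.1,      # assumed network energy intensity
                 grid_kg_co2_per_kwh=0.475):  # assumed average grid carbon intensity
    """Estimate FL emissions as (compute + communication) energy x carbon intensity."""
    compute_kwh = rounds * clients_per_round * train_kwh_per_client
    comm_kwh = rounds * clients_per_round * upload_gb_per_client * network_kwh_per_gb
    return (compute_kwh + comm_kwh) * grid_kg_co2_per_kwh


def centralized_carbon_kg(gpu_hours, gpu_kw=0.3, pue=1.5,
                          grid_kg_co2_per_kwh=0.475):
    """Estimate data-center emissions: GPU energy scaled by PUE x carbon intensity."""
    return gpu_hours * gpu_kw * pue * grid_kg_co2_per_kwh


if __name__ == "__main__":
    fl = fl_carbon_kg(rounds=500, clients_per_round=10,
                      train_kwh_per_client=0.002, upload_gb_per_client=0.05)
    dc = centralized_carbon_kg(gpu_hours=24)
    print(f"FL: {fl:.2f} kg CO2e  vs  centralized: {dc:.2f} kg CO2e")
```

Under these made-up constants, communication dominates the FL estimate, which is one way a configuration can push FL past centralized training; the paper's own model and measurements should be consulted for real numbers.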
Related papers
- TinyML NLP Approach for Semantic Wireless Sentiment Classification [49.801175302937246]
We introduce split learning (SL) as an energy-efficient, privacy-preserving alternative scheme for tiny machine learning (TinyML).
Our results show that SL reduces processing power and CO2 emissions while maintaining high accuracy, whereas FL offers a balanced compromise between efficiency and privacy.
arXiv Detail & Related papers (2024-11-09T21:26:59Z) - A Carbon Tracking Model for Federated Learning: Impact of Quantization and Sparsification [5.341266334051207]
Federated Learning (FL) methods adopt efficient communication technologies to distribute machine learning tasks across edge devices.
This paper proposes a framework for real-time monitoring of the energy and carbon footprint impacts of FL systems (see the compression-accounting sketch after this list).
arXiv Detail & Related papers (2023-10-12T07:20:03Z) - A Safe Genetic Algorithm Approach for Energy Efficient Federated
Learning in Wireless Communication Networks [53.561797148529664]
Federated Learning (FL) has emerged as a decentralized technique where, contrary to traditional centralized approaches, devices perform model training in a collaborative manner.
Despite the existing efforts made in FL, its environmental impact is still under investigation, since several critical challenges regarding its applicability to wireless networks have been identified.
The current work proposes a Genetic Algorithm (GA) approach, targeting the minimization of both the overall energy consumption of an FL process and any unnecessary resource utilization.
arXiv Detail & Related papers (2023-06-25T13:10:38Z) - Green Federated Learning [7.003870178055125]
Federated Learning (FL) is a machine learning technique for training a centralized model using data from decentralized entities.
FL may leverage as many as hundreds of millions of globally distributed end-user devices with diverse energy sources.
We propose the concept of Green FL, which involves optimizing FL parameters and making design choices to minimize carbon emissions.
arXiv Detail & Related papers (2023-03-26T02:23:38Z) - Plankton-FL: Exploration of Federated Learning for Privacy-Preserving
Training of Deep Neural Networks for Phytoplankton Classification [81.04987357598802]
In this study, we explore the feasibility of leveraging federated learning for privacy-preserving training of deep neural networks for phytoplankton classification.
We simulate two different federated learning frameworks: federated learning (FL) and mutually exclusive FL (ME-FL).
Experimental results from this study demonstrate the feasibility and potential of federated learning for phytoplankton monitoring.
arXiv Detail & Related papers (2022-12-18T02:11:03Z) - Federated Learning with Privacy-Preserving Ensemble Attention
Distillation [63.39442596910485]
Federated Learning (FL) is a machine learning paradigm where many local nodes collaboratively train a central model while keeping the training data decentralized.
We propose a privacy-preserving FL framework leveraging unlabeled public data for one-way offline knowledge distillation.
Our technique uses decentralized and heterogeneous local data like existing FL approaches, but more importantly, it significantly reduces the risk of privacy leakage.
arXiv Detail & Related papers (2022-10-16T06:44:46Z) - Introducing Federated Learning into Internet of Things ecosystems --
preliminary considerations [0.31402652384742363]
Federated learning (FL) was proposed to facilitate the training of models in a distributed environment.
It supports the protection of (local) data privacy and uses local resources for model training.
arXiv Detail & Related papers (2022-07-15T18:48:57Z) - FedComm: Federated Learning as a Medium for Covert Communication [56.376997104843355]
Federated Learning (FL) is a solution to mitigate the privacy implications related to the adoption of deep learning.
This paper thoroughly investigates the communication capabilities of an FL scheme.
We introduce FedComm, a novel multi-system covert-communication technique.
arXiv Detail & Related papers (2022-01-21T17:05:56Z) - A Framework for Energy and Carbon Footprint Analysis of Distributed and
Federated Edge Learning [48.63610479916003]
This article breaks down and analyzes the main factors that influence the environmental footprint of distributed learning policies.
It models both vanilla and decentralized FL policies driven by consensus.
Results show that FL allows remarkable end-to-end energy savings (30%-40%) for wireless systems characterized by low bit/Joule efficiency.
arXiv Detail & Related papers (2021-03-18T16:04:42Z) - Can Federated Learning Save The Planet? [20.755849563134174]
This paper offers the first-ever systematic study of the carbon footprint of Federated Learning.
We propose a rigorous model to quantify the carbon footprint, hence facilitating the investigation of the relationship between FL design and carbon emissions.
Our findings show that FL, despite being slower to converge, can be a greener technology than training on data center GPUs.
arXiv Detail & Related papers (2020-10-13T16:45:01Z)
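Several entries above attribute a large share of FL's footprint to communication (e.g., the quantization/sparsification tracker and the edge-learning energy framework). As a toy illustration only, and not any of these papers' actual models, the hedged sketch below shows how quantizing and sparsifying model updates shrinks the per-round payload and therefore the communication share of a carbon estimate; every constant is an assumption.

```python
# Toy illustration only: how quantization and sparsification shrink the
# per-round update payload that a carbon tracker would attribute to
# communication. The constants and the energy model are assumptions,
# not taken from any of the papers listed above.

def update_payload_mb(n_params, bits_per_param=32, sparsity=0.0):
    """Payload of one model update after quantization and sparsification (MB)."""
    kept_params = n_params * (1.0 - sparsity)
    return kept_params * bits_per_param / 8 / 1e6


def comm_carbon_g(payload_mb, clients, rounds,
                  network_kwh_per_gb=0.1, grid_g_co2_per_kwh=475):
    """Communication-only carbon: traffic x network energy intensity x grid intensity."""
    traffic_gb = payload_mb * clients * rounds / 1e3
    return traffic_gb * network_kwh_per_gb * grid_g_co2_per_kwh


baseline = comm_carbon_g(update_payload_mb(1_000_000), clients=100, rounds=200)
compressed = comm_carbon_g(update_payload_mb(1_000_000, bits_per_param=8, sparsity=0.9),
                           clients=100, rounds=200)
print(f"baseline: {baseline:.1f} g CO2e, 8-bit + 90% sparse: {compressed:.1f} g CO2e")
```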
This list is automatically generated from the titles and abstracts of the papers on this site.