Green Federated Learning via Carbon-Aware Client and Time Slot Scheduling
- URL: http://arxiv.org/abs/2509.08980v1
- Date: Wed, 10 Sep 2025 20:24:26 GMT
- Title: Green Federated Learning via Carbon-Aware Client and Time Slot Scheduling
- Authors: Daniel Richards Arputharaj, Charlotte Rodriguez, Angelo Rodio, Giovanni Neglia
- Abstract summary: Training large-scale machine learning models incurs substantial carbon emissions. This paper investigates how to reduce emissions in Federated Learning through carbon-aware client selection and training scheduling.
- Score: 8.491852197957849
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Training large-scale machine learning models incurs substantial carbon emissions. Federated Learning (FL), by distributing computation across geographically dispersed clients, offers a natural framework to leverage regional and temporal variations in Carbon Intensity (CI). This paper investigates how to reduce emissions in FL through carbon-aware client selection and training scheduling. We first quantify the emission savings of a carbon-aware scheduling policy that leverages slack time -- permitting a modest extension of the training duration so that clients can defer local training rounds to lower-carbon periods. We then examine the performance trade-offs of such scheduling which stem from statistical heterogeneity among clients, selection bias in participation, and temporal correlation in model updates. To leverage these trade-offs, we construct a carbon-aware scheduler that integrates slack time, $\alpha$-fair carbon allocation, and a global fine-tuning phase. Experiments on real-world CI data show that our scheduler outperforms slack-agnostic baselines, achieving higher model accuracy across a wide range of carbon budgets, with especially strong gains under tight carbon constraints.
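Below is a minimal sketch of the two ingredients the abstract names, slack-time slot deferral and $\alpha$-fair participation weighting. It is not the authors' scheduler: the CI-forecast format, the function names (pick_low_carbon_slot, alpha_fair_weights), and the choice to apply $\alpha$-fairness to cumulative participation counts are illustrative assumptions.
```python
# Sketch only (not the paper's implementation): defer each selected client's local
# round to the lowest-carbon slot inside its slack window, and bias client selection
# with alpha-fair weights so under-served clients are picked more often.
import numpy as np

def pick_low_carbon_slot(ci_forecast, earliest, deadline):
    """Return the slot with the lowest carbon intensity in [earliest, deadline]."""
    window = ci_forecast[earliest:deadline + 1]
    return earliest + int(np.argmin(window))

def alpha_fair_weights(cumulative_participation, alpha=2.0, eps=1e-8):
    """Alpha-fair selection weights: clients that participated less get larger weight."""
    u = 1.0 / (cumulative_participation + eps) ** alpha
    return u / u.sum()

# Toy usage: 3 clients, 24 hourly CI values each (gCO2eq/kWh), a 4-slot slack window.
rng = np.random.default_rng(0)
ci = rng.uniform(50, 400, size=(3, 24))        # per-client CI forecasts (assumed given)
participation = np.array([5.0, 2.0, 9.0])      # local rounds completed so far
weights = alpha_fair_weights(participation)    # favor clients with fewer past rounds
selected = rng.choice(3, size=2, replace=False, p=weights)
for c in selected:
    slot = pick_low_carbon_slot(ci[c], earliest=10, deadline=13)
    print(f"client {c}: train in hour {slot} (CI={ci[c, slot]:.0f} gCO2eq/kWh)")
```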
Related papers
- Noise-aware Client Selection for carbon-efficient Federated Learning via Gradient Norm Thresholding [1.3585661787562995]
We introduce a modular approach on top of state-of-the-art client selection strategies for carbon-efficient Federated Learning. Our method enhances robustness by incorporating a filtering step for clients with noisy data, improving both model performance and sustainability.
arXiv Detail & Related papers (2026-03-04T15:43:48Z) - CO-PFL: Contribution-Oriented Personalized Federated Learning for Heterogeneous Networks [51.43780477302533]
Contribution-Oriented PFL (CO-PFL) is a novel algorithm that dynamically estimates each client's contribution for global aggregation. CO-PFL consistently surpasses state-of-the-art methods in personalization accuracy, robustness, scalability, and convergence stability.
arXiv Detail & Related papers (2025-10-23T05:10:06Z) - Decentralized Dynamic Cooperation of Personalized Models for Federated Continual Learning [50.56947843548702]
We propose a decentralized dynamic cooperation framework for federated continual learning. Clients establish dynamic cooperative learning coalitions to balance the acquisition of new knowledge and the retention of prior learning. We also propose a merge-blocking algorithm and a dynamic cooperative evolution algorithm to achieve cooperative and dynamic equilibrium.
arXiv Detail & Related papers (2025-09-28T06:53:23Z) - Diffusion-Modeled Reinforcement Learning for Carbon and Risk-Aware Microgrid Optimization [48.70916202664808]
DiffCarl is a diffusion-modeled, carbon- and risk-aware reinforcement learning algorithm for intelligent operation of multi-microgrid systems. It outperforms classic algorithms and state-of-the-art DRL solutions, with 2.3-30.1% lower operational cost. It also achieves 28.7% lower carbon emissions than its carbon-unaware variant and reduces performance variability.
arXiv Detail & Related papers (2025-07-22T03:27:07Z) - Carbon- and Precedence-Aware Scheduling for Data Processing Clusters [10.676357280358886]
We show that carbon-aware scheduling for data processing benefits from knowledge of both time-varying carbon intensity and precedence constraints. Our schedulers enable a tunable priority between carbon reduction and job completion time, and we give analytical results characterizing the trade-off between the two. Our Spark prototype on a 100-node cluster shows that a moderate configuration of PCAPS reduces carbon footprint by up to 32.9% without significantly impacting the cluster's total efficiency.
arXiv Detail & Related papers (2025-02-13T19:06:10Z) - Carbon Market Simulation with Adaptive Mechanism Design [55.25103894620696]
A carbon market is a market-based tool that incentivizes economic agents to align individual profits with the global utility.
We propose an adaptive mechanism design framework, simulating the market using hierarchical, model-free multi-agent reinforcement learning (MARL).
Numerical results show MARL enables government agents to balance productivity, equality, and carbon emissions.
arXiv Detail & Related papers (2024-06-12T05:08:51Z) - FedGreen: Carbon-aware Federated Learning with Model Size Adaptation [36.283273000969636]
Federated learning (FL) provides a promising collaborative framework to build a model from distributed clients.
Cloud and edge servers hosting FL clients may exhibit diverse carbon footprints influenced by their geographical locations with varying power sources.
We propose FedGreen, a carbon-aware FL approach to efficiently train models by adopting adaptive model sizes shared with clients.
arXiv Detail & Related papers (2024-04-23T20:37:26Z) - CAFE: Carbon-Aware Federated Learning in Geographically Distributed Data Centers [18.54380015603228]
Training large-scale artificial intelligence (AI) models demands significant computational power and energy, leading to increased carbon footprint with potential environmental repercussions.
This paper delves into the challenges of training AI models across geographically distributed (geo-distributed) data centers, emphasizing the balance between learning performance and carbon footprint.
We propose a new framework called CAFE (short for Carbon-Aware Federated Learning) to optimize training within a fixed carbon footprint budget.
arXiv Detail & Related papers (2023-11-06T23:59:22Z) - EcoLearn: Optimizing the Carbon Footprint of Federated Learning [1.4257277178729617]
Federated Learning (FL) distributes machine learning (ML) training across edge devices to reduce data transfer overhead and protect data privacy. FL model training may span hundreds of devices and is thus resource- and energy-intensive. We design EcoLearn, which minimizes FL's carbon footprint without significantly affecting model accuracy or training time.
arXiv Detail & Related papers (2023-10-27T08:37:10Z) - Real-time high-resolution CO$_2$ geological storage prediction using nested Fourier neural operators [58.728312684306545]
Carbon capture and storage (CCS) plays an essential role in global decarbonization.
Scaling up CCS deployment requires accurate and high-resolution modeling of the storage reservoir pressure buildup and the gaseous plume migration.
We introduce the Nested Fourier Neural Operator (FNO), a machine-learning framework for high-resolution dynamic 3D CO$_2$ storage modeling at a basin scale.
arXiv Detail & Related papers (2022-10-31T04:04:03Z) - Measuring the Carbon Intensity of AI in Cloud Instances [91.28501520271972]
We provide a framework for measuring software carbon intensity, and propose to measure operational carbon emissions using location-based and time-specific marginal emissions data per energy unit; a minimal accounting sketch follows at the end of this list.
We evaluate a suite of approaches for reducing emissions on the Microsoft Azure cloud compute platform.
arXiv Detail & Related papers (2022-06-10T17:04:04Z)
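As a companion to the last entry, the sketch below illustrates the basic operational-emissions accounting it describes: per-interval energy multiplied by a location- and time-specific carbon intensity. The function name and the hourly data format are illustrative assumptions, not the paper's tooling.
```python
# Sketch of operational-emissions accounting: sum over intervals of
# measured energy (kWh) times the grid carbon intensity (gCO2eq/kWh)
# for the instance's location at that time.
def operational_emissions_gco2(energy_kwh_per_interval, ci_g_per_kwh_per_interval):
    """Total operational emissions in gCO2eq over matched per-interval series."""
    assert len(energy_kwh_per_interval) == len(ci_g_per_kwh_per_interval)
    return sum(e * ci for e, ci in zip(energy_kwh_per_interval,
                                       ci_g_per_kwh_per_interval))

# Toy usage: a 3-hour training job on one cloud instance.
energy = [1.2, 1.1, 0.9]      # kWh drawn in each hour
ci = [320.0, 210.0, 180.0]    # regional grid CI in each hour (gCO2eq/kWh)
print(f"{operational_emissions_gco2(energy, ci):.0f} gCO2eq")
```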