CarbonX: An Open-Source Tool for Computational Decarbonization Using Time Series Foundation Models
- URL: http://arxiv.org/abs/2510.01521v2
- Date: Fri, 10 Oct 2025 17:49:40 GMT
- Title: CarbonX: An Open-Source Tool for Computational Decarbonization Using Time Series Foundation Models
- Authors: Diptyaroop Maji, Kang Yang, Prashant Shenoy, Ramesh K Sitaraman, Mani Srivastava
- Abstract summary: CarbonX is an open-source tool for carbon intensity forecasting and imputation. It delivers a zero-shot forecasting Mean Absolute Percentage Error (MAPE) of 15.82% across 214 grids worldwide. Across 13 benchmark grids, CarbonX's performance is comparable with the current state-of-the-art.
- Score: 5.5014759378304605
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Computational decarbonization aims to reduce carbon emissions in computing and societal systems such as data centers, transportation, and built environments. This requires accurate, fine-grained carbon intensity forecasts, yet existing tools have several key limitations: (i) they require grid-specific electricity mix data, restricting use where such information is unavailable; (ii) they depend on separate grid-specific models that make it challenging to provide global coverage; and (iii) they provide forecasts without uncertainty estimates, limiting reliability for downstream carbon-aware applications. In this paper, we present CarbonX, an open-source tool that leverages Time Series Foundation Models (TSFMs) for a range of decarbonization tasks. CarbonX utilizes the versatility of TSFMs to provide strong performance across multiple tasks, such as carbon intensity forecasting and imputation, and across diverse grids. Using only historical carbon intensity data and a single general model, our tool achieves a zero-shot forecasting Mean Absolute Percentage Error (MAPE) of 15.82% across 214 grids worldwide. Across 13 benchmark grids, CarbonX performance is comparable with the current state-of-the-art, with an average MAPE of 9.59% and tail forecasting MAPE of 16.54%, while also providing prediction intervals with 95% coverage. CarbonX can provide forecasts for up to 21 days with minimal accuracy degradation. Further, when fully fine-tuned, CarbonX outperforms the statistical baselines by 1.2--3.9X on the imputation task. Overall, these results demonstrate that CarbonX can be used easily on any grid with limited data and still deliver strong performance, making it a practical tool for global-scale decarbonization.
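The abstract's two headline metrics, MAPE and 95% prediction-interval coverage, are standard and easy to reproduce. The sketch below is illustrative only (it is not CarbonX's actual API, and the carbon-intensity values are invented), showing how both metrics would be computed against a forecast:

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def interval_coverage(actual, lower, upper):
    """Fraction of observations falling inside the prediction interval."""
    actual = np.asarray(actual, dtype=float)
    return float(np.mean((actual >= np.asarray(lower, dtype=float))
                         & (actual <= np.asarray(upper, dtype=float))))

# Toy hourly carbon-intensity series (gCO2/kWh) -- illustrative values only.
actual   = [420.0, 410.0, 395.0, 430.0]
forecast = [400.0, 415.0, 405.0, 418.0]
lower    = [370.0, 385.0, 375.0, 388.0]  # hypothetical 95% interval bounds
upper    = [430.0, 445.0, 435.0, 448.0]

print(round(mape(actual, forecast), 2))          # forecast error in percent
print(interval_coverage(actual, lower, upper))   # empirical coverage
```

A "95% coverage" claim, as in the abstract, means `interval_coverage` evaluates to roughly 0.95 on held-out data; coverage well below the nominal level indicates overconfident intervals.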
Related papers
- Green Federated Learning via Carbon-Aware Client and Time Slot Scheduling [8.491852197957849]
Training large-scale machine learning models incurs substantial carbon emissions. This paper investigates how to reduce emissions in Federated Learning through carbon-aware client selection and training scheduling.
arXiv Detail & Related papers (2025-09-10T20:24:26Z) - CarboFormer: A Lightweight Semantic Segmentation Architecture for Efficient Carbon Dioxide Detection Using Optical Gas Imaging [4.567122178196833]
Carbon dioxide (CO$_2$) emissions are critical indicators of both environmental impact and industrial processes. We introduce CarboFormer, a lightweight semantic segmentation framework for Optical Gas Imaging (OGI). Our approach integrates an optimized encoder-decoder architecture with specialized multi-scale feature fusion and auxiliary supervision strategies.
arXiv Detail & Related papers (2025-05-23T18:01:42Z) - Carbon- and Precedence-Aware Scheduling for Data Processing Clusters [10.676357280358886]
We show that carbon-aware scheduling for data processing benefits from knowledge of both time-varying carbon and precedence constraints. Our schedulers enable a priority between carbon reduction and job completion time, and we give analytical results characterizing the trade-off between the two. Our Spark prototype on a 100-node cluster shows that a moderate configuration of $\texttt{PCAPS}$ reduces carbon footprint by up to 32.9% without significantly impacting the cluster's total efficiency.
arXiv Detail & Related papers (2025-02-13T19:06:10Z) - CarbonSense: A Multimodal Dataset and Baseline for Carbon Flux Modelling [9.05128569357374]
We present CarbonSense, the first machine learning-ready dataset for data-driven carbon flux modelling. Our experiments illustrate the potential gains that multimodal deep learning techniques can bring to this domain.
arXiv Detail & Related papers (2024-06-07T13:47:40Z) - Generative AI for Low-Carbon Artificial Intelligence of Things with Large Language Models [67.0243099823109]
Generative AI (GAI) holds immense potential to reduce the carbon emissions of the Artificial Intelligence of Things (AIoT).
In this article, we explore the potential of GAI for carbon emissions reduction and propose a novel GAI-enabled solution for low-carbon AIoT.
We propose a Large Language Model (LLM)-enabled carbon emission optimization framework, in which we design pluggable LLM and Retrieval Augmented Generation (RAG) modules.
arXiv Detail & Related papers (2024-04-28T05:46:28Z) - LACS: Learning-Augmented Algorithms for Carbon-Aware Resource Scaling with Uncertain Demand [1.423958951481749]
This paper studies the online carbon-aware resource scaling problem with unknown job lengths (OCSU)
We propose LACS, a theoretically robust learning-augmented algorithm that solves OCSU.
LACS achieves a 32% reduction in carbon footprint compared to the deadline-aware carbon-agnostic execution of the job.
arXiv Detail & Related papers (2024-03-29T04:54:22Z) - Machine Guided Discovery of Novel Carbon Capture Solvents [48.7576911714538]
Machine learning offers a promising method for reducing the time and resource burdens of materials development.
We have developed an end-to-end "discovery cycle" to select new aqueous amines compatible with the commercially viable acid gas scrubbing carbon capture.
The prediction process shows 60% accuracy against experiment for both material parameters and 80% for a single parameter on an external test set.
arXiv Detail & Related papers (2023-03-24T18:32:38Z) - Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model [72.65502770895417]
We quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle.
We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO$_2$eq if we consider only the dynamic power consumption.
We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of machine learning models.
arXiv Detail & Related papers (2022-11-03T17:13:48Z) - Real-time high-resolution CO$_2$ geological storage prediction using nested Fourier neural operators [58.728312684306545]
Carbon capture and storage (CCS) plays an essential role in global decarbonization.
Scaling up CCS deployment requires accurate and high-resolution modeling of the storage reservoir pressure buildup and the gaseous plume migration.
We introduce Nested Fourier Neural Operator (FNO), a machine-learning framework for high-resolution dynamic 3D CO2 storage modeling at a basin scale.
arXiv Detail & Related papers (2022-10-31T04:04:03Z) - Measuring the Carbon Intensity of AI in Cloud Instances [91.28501520271972]
We provide a framework for measuring software carbon intensity, and propose to measure operational carbon emissions.
We evaluate a suite of approaches for reducing emissions on the Microsoft Azure cloud compute platform.
arXiv Detail & Related papers (2022-06-10T17:04:04Z) - Optimizing carbon tax for decentralized electricity markets using an agent-based model [69.3939291118954]
Averting the effects of anthropogenic climate change requires a transition from fossil fuels to low-carbon technology.
Carbon taxes have been shown to be an efficient way to aid in this transition.
We use the NSGA-II genetic algorithm to minimize average electricity price and relative carbon intensity of the electricity mix.
arXiv Detail & Related papers (2020-05-28T06:54:43Z)