Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models
- URL: http://arxiv.org/abs/2007.03051v1
- Date: Mon, 6 Jul 2020 20:24:31 GMT
- Title: Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models
- Authors: Lasse F. Wolff Anthony, Benjamin Kanding, Raghavendra Selvan
- Abstract summary: Machine learning (ML) may become a significant contributor to climate change if this exponential trend continues.
We propose that the energy and carbon footprint of model development and training be reported alongside performance metrics, using tools like Carbontracker.
- Score: 0.3441021278275805
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning (DL) can achieve impressive results across a wide variety of
tasks, but this often comes at the cost of training models for extensive
periods on specialized hardware accelerators. This energy-intensive workload
has seen immense growth in recent years. Machine learning (ML) may become a
significant contributor to climate change if this exponential trend continues.
If practitioners are aware of their energy and carbon footprint, then they may
actively take steps to reduce it whenever possible. In this work, we present
Carbontracker, a tool for tracking and predicting the energy and carbon
footprint of training DL models. We propose that the energy and carbon footprint of
model development and training be reported alongside performance metrics, using
tools like Carbontracker. We hope this will promote responsible computing in ML
and encourage research into energy-efficient deep neural networks.
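As described above, Carbontracker measures energy use during the first epochs of training and extrapolates to predict the footprint of the full run. The prediction step can be sketched in a few lines (illustrative only: the real tool reads hardware power sensors and live carbon-intensity data, whereas the power draw and intensity values below are assumptions):

```python
# Illustrative sketch of Carbontracker-style footprint prediction.
# The epoch energy and carbon intensity used here are placeholder
# assumptions, not measurements produced by the actual tool.

def predict_footprint(epoch_energy_kwh, total_epochs, intensity_g_per_kwh):
    """Extrapolate total energy and CO2e from one measured epoch."""
    total_energy_kwh = epoch_energy_kwh * total_epochs
    co2e_g = total_energy_kwh * intensity_g_per_kwh
    return total_energy_kwh, co2e_g

# Example: one epoch drew 250 W for 30 minutes -> 0.125 kWh.
epoch_energy = 0.250 * 0.5  # kW * hours
energy, co2e = predict_footprint(epoch_energy, total_epochs=100,
                                 intensity_g_per_kwh=300.0)
print(f"predicted energy: {energy:.1f} kWh, footprint: {co2e / 1000:.2f} kg CO2e")
# -> predicted energy: 12.5 kWh, footprint: 3.75 kg CO2e
```

Reporting such a prediction alongside accuracy metrics is the practice the paper advocates.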
Related papers
- ssProp: Energy-Efficient Training for Convolutional Neural Networks with Scheduled Sparse Back Propagation [4.77407121905745]
Back-propagation (BP) is a major source of computational expense during training deep learning models.
We propose a general, energy-efficient convolution module that can be seamlessly integrated into any deep learning architecture.
arXiv Detail & Related papers (2024-08-22T17:22:59Z)
- IoTCO2: Assessing the End-To-End Carbon Footprint of Internet-of-Things-Enabled Deep Learning [6.582643137531881]
Deep learning (DL) models are increasingly deployed on Internet of Things (IoT) devices for data processing.
IoTCO2 is an end-to-end tool for precise carbon footprint estimation in IoT-enabled DL.
arXiv Detail & Related papers (2024-03-16T17:32:59Z)
- Chasing Low-Carbon Electricity for Practical and Sustainable DNN Training [4.0441558412180365]
We present a solution that reduces the carbon footprint of training without migrating or postponing jobs.
Specifically, our solution observes real-time carbon intensity shifts during training and controls the energy consumption of the GPU.
In order to proactively adapt to shifting carbon intensity, we propose a lightweight machine learning algorithm.
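The control loop this entry describes — throttling GPU power when grid carbon intensity rises — can be sketched with a fixed rule (the thresholds, power limits, and linear interpolation here are hypothetical; the paper itself proposes a learned, lightweight algorithm rather than a hard-coded policy like this):

```python
# Hypothetical sketch of carbon-intensity-aware GPU power capping.
# Thresholds and power limits are illustrative assumptions, not the
# paper's learned policy.

def choose_power_limit_w(carbon_intensity_g_per_kwh,
                         low=200.0, high=400.0,
                         max_limit_w=300.0, min_limit_w=150.0):
    """Map current grid carbon intensity to a GPU power cap.

    Cleaner grid -> allow full power; dirtier grid -> throttle,
    interpolating linearly between the two intensity thresholds.
    """
    if carbon_intensity_g_per_kwh <= low:
        return max_limit_w
    if carbon_intensity_g_per_kwh >= high:
        return min_limit_w
    frac = (carbon_intensity_g_per_kwh - low) / (high - low)
    return max_limit_w - frac * (max_limit_w - min_limit_w)

print(choose_power_limit_w(150))  # clean grid  -> 300.0 W
print(choose_power_limit_w(300))  # mid         -> 225.0 W
print(choose_power_limit_w(450))  # dirty grid  -> 150.0 W
```

In a real deployment the chosen cap would be applied via the GPU driver's power-management interface; that plumbing is omitted here.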
arXiv Detail & Related papers (2023-03-04T21:33:29Z)
- Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy comes with an environmental cost in terms of greenhouse-gas emissions, which depends on the quantity of energy used and its source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z)
- PhAST: Physics-Aware, Scalable, and Task-specific GNNs for Accelerated Catalyst Design [102.9593507372373]
Catalyst materials play a crucial role in the electrochemical reactions involved in industrial processes.
Machine learning holds the potential to efficiently model materials properties from large amounts of data.
We propose task-specific innovations applicable to most architectures, enhancing both computational efficiency and accuracy.
arXiv Detail & Related papers (2022-11-22T05:24:30Z)
- Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model [72.65502770895417]
We quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle.
We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption.
We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of machine learning models.
arXiv Detail & Related papers (2022-11-03T17:13:48Z)
- Eco2AI: carbon emissions tracking of machine learning models as the first step towards sustainable AI [47.130004596434816]
In eco2AI, we emphasize accurate energy-consumption tracking and correct regional CO2 emissions accounting.
The motivation also comes from the concept of an AI-based greenhouse-gas sequestration cycle, with both Sustainable AI and Green AI pathways.
arXiv Detail & Related papers (2022-07-31T09:34:53Z)
- Measuring the Carbon Intensity of AI in Cloud Instances [91.28501520271972]
We provide a framework for measuring software carbon intensity, and propose to measure operational carbon emissions.
We evaluate a suite of approaches for reducing emissions on the Microsoft Azure cloud compute platform.
arXiv Detail & Related papers (2022-06-10T17:04:04Z)
- Carbon Emissions and Large Neural Network Training [19.233899715628073]
We calculate the energy use and carbon footprint of several recent large models: T5, Meena, GShard, Switch Transformer, and GPT-3.
We highlight opportunities to improve energy efficiency and reduce CO2-equivalent emissions (CO2e).
To help reduce the carbon footprint of ML, we believe energy usage and CO2e should be a key metric in evaluating models.
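The accounting behind such per-model CO2e figures typically multiplies accelerator energy by the datacenter's power usage effectiveness (PUE) and the grid's carbon intensity. A minimal sketch of that arithmetic, with all numeric inputs assumed for illustration:

```python
# Operational-emissions accounting in the style used by such studies:
# CO2e = device energy * datacenter PUE * grid carbon intensity.
# All numeric inputs below are illustrative assumptions.

def operational_co2e_kg(device_energy_kwh, pue, intensity_g_per_kwh):
    """Return kilograms of CO2-equivalent for one training run."""
    facility_energy_kwh = device_energy_kwh * pue  # overhead via PUE
    return facility_energy_kwh * intensity_g_per_kwh / 1000.0  # g -> kg

# Example: 10 MWh of accelerator energy, PUE 1.25, 400 gCO2e/kWh grid.
print(operational_co2e_kg(10_000, 1.25, 400.0))  # -> 5000.0 (kg CO2e)
```

Embodied (hardware-manufacturing) emissions and idle power fall outside this operational figure, which is one reason the estimates in these papers differ in scope.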
arXiv Detail & Related papers (2021-04-21T04:44:25Z)
- Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning [68.37641996188133]
We introduce a framework for tracking real-time energy consumption and carbon emissions.
We create a leaderboard for energy efficient reinforcement learning algorithms.
We propose strategies for mitigation of carbon emissions and reduction of energy consumption.
arXiv Detail & Related papers (2020-01-31T05:12:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.