Accuracy is not the only Metric that matters: Estimating the Energy
Consumption of Deep Learning Models
- URL: http://arxiv.org/abs/2304.00897v1
- Date: Mon, 3 Apr 2023 11:35:10 GMT
- Title: Accuracy is not the only Metric that matters: Estimating the Energy
Consumption of Deep Learning Models
- Authors: Johannes Getzner, Bertrand Charpentier, Stephan Günnemann
- Abstract summary: We have created an energy estimation pipeline, which allows practitioners to estimate the energy needs of their models in advance, without actually running or training them.
We accomplished this by collecting high-quality energy data and building a first baseline model capable of predicting the energy consumption of DL models by accumulating their estimated layer-wise energies.
- Score: 33.45069308137142
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modern machine learning models have started to consume incredible amounts of
energy, thus incurring large carbon footprints (Strubell et al., 2019). To
address this issue, we have created an energy estimation pipeline, which
allows practitioners to estimate the energy needs of their models in advance,
without actually running or training them. We accomplished this by collecting
high-quality energy data and building a first baseline model capable of
predicting the energy consumption of DL models by accumulating their estimated
layer-wise energies.
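A minimal sketch of the layer-wise accumulation idea described in the abstract, not the authors' released pipeline: a hypothetical per-layer predictor (here a crude coefficient times the parameter count, with made-up coefficients) is applied to every leaf layer and the per-layer estimates are summed into a model-level estimate.

```python
# Minimal sketch of layer-wise energy accumulation (hypothetical predictor,
# illustrative coefficients; the paper fits its predictors on measured data).
import torch.nn as nn

# Illustrative Joules-per-parameter coefficients per layer type (made up).
LAYER_COEFFS = {nn.Conv2d: 2.0e-9, nn.Linear: 1.5e-9, nn.ReLU: 1.0e-10}

def estimate_layer_energy(layer, coeffs=LAYER_COEFFS):
    """Rough per-layer energy estimate from the layer's parameter count."""
    coeff = next((c for t, c in coeffs.items() if isinstance(layer, t)), 0.0)
    n_params = sum(p.numel() for p in layer.parameters())
    return coeff * max(n_params, 1)

def estimate_model_energy(model):
    """Accumulate estimated layer-wise energies over all leaf modules."""
    return sum(estimate_layer_energy(m) for m in model.modules()
               if not list(m.children()))

if __name__ == "__main__":
    net = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
    print(f"Estimated energy: {estimate_model_energy(net):.2e} J")
```

In practice the per-layer predictors would be regression models trained on measured energy data and fed richer layer features (input size, MACs, batch size) rather than a single coefficient.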
Related papers
- Just In Time Transformers [2.7350304370706797]
JITtrans is a novel transformer deep learning model that significantly improves energy consumption forecasting accuracy.
Our findings highlight the potential of advanced predictive technologies to revolutionize energy management and advance sustainable power systems.
arXiv Detail & Related papers (2024-10-22T10:33:00Z)
- Power Hungry Processing: Watts Driving the Cost of AI Deployment? [74.19749699665216]
Generative, multi-purpose AI systems promise a unified approach to building machine learning (ML) models into technology.
This ambition of "generality" comes at a steep cost to the environment, given the amount of energy these systems require and the amount of carbon that they emit.
We measure deployment cost as the amount of energy and carbon required to perform 1,000 inferences on representative benchmark datasets using these models.
We conclude with a discussion around the current trend of deploying multi-purpose generative ML systems, and caution that their utility should be more intentionally weighed against increased costs in terms of energy and emissions.
arXiv Detail & Related papers (2023-11-28T15:09:36Z)
- How to use model architecture and training environment to estimate the energy consumption of DL training [5.190998244098203]
This study aims to leverage the relationship between energy consumption and two relevant design decisions in deep learning training: model architecture and training environment.
We study the power consumption behavior of training and propose four new energy estimation methods.
Our results show that selecting the proper model architecture and training environment can reduce energy consumption dramatically.
arXiv Detail & Related papers (2023-07-07T12:07:59Z)
- Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy comes with an environmental cost in terms of greenhouse gas emissions, depending on the quantity used and the energy source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z)
- Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model [72.65502770895417]
We quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle.
We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption (a back-of-the-envelope version of this accounting is sketched after this list).
We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of machine learning models.
arXiv Detail & Related papers (2022-11-03T17:13:48Z)
- Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision [31.781943982148025]
We present the first large-scale energy consumption benchmark for efficient computer vision models.
A new metric is proposed to explicitly evaluate the full-cycle energy consumption under different model usage intensities.
arXiv Detail & Related papers (2021-08-30T18:22:36Z)
- EnergyVis: Interactively Tracking and Exploring Energy Consumption for ML Models [8.939420322774243]
EnergyVis is an interactive energy consumption tracker for machine learning (ML) models.
It enables researchers to interactively track, visualize and compare model energy consumption across key energy consumption and carbon footprint metrics.
EnergyVis aims to raise awareness concerning computational sustainability by interactively highlighting excessive energy usage during model training.
arXiv Detail & Related papers (2021-03-30T15:33:43Z)
- Towards Accurate and Reliable Energy Measurement of NLP Models [20.289537200662306]
We show that existing software-based energy measurements are not accurate because they do not take into account hardware differences and how resource utilization affects energy consumption.
We quantify the error of existing software-based energy measurements by using a hardware power meter that provides highly accurate energy measurements.
Our key takeaway is the need for a more accurate energy estimation model that takes into account hardware variabilities and the non-linear relationship between resource utilization and energy consumption.
arXiv Detail & Related papers (2020-10-11T13:44:52Z)
- Multi-Agent Meta-Reinforcement Learning for Self-Powered and Sustainable Edge Computing Systems [87.4519172058185]
An effective energy dispatch mechanism for self-powered wireless networks with edge computing capabilities is studied.
A novel multi-agent meta-reinforcement learning (MAMRL) framework is proposed to solve the formulated problem.
Experimental results show that the proposed MAMRL model can reduce non-renewable energy usage by up to 11% and energy cost by 22.4%.
arXiv Detail & Related papers (2020-02-20T04:58:07Z)
- NeurOpt: Neural network based optimization for building energy management and climate control [58.06411999767069]
We propose a data-driven control algorithm based on neural networks to reduce this cost of model identification.
We validate our learning and control algorithms on a two-story building with ten independently controlled zones, located in Italy.
arXiv Detail & Related papers (2020-01-22T00:51:03Z)
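The 24.7-tonne figure quoted in the BLOOM entry above comes from the standard dynamic-power accounting: energy drawn during training multiplied by the carbon intensity of the local grid. A back-of-the-envelope sketch with illustrative values only (the paper's full estimate also accounts for infrastructure overhead and, in the life-cycle view, embodied and idle emissions):

```python
# Back-of-the-envelope dynamic-power carbon accounting (illustrative inputs,
# not the paper's exact numbers): emissions = energy * grid carbon intensity.
energy_kwh = 433_000        # assumed training energy draw, in kWh
grid_intensity = 0.057      # assumed grid intensity, in kgCO2eq per kWh

emissions_tonnes = energy_kwh * grid_intensity / 1000  # kg -> tonnes
print(f"~{emissions_tonnes:.1f} t CO2eq from dynamic power alone")
# With these assumed inputs the result lands near the ~24.7 t quoted above.
```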