EnergyVis: Interactively Tracking and Exploring Energy Consumption for
ML Models
- URL: http://arxiv.org/abs/2103.16435v1
- Date: Tue, 30 Mar 2021 15:33:43 GMT
- Title: EnergyVis: Interactively Tracking and Exploring Energy Consumption for
ML Models
- Authors: Omar Shaikh, Jon Saad-Falcon, Austin P Wright, Nilaksh Das, Scott
Freitas, Omar Isaac Asensio, Duen Horng Chau
- Abstract summary: EnergyVis is an interactive energy consumption tracker for machine learning (ML) models.
It enables researchers to interactively track, visualize and compare model energy consumption across key energy consumption and carbon footprint metrics.
EnergyVis aims to raise awareness concerning computational sustainability by interactively highlighting excessive energy usage during model training.
- Score: 8.939420322774243
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The advent of larger machine learning (ML) models has improved
state-of-the-art (SOTA) performance in various modeling tasks, ranging from
computer vision to natural language. As ML models continue increasing in size,
so do their energy consumption and computational requirements.
However, the methods for tracking, reporting, and comparing energy consumption
remain limited. We present EnergyVis, an interactive energy consumption tracker
for ML models. Consisting of multiple coordinated views, EnergyVis enables
researchers to interactively track, visualize and compare model energy
consumption across key energy consumption and carbon footprint metrics (kWh and
CO2), helping users explore alternative deployment locations and hardware that
may reduce carbon footprints. EnergyVis aims to raise awareness concerning
computational sustainability by interactively highlighting excessive energy
usage during model training, and by providing alternative training options to
reduce energy usage.
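The kWh-to-CO2 comparison behind such "alternative deployment location" suggestions is, at its core, a product of energy used and the local grid's carbon intensity. Below is a minimal sketch of that idea; it is not EnergyVis's actual implementation, and the regional intensity figures are illustrative placeholders, not real data:

```python
# Minimal sketch of the kWh -> CO2 comparison idea behind EnergyVis.
# NOTE: not EnergyVis's actual code; the carbon intensities below
# (kg CO2 per kWh) are illustrative placeholders, not authoritative data.

CARBON_INTENSITY = {          # hypothetical regional grid intensities
    "region_a": 0.85,         # e.g., a coal-heavy grid
    "region_b": 0.35,         # e.g., a mixed grid
    "region_c": 0.05,         # e.g., a hydro-heavy grid
}

def co2_kg(energy_kwh: float, region: str) -> float:
    """Convert energy used (kWh) into estimated CO2 emissions (kg)."""
    return energy_kwh * CARBON_INTENSITY[region]

training_energy_kwh = 120.0   # hypothetical measured training energy

for region in CARBON_INTENSITY:
    print(f"{region}: {co2_kg(training_energy_kwh, region):.1f} kg CO2")
```

The same measured kWh figure can thus map to an order-of-magnitude difference in emissions depending on where the training job runs.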
Related papers
- Computing Within Limits: An Empirical Study of Energy Consumption in ML Training and Inference [2.553456266022126]
Machine learning (ML) has seen tremendous advancements, but its environmental footprint remains a concern.
Acknowledging the growing environmental impact of ML, this paper investigates Green ML.
arXiv Detail & Related papers (2024-06-20T13:59:34Z)
- Power Hungry Processing: Watts Driving the Cost of AI Deployment? [74.19749699665216]
Generative, multi-purpose AI systems promise a unified approach to building machine learning (ML) models into technology.
This ambition of "generality" comes at a steep cost to the environment, given the amount of energy these systems require and the amount of carbon they emit.
We measure deployment cost as the amount of energy and carbon required to perform 1,000 inferences on a representative benchmark dataset using these models.
We conclude with a discussion of the current trend of deploying multi-purpose generative ML systems, and caution that their utility should be weighed more intentionally against the increased costs in energy and emissions.
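The per-1,000-inference cost metric can be approximated on NVIDIA GPUs by reading the device's cumulative energy counter before and after a batch of inferences. A rough sketch, assuming a Volta-or-newer GPU, the pynvml bindings, and a hypothetical user-supplied `run_inference` function standing in for the model under test:

```python
# Rough sketch of the "energy per 1,000 inferences" measurement idea.
# Assumes an NVIDIA GPU exposing a cumulative energy counter (Volta or
# newer) and the pynvml bindings; run_inference() is a hypothetical
# stand-in for the model under test. GPU energy only (excludes CPU/DRAM).
import pynvml

def energy_per_1000_inferences(run_inference, n: int = 1000) -> float:
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    start_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)  # millijoules
    for _ in range(n):
        run_inference()
    end_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    pynvml.nvmlShutdown()
    joules = (end_mj - start_mj) / 1000.0
    return joules / 3.6e6 * (1000 / n)  # kWh per 1,000 inferences
```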
arXiv Detail & Related papers (2023-11-28T15:09:36Z)
- Enhancing Energy-Awareness in Deep Learning through Fine-Grained Energy Measurement [11.37120215795946]
This paper introduces FECoM (Fine-grained Energy Consumption Meter), a framework for fine-grained Deep Learning energy consumption measurement.
FECoM addresses the challenges of measuring energy consumption at a fine-grained level by using static instrumentation and considering various factors, including computational load stability and temperature.
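Fine-grained, call-level measurement of this kind can be pictured as instrumentation wrapped around individual framework calls. A simplified sketch follows; it is not FECoM's actual implementation, and the energy probe shown reads the Linux RAPL counter, which exists only on Linux x86 machines (the sysfs path varies, and the counter wraps periodically):

```python
# Simplified sketch of fine-grained, call-level energy instrumentation
# (the general idea behind FECoM, not its actual implementation).
import functools
import time

def read_energy_joules() -> float:
    # Reads the CPU package energy counter via Linux RAPL.
    # Caveats: path varies by machine; the counter wraps around,
    # which a real tool would have to handle.
    with open("/sys/class/powercap/intel-rapl:0/energy_uj") as f:
        return int(f.read()) / 1e6  # microjoules -> joules

def measure_energy(func):
    """Wrap a single API call so each invocation logs its energy draw."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        time.sleep(0.1)               # crude stand-in for waiting until load is stable
        before = read_energy_joules()
        result = func(*args, **kwargs)
        after = read_energy_joules()
        print(f"{func.__name__}: {after - before:.2f} J")
        return result
    return wrapper
```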
arXiv Detail & Related papers (2023-08-23T17:32:06Z)
- Scaling Vision-Language Models with Sparse Mixture of Experts [128.0882767889029]
We show that mixture-of-experts (MoE) techniques can achieve state-of-the-art performance on a range of benchmarks over dense models of equivalent computational cost.
Our research offers valuable insights into stabilizing the training of MoE models, understanding the impact of MoE on model interpretability, and balancing the trade-offs between compute and performance when scaling vision-language models.
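The core MoE mechanism referenced here, routing each token to a small subset of expert sub-networks via a learned gate, can be sketched in a few lines. This is a generic top-1 router for illustration, not the paper's architecture:

```python
# Generic sketch of top-1 mixture-of-experts routing (not the paper's model).
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    def __init__(self, dim: int = 64, num_experts: int = 4):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)            # learned router
        self.experts = nn.ModuleList(
            nn.Linear(dim, dim) for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:    # x: (tokens, dim)
        scores = self.gate(x).softmax(dim=-1)              # routing probabilities
        idx = scores.argmax(dim=-1)                        # top-1 expert per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = idx == e
            if mask.any():
                # scale by the gate probability so routing stays differentiable
                out[mask] = expert(x[mask]) * scores[mask, e].unsqueeze(-1)
        return out
```

Because only the selected expert runs per token, parameter count grows with the number of experts while per-token compute stays roughly constant, which is the "equivalent computational cost" comparison the abstract refers to.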
arXiv Detail & Related papers (2023-03-13T16:00:31Z)
- Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning [77.62876532784759]
Machine learning (ML) requires using energy to carry out computations during the model training process.
The generation of this energy comes with an environmental cost in terms of greenhouse gas emissions, depending on the quantity used and the energy source.
We present a survey of the carbon emissions of 95 ML models across time and different tasks in natural language processing and computer vision.
arXiv Detail & Related papers (2023-02-16T18:35:00Z)
- Energy Transformer [64.22957136952725]
Our work combines aspects of three promising paradigms in machine learning, namely the attention mechanism, energy-based models, and associative memory.
We propose a novel architecture, called the Energy Transformer (or ET for short), that uses a sequence of attention layers that are purposely designed to minimize a specifically engineered energy function.
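The defining idea, layers acting as descent steps on an explicit energy function, can be illustrated with a toy gradient-descent loop. The sketch below uses a simple associative-memory-style energy for illustration; it is not the Energy Transformer's actual energy function or architecture:

```python
# Toy illustration of "inference as energy descent" (generic sketch,
# not the Energy Transformer's actual energy function or architecture).
import torch

def energy(x: torch.Tensor, memories: torch.Tensor) -> torch.Tensor:
    # Associative-memory-style energy: low when x aligns with one of
    # the stored memory patterns.
    return -torch.logsumexp(memories @ x, dim=0)

memories = torch.randn(8, 16)        # 8 stored patterns, dimension 16
x = torch.randn(16, requires_grad=True)

for _ in range(50):                  # each "layer" is one descent step
    e = energy(x, memories)
    (grad,) = torch.autograd.grad(e, x)
    with torch.no_grad():
        x -= 0.1 * grad              # move toward a low-energy (recalled) state
```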
arXiv Detail & Related papers (2023-02-14T18:51:22Z)
- Great Power, Great Responsibility: Recommendations for Reducing Energy for Training Language Models [8.927248087602942]
We investigate techniques that can be used to reduce the energy consumption of common NLP applications.
These techniques can lead to significant reductions in energy consumption when training language models or using them for inference.
arXiv Detail & Related papers (2022-05-19T16:03:55Z)
- Adaptive Energy Management for Self-Sustainable Wearables in Mobile Health [21.97214707198675]
The small form factor of wearable devices limits battery size and operating lifetime.
Energy harvesting has emerged as an effective method towards sustainable operation of wearable devices.
This paper studies the novel problem of adaptive energy management towards the goal of self-sustainable wearables by using harvested energy to supplement the battery energy and to reduce manual recharging by users.
arXiv Detail & Related papers (2022-01-16T23:49:20Z)
- Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision [31.781943982148025]
We present the first large-scale energy consumption benchmark for efficient computer vision models.
A new metric is proposed to explicitly evaluate full-cycle energy consumption under different levels of model usage intensity.
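The intuition behind a usage-dependent full-cycle metric is that a one-off training cost is amortized against a recurring inference cost, so model rankings can flip with deployment intensity. A back-of-the-envelope sketch with illustrative numbers (not figures from the benchmark):

```python
# Back-of-the-envelope sketch of a full-cycle energy comparison.
# All numbers are illustrative, not taken from the benchmark.
def full_cycle_kwh(train_kwh: float, kwh_per_inference: float,
                   num_inferences: float) -> float:
    return train_kwh + kwh_per_inference * num_inferences

small = dict(train_kwh=50.0,  kwh_per_inference=2e-4)   # cheap to train, costlier per query
big   = dict(train_kwh=500.0, kwh_per_inference=5e-5)   # costly to train, efficient per query

for n in (1e4, 1e6, 1e8):   # light, moderate, heavy usage intensity
    print(f"n={n:.0e}: small={full_cycle_kwh(num_inferences=n, **small):.0f} kWh, "
          f"big={full_cycle_kwh(num_inferences=n, **big):.0f} kWh")
```

Under light usage the small model wins outright, while at heavy usage the big model's per-inference efficiency dominates, which is exactly why a single training-only number can mislead.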
arXiv Detail & Related papers (2021-08-30T18:22:36Z)
- Multi-Agent Meta-Reinforcement Learning for Self-Powered and Sustainable Edge Computing Systems [87.4519172058185]
An effective energy dispatch mechanism for self-powered wireless networks with edge computing capabilities is studied.
A novel multi-agent meta-reinforcement learning (MAMRL) framework is proposed to solve the formulated problem.
Experimental results show that the proposed MAMRL model can reduce non-renewable energy usage by up to 11% and energy cost by 22.4%.
arXiv Detail & Related papers (2020-02-20T04:58:07Z)
- Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning [68.37641996188133]
We introduce a framework for tracking real-time energy consumption and carbon emissions.
We create a leaderboard for energy efficient reinforcement learning algorithms.
We propose strategies for mitigation of carbon emissions and reduction of energy consumption.
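Real-time tracking of this kind typically amounts to a background thread that polls power counters and integrates power over time into energy. A generic polling sketch, not the paper's actual framework, assuming the pynvml bindings as in the earlier example and an illustrative grid carbon intensity:

```python
# Generic sketch of a background energy/carbon tracker (not the paper's
# actual framework). Polls GPU power via NVML and integrates over time.
import threading
import time
import pynvml

def track(stop: threading.Event, interval_s: float = 1.0,
          kg_co2_per_kwh: float = 0.4):   # illustrative grid intensity
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    joules = 0.0
    while not stop.is_set():
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        joules += watts * interval_s       # integrate power into energy
        time.sleep(interval_s)
    pynvml.nvmlShutdown()
    kwh = joules / 3.6e6
    print(f"{kwh:.4f} kWh, ~{kwh * kg_co2_per_kwh:.4f} kg CO2")

stop = threading.Event()
t = threading.Thread(target=track, args=(stop,), daemon=True)
t.start()
# ... training loop runs here ...
stop.set()
t.join()
```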
arXiv Detail & Related papers (2020-01-31T05:12:59Z)