RARE: Renewable Energy Aware Resource Management in Datacenters
- URL: http://arxiv.org/abs/2211.05346v1
- Date: Thu, 10 Nov 2022 05:17:14 GMT
- Title: RARE: Renewable Energy Aware Resource Management in Datacenters
- Authors: Vanamala Venkataswamy, Jake Grigsby, Andrew Grimshaw, Yanjun Qi
- Abstract summary: Hyperscale cloud providers have announced plans to power their datacenters using renewable energy.
Integrating renewables to power the datacenters is challenging because the power generation is intermittent.
We present a scheduler that learns effective job scheduling policies while continually adapting to the intermittent power supply from renewables.
- Score: 9.488752723308954
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The exponential growth in demand for digital services drives massive
datacenter energy consumption and negative environmental impacts. Promoting
sustainable solutions to pressing energy and digital infrastructure challenges
is crucial. Several hyperscale cloud providers have announced plans to power
their datacenters using renewable energy. However, integrating renewables to
power the datacenters is challenging because the power generation is
intermittent, necessitating approaches to tackle power supply variability.
Hand-engineering domain-specific heuristics-based schedulers to meet specific
objective functions in such complex dynamic green datacenter environments is
time-consuming, expensive, and requires extensive tuning by domain experts. The
green datacenters need smart systems and system software to employ multiple
renewable energy sources (wind and solar) by intelligently adapting computing
to renewable energy generation. We present RARE (Renewable energy Aware
REsource management), a Deep Reinforcement Learning (DRL) job scheduler that
automatically learns effective job scheduling policies while continually
adapting to datacenters' complex dynamic environment. The resulting DRL
scheduler performs better than heuristic scheduling policies with different
workloads and adapts to the intermittent power supply from renewables. We
demonstrate DRL scheduler system design parameters that, when tuned correctly,
produce better performance. Finally, we demonstrate that the DRL scheduler can
learn from and improve upon existing heuristic policies using Offline Learning.
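As a rough sketch of the interaction loop the abstract implies (not the authors' implementation; the environment, job model, reward shaping, and all numbers below are invented for illustration):

```python
import random

# Illustrative sketch only: a DRL job scheduler, as described above,
# observes the job queue plus the current renewable supply and picks a
# job to run (or waits). Everything here is an assumption, not RARE's code.

class GreenDatacenterEnv:
    def __init__(self, jobs, power_trace):
        self.jobs = list(jobs)          # (cores_needed, value) per job
        self.power_trace = power_trace  # renewable watts available per step
        self.t = 0

    def observe(self):
        watts = self.power_trace[self.t % len(self.power_trace)]
        return watts, self.jobs[:5]     # truncated view of the queue

    def step(self, action):
        watts, queue = self.observe()
        self.t += 1
        if action is None or action >= len(queue):
            return 0.0                  # agent chose to wait
        cores, value = queue[action]
        if cores * 10 <= watts:         # assume 10 W per core
            self.jobs.remove(queue[action])
            return value                # reward: value of the scheduled job
        return -1.0                     # penalty: exceeds the green supply

def policy(observation):
    # Placeholder for a learned policy (RARE trains this with DRL).
    _, queue = observation
    return random.randrange(len(queue)) if queue else None

env = GreenDatacenterEnv(jobs=[(2, 1.0), (8, 3.0), (4, 2.0)],
                         power_trace=[120, 60, 20, 80])
total = sum(env.step(policy(env.observe())) for _ in range(4))
print(f"episode reward: {total:.1f}")
```

A trained policy would replace the random placeholder; per the abstract, the paper additionally bootstraps the policy from existing heuristic schedulers via Offline Learning.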
Related papers
- D5RL: Diverse Datasets for Data-Driven Deep Reinforcement Learning [99.33607114541861]
We propose a new benchmark for offline RL that focuses on realistic simulations of robotic manipulation and locomotion environments.
Our proposed benchmark covers state-based and image-based domains, and supports both offline RL and online fine-tuning evaluation.
arXiv Detail & Related papers (2024-08-15T22:27:00Z)
- Spatio-temporal load shifting for truly clean computing [0.5857582826810999]
We study the impact of shifting computing jobs and associated power loads both in time and between locations.
We isolate three signals relevant for informed use of load flexibility.
The costs of 24/7 CFE are reduced by 1.29±0.07 EUR/MWh for every additional percentage point of flexible load.
arXiv Detail & Related papers (2024-03-26T13:36:42Z)
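Taken at face value, the headline number above supports a back-of-the-envelope estimate (illustrative only, and assuming the effect stays roughly linear over the range in question):

```python
# Rough reading of the load-shifting result above: each percentage point
# of flexible load lowers the cost of 24/7 carbon-free energy by about
# 1.29 +/- 0.07 EUR/MWh. The 10% figure below is a hypothetical input.
saving_per_pct = 1.29            # EUR/MWh per % flexible load (paper's figure)
flexible_load_pct = 10           # hypothetical: 10% of load is shiftable
print(f"estimated saving: {saving_per_pct * flexible_load_pct:.1f} EUR/MWh")
# -> estimated saving: 12.9 EUR/MWh
```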
- Sustainable AIGC Workload Scheduling of Geo-Distributed Data Centers: A Multi-Agent Reinforcement Learning Approach [48.18355658448509]
Recent breakthroughs in generative artificial intelligence have triggered a surge in demand for machine learning training, which poses significant cost burdens and environmental challenges due to its substantial energy consumption.
Scheduling training jobs among geographically distributed cloud data centers unveils the opportunity to optimize the usage of computing capacity powered by inexpensive and low-carbon energy.
We propose an algorithm based on multi-agent reinforcement learning and actor-critic methods to learn the optimal collaborative scheduling strategy through interacting with a cloud system built with real-life workload patterns, energy prices, and carbon intensities.
arXiv Detail & Related papers (2023-04-17T02:12:30Z)
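The paper above learns a collaborative multi-agent policy; the opportunity it exploits can be sketched far more simply as a greedy price- and carbon-aware placement rule that such a policy would aim to beat. Region names, capacities, intensities, prices, and the scoring weight below are all invented:

```python
# Toy baseline for geo-distributed, carbon-aware job placement: route each
# job to the region with the cheapest low-carbon energy right now, subject
# to capacity. Illustration only, not the paper's MARL method.

regions = {
    # region: [free capacity in jobs, carbon gCO2/kWh, price EUR/MWh]
    "us-west":  [1, 120, 45.0],
    "eu-north": [2,  30, 60.0],
    "ap-east":  [4, 500, 38.0],
}

def place(job, carbon_weight=0.1):
    # Score = price + weighted carbon; lower is better. The weight trades
    # off cost against emissions and would be tuned in practice.
    candidates = [(price + carbon_weight * carbon, name)
                  for name, (cap, carbon, price) in regions.items() if cap > 0]
    if not candidates:
        return None                      # every region is full
    _, best = min(candidates)
    regions[best][0] -= 1                # consume one slot of capacity
    return best

for job in ["train-llm-1", "train-llm-2", "finetune-3"]:
    print(job, "->", place(job))
```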
- Distributed Energy Management and Demand Response in Smart Grids: A Multi-Agent Deep Reinforcement Learning Framework [53.97223237572147]
This paper presents a multi-agent Deep Reinforcement Learning (DRL) framework for autonomous control and integration of renewable energy resources into smart power grid systems.
In particular, the proposed framework jointly considers demand response (DR) and distributed energy management (DEM) for residential end-users.
arXiv Detail & Related papers (2022-11-29T01:18:58Z)
- Job Scheduling in Datacenters using Constraint Controlled RL [0.0]
We apply Proportional-Integral-Derivative (PID) Lagrangian methods in Deep Reinforcement Learning to the job scheduling problem in the green datacenter environment.
Experiments demonstrate improved performance compared to scheduling policies without the PID Lagrangian methods.
arXiv Detail & Related papers (2022-11-10T04:43:14Z)
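PID Lagrangian methods treat the Lagrange multiplier of a constrained RL problem as the output of a PID controller driven by the constraint violation. A minimal sketch of that update, with invented gains and a hypothetical "renewable power budget" constraint signal:

```python
# Sketch of the PID Lagrangian idea referenced above: the policy maximizes
# reward - lambda * constraint_cost, and lambda itself is set by a PID
# controller on the measured violation. Gains and signals are illustrative.

class PIDLagrangian:
    def __init__(self, kp=0.5, ki=0.1, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, cost, limit):
        error = cost - limit                     # > 0 when constraint violated
        self.integral = max(0.0, self.integral + error)
        derivative = error - self.prev_error
        self.prev_error = error
        # The Lagrange multiplier must stay non-negative.
        return max(0.0, self.kp * error + self.ki * self.integral
                   + self.kd * derivative)

pid = PIDLagrangian()
for cost in [1.4, 1.2, 1.1, 0.9, 0.8]:           # power overdraw per epoch
    lam = pid.update(cost, limit=1.0)
    print(f"constraint cost {cost:.1f} -> lambda {lam:.2f}")
    # each training epoch would then optimize: reward - lam * cost
```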
- HUNTER: AI based Holistic Resource Management for Sustainable Cloud Computing [26.48962351761643]
We propose an artificial intelligence (AI) based holistic resource management technique for sustainable cloud computing called HUNTER.
The proposed model formulates the goal of optimizing energy efficiency in data centers as a multi-objective scheduling problem.
Experiments on simulated and physical cloud environments show that HUNTER outperforms state-of-the-art baselines in terms of energy consumption, SLA violation, scheduling time, cost, and temperature by up to 12%, 35%, 43%, 54%, and 3%, respectively.
arXiv Detail & Related papers (2021-10-11T18:11:26Z)
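One standard way to make a multi-objective scheduling formulation like HUNTER's concrete is weighted scalarization. The toy snippet below illustrates only that idea, with invented hosts, objectives, and weights; the paper's actual method is more sophisticated:

```python
# Weighted-scalarization toy example for multi-objective scheduling:
# combine normalized objectives into one score and pick the best host.

def score(host, weights=(0.4, 0.3, 0.3)):
    energy, sla_risk, temperature = host   # in [0, 1], lower is better
    w_e, w_s, w_t = weights
    return w_e * energy + w_s * sla_risk + w_t * temperature

hosts = {"h1": (0.8, 0.1, 0.5), "h2": (0.3, 0.4, 0.2), "h3": (0.5, 0.2, 0.9)}
best = min(hosts, key=lambda h: score(hosts[h]))
print("schedule next workload on:", best)  # -> h2
```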
- Deep Reinforcement Learning Based Multidimensional Resource Management for Energy Harvesting Cognitive NOMA Communications [64.1076645382049]
The combination of energy harvesting (EH), cognitive radio (CR), and non-orthogonal multiple access (NOMA) is a promising solution to improve energy efficiency.
In this paper, we study the spectrum, energy, and time resource management for deterministic-CR-NOMA IoT systems.
arXiv Detail & Related papers (2021-09-17T08:55:48Z)
- Power Modeling for Effective Datacenter Planning and Compute Management [53.41102502425513]
We discuss two classes of statistical power models designed and validated to be accurate, simple, interpretable and applicable to all hardware configurations and workloads.
We demonstrate that the proposed statistical modeling techniques, while simple and scalable, predict power with less than 5% Mean Absolute Percent Error (MAPE) for more than 95% of diverse Power Distribution Units (more than 2000) using only 4 features.
arXiv Detail & Related papers (2021-03-22T21:22:51Z)
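As a stand-in for the kind of simple statistical power model the entry above refers to (an assumption for illustration, not the paper's model), an ordinary-least-squares fit on four synthetic utilization features, scored by MAPE:

```python
import numpy as np

# Fit a simple linear power model on four illustrative features
# (CPU util, memory util, fan speed, ambient temp) and evaluate MAPE,
# the metric quoted above. Data below is synthetic.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0, 1, size=(n, 4))
true_w = np.array([250.0, 60.0, 40.0, 15.0])
power = 120.0 + X @ true_w + rng.normal(0, 5, n)   # watts, with noise

A = np.hstack([np.ones((n, 1)), X])                # add intercept column
coef, *_ = np.linalg.lstsq(A, power, rcond=None)   # least-squares fit
pred = A @ coef

mape = np.mean(np.abs((power - pred) / power)) * 100
print(f"MAPE: {mape:.2f}%")   # well under 5% on this synthetic data
```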
- A robust modeling framework for energy analysis of data centers [0.0]
Data centers are energy-intensive with significant and growing electricity demand.
Current models fail to provide consistent and high dimensional energy analysis for data centers.
This research aims to provide policy makers and data center energy analysts with comprehensive understanding of data center energy use and efficiency opportunities.
arXiv Detail & Related papers (2020-06-11T21:05:20Z)
- Artificial Intelligence (AI)-Centric Management of Resources in Modern Distributed Computing Systems [22.550075095184514]
Cloud Data Centres (DCs) are large-scale, complex, heterogeneous, and distributed across multiple networks and geographical boundaries.
The Internet of Things (IoT)-driven applications are producing a huge amount of data that requires real-time processing and fast response.
Existing Resource Management Systems (RMS) rely on either static or heuristic solutions that are inadequate for such composite and dynamic systems.
arXiv Detail & Related papers (2020-06-09T06:54:07Z)
- Multi-Agent Meta-Reinforcement Learning for Self-Powered and Sustainable Edge Computing Systems [87.4519172058185]
An effective energy dispatch mechanism for self-powered wireless networks with edge computing capabilities is studied.
A novel multi-agent meta-reinforcement learning (MAMRL) framework is proposed to solve the formulated problem.
Experimental results show that the proposed MAMRL model can reduce non-renewable energy usage by up to 11% and the energy cost by 22.4%.
arXiv Detail & Related papers (2020-02-20T04:58:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.