Green Multi-Objective Scheduling -- A memetic NSGA-III for flexible production with real-time energy cost and emissions
- URL: http://arxiv.org/abs/2405.14339v1
- Date: Thu, 23 May 2024 09:11:21 GMT
- Title: Green Multi-Objective Scheduling -- A memetic NSGA-III for flexible production with real-time energy cost and emissions
- Authors: Sascha C. Burmeister
- Abstract summary: This study focuses on industries adjusting production to real-time energy markets, offering flexible consumption to the grid.
We present a novel memetic NSGA-III to minimize makespan, energy cost, and emissions, integrating real energy market data.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The use of renewable energies strengthens decarbonization strategies. To integrate volatile renewable sources, energy systems require grid expansion, storage capabilities, or flexible consumption. This study focuses on industries adjusting production to real-time energy markets, offering flexible consumption to the grid. Flexible production considers not only traditional goals like minimizing production time but also minimizing energy costs and emissions, thereby enhancing the sustainability of businesses. However, existing research focuses on single goals, neglects the combination of makespan, energy costs and emissions, or assumes constant or periodic tariffs instead of a dynamic energy market. We present a novel memetic NSGA-III to minimize makespan, energy cost, and emissions, integrating real energy market data, and allowing manufacturers to adapt consumption to current grid conditions. Evaluating it with benchmark instances from literature and real energy market data, we explore the trade-offs between objectives, showcasing potential savings in energy costs and emissions on estimated Pareto fronts.
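To make the three objectives concrete, the sketch below scores a candidate schedule on makespan, energy cost, and emissions under a time-varying tariff. It is a minimal illustration under assumed conventions: the `Operation` fields, the 15-minute slot resolution, and the toy price and emission-factor series are hypothetical and are not taken from the paper's model.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Operation:
    """Hypothetical scheduled operation: start/end in 15-minute slots, power in kW."""
    start_slot: int
    end_slot: int        # exclusive
    power_kw: float

def evaluate_schedule(
    ops: List[Operation],
    price_eur_per_kwh: List[float],    # real-time market price per slot (assumed input)
    emission_kg_per_kwh: List[float],  # grid emission factor per slot (assumed input)
) -> Tuple[int, float, float]:
    """Return (makespan_in_slots, energy_cost_eur, emissions_kg) for one schedule."""
    makespan = max(op.end_slot for op in ops)
    cost = emissions = 0.0
    for op in ops:
        for slot in range(op.start_slot, op.end_slot):
            energy_kwh = op.power_kw * 0.25  # 15 minutes = 0.25 h
            cost += energy_kwh * price_eur_per_kwh[slot]
            emissions += energy_kwh * emission_kg_per_kwh[slot]
    return makespan, cost, emissions

# Toy example: two operations scored against an eight-slot price/emission profile.
ops = [Operation(0, 4, 10.0), Operation(4, 8, 5.0)]
prices = [0.30, 0.30, 0.10, 0.10, 0.05, 0.05, 0.20, 0.20]
factors = [0.40, 0.40, 0.25, 0.25, 0.15, 0.15, 0.35, 0.35]
print(evaluate_schedule(ops, prices, factors))
```

In an NSGA-III setting, objective vectors like this are what non-dominated sorting and reference-point niching rank; a memetic local search would then try to shift operations into cheaper or cleaner slots without worsening the makespan.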
Related papers
- Predicting Short Term Energy Demand in Smart Grid: A Deep Learning Approach for Integrating Renewable Energy Sources in Line with SDGs 7, 9, and 13 [0.0]
We propose a deep learning model for predicting energy demand in a smart power grid.
We use long short-term memory networks to capture complex patterns and dependencies in energy demand data; a minimal forecaster in this style is sketched after this entry.
The proposed model can accurately predict energy demand with a mean absolute error of 1.4%.
arXiv Detail & Related papers (2023-04-08T12:30:59Z)
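The entry above describes an LSTM demand forecaster; the sketch below is a minimal PyTorch version of that idea. The layer sizes, 24-step input window, and single-step output are illustrative assumptions, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class DemandLSTM(nn.Module):
    """Toy LSTM forecaster: a window of past demand values -> next-step demand."""
    def __init__(self, input_size: int = 1, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_size)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # use the last hidden state to predict the next value

model = DemandLSTM()
window = torch.randn(8, 24, 1)  # 8 samples, 24 past time steps, 1 feature
print(model(window).shape)      # torch.Size([8, 1])
```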
- Proximal Policy Optimization Based Reinforcement Learning for Joint Bidding in Energy and Frequency Regulation Markets [6.175137568373435]
Energy arbitrage can be a significant source of revenue for the battery energy storage system (BESS).
It is crucial for the BESS to carefully decide how much capacity to assign to each market to maximize the total profit under uncertain market conditions.
This paper formulates the bidding problem of the BESS as a Markov Decision Process, which enables the BESS to participate in both the spot market and the FCAS market to maximize profit; a toy version of such an MDP is sketched after this entry.
arXiv Detail & Related papers (2022-12-13T13:07:31Z)
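The bidding problem above is cast as a Markov Decision Process; the toy environment below captures only its skeleton, where each step splits the usable battery energy between the spot and FCAS markets. The state variables, random price model, and linear reward are assumptions made for illustration and do not reproduce the paper's formulation.

```python
import random

class ToyBiddingEnv:
    """Toy MDP: split usable BESS energy between the spot and FCAS markets each step."""

    def __init__(self, capacity_mwh: float = 2.0):
        self.capacity_mwh = capacity_mwh
        self.soc = 0.5  # state of charge, as a fraction of capacity

    def observe(self):
        # Uncertain market conditions modelled as random prices (purely illustrative).
        spot_price = random.uniform(20.0, 120.0)  # $/MWh
        fcas_price = random.uniform(5.0, 40.0)    # $/MW
        return self.soc, spot_price, fcas_price

    def step(self, state, spot_fraction: float):
        """Action: fraction of usable energy offered on the spot market;
        the rest is held back as FCAS capacity. Returns (next_state, reward)."""
        soc, spot_price, fcas_price = state
        usable_mwh = soc * self.capacity_mwh
        spot_mwh = spot_fraction * usable_mwh
        fcas_mw = (1.0 - spot_fraction) * usable_mwh  # crude capacity proxy
        reward = spot_mwh * spot_price + fcas_mw * fcas_price
        self.soc = soc * (1.0 - spot_fraction)        # energy sold on spot is discharged
        return self.observe(), reward

env = ToyBiddingEnv()
state = env.observe()
state, reward = env.step(state, 0.3)  # offer 30% of usable energy to the spot market
print(round(reward, 2))
```

A PPO agent, as in the paper, would learn the `spot_fraction` policy from such interactions rather than applying a fixed rule.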
- Distributed Energy Management and Demand Response in Smart Grids: A Multi-Agent Deep Reinforcement Learning Framework [53.97223237572147]
This paper presents a multi-agent Deep Reinforcement Learning (DRL) framework for autonomous control and integration of renewable energy resources into smart power grid systems.
In particular, the proposed framework jointly considers demand response (DR) and distributed energy management (DEM) for residential end-users.
arXiv Detail & Related papers (2022-11-29T01:18:58Z)
- Battery and Hydrogen Energy Storage Control in a Smart Energy Network with Flexible Energy Demand using Deep Reinforcement Learning [2.5666730153464465]
We introduce a hybrid energy storage system composed of battery and hydrogen energy storage.
We propose a deep reinforcement learning-based control strategy to optimise the scheduling of the hybrid energy storage system and energy demand in real time; a toy model of the hybrid store is sketched after this entry.
arXiv Detail & Related papers (2022-08-26T16:47:48Z)
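The hybrid store mentioned above combines a battery with hydrogen storage; the toy model below captures only the capacity and efficiency bookkeeping that a DRL controller would act on. All efficiency and capacity numbers are made up, and the battery-first dispatch rule is a placeholder for the learned policy.

```python
class HybridStorage:
    """Toy battery + hydrogen store; all parameters are illustrative assumptions."""

    def __init__(self):
        self.battery_kwh = 0.0
        self.hydrogen_kwh = 0.0          # stored hydrogen in kWh-equivalent
        self.battery_cap = 100.0
        self.hydrogen_cap = 1000.0
        self.battery_eff = 0.92          # one-way charging efficiency
        self.electrolyser_eff = 0.65     # power-to-hydrogen efficiency
        self.fuel_cell_eff = 0.55        # hydrogen-to-power efficiency

    def charge(self, surplus_kwh: float) -> None:
        """Store a surplus: fill the more efficient battery first, then hydrogen."""
        to_battery = min(surplus_kwh, (self.battery_cap - self.battery_kwh) / self.battery_eff)
        self.battery_kwh += to_battery * self.battery_eff
        leftover = surplus_kwh - to_battery
        self.hydrogen_kwh = min(self.hydrogen_cap,
                                self.hydrogen_kwh + leftover * self.electrolyser_eff)

    def discharge(self, demand_kwh: float) -> float:
        """Serve demand from the battery first, then the fuel cell; return energy delivered."""
        from_battery = min(demand_kwh, self.battery_kwh)
        self.battery_kwh -= from_battery
        remaining = demand_kwh - from_battery
        from_h2 = min(remaining, self.hydrogen_kwh * self.fuel_cell_eff)
        self.hydrogen_kwh -= from_h2 / self.fuel_cell_eff
        return from_battery + from_h2

store = HybridStorage()
store.charge(150.0)           # e.g. a midday solar surplus
print(store.discharge(60.0))  # evening demand served from storage
```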
- Sustainability using Renewable Electricity (SuRE) towards NetZero Emissions [0.0]
Growth in energy demand poses a serious threat to the environment.
Most of the energy sources are non-renewable and based on fossil fuels, which leads to emission of harmful greenhouse gases.
We present a scalable AI-based solution that can be used by organizations to increase their overall renewable electricity share in total energy consumption.
arXiv Detail & Related papers (2022-02-26T10:04:26Z)
- Modelling the transition to a low-carbon energy supply [91.3755431537592]
A transition to a low-carbon electricity supply is crucial to limit the impacts of climate change.
Reducing carbon emissions could help prevent the world from reaching a tipping point, where runaway emissions are likely.
Runaway emissions could lead to extremes in weather conditions around the world.
arXiv Detail & Related papers (2021-09-25T12:37:05Z)
- ECO: Enabling Energy-Neutral IoT Devices through Runtime Allocation of Harvested Energy [0.8774604259603302]
We present a runtime-based energy-allocation framework to optimize the utility of the target device under energy constraints.
The proposed framework uses an efficient iterative algorithm to compute initial energy allocations at the beginning of a day; a generic greedy allocator in this spirit is sketched after this entry.
We evaluate this framework using solar and motion energy harvesting modalities and American Time Use Survey data from 4772 different users.
arXiv Detail & Related papers (2021-02-26T17:21:25Z)
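The ECO entry above refers to an iterative algorithm that spreads a day's harvested-energy budget across time slots. The greedy marginal-utility allocator below is a generic stand-in for that idea: the log-shaped utility, per-hour weights, and budget are invented for illustration and are not the algorithm from that paper.

```python
import math

def allocate_energy(budget_mj: float, n_slots: int = 24, step_mj: float = 0.1):
    """Greedy allocation: repeatedly give the next energy increment to the slot
    with the highest marginal utility (diminishing-returns log utility)."""
    alloc = [0.0] * n_slots
    # Hypothetical per-slot weights, e.g. expected device activity by hour of day.
    weights = [1.0 + 0.5 * math.sin(2 * math.pi * h / n_slots) for h in range(n_slots)]

    def marginal_utility(h: int) -> float:
        return weights[h] / (1.0 + alloc[h])  # derivative of w * log(1 + x)

    remaining = budget_mj
    while remaining >= step_mj:
        best = max(range(n_slots), key=marginal_utility)
        alloc[best] += step_mj
        remaining -= step_mj
    return alloc

allocation = allocate_energy(budget_mj=10.0)
print([round(a, 1) for a in allocation])
```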
- A Multi-Agent Deep Reinforcement Learning Approach for a Distributed Energy Marketplace in Smart Grids [58.666456917115056]
This paper presents a Reinforcement Learning-based energy market for a prosumer-dominated microgrid.
The proposed market model facilitates a real-time and demand-dependent dynamic pricing environment, which reduces grid costs and improves the economic benefits for prosumers.
arXiv Detail & Related papers (2020-09-23T02:17:51Z)
- A Deep Reinforcement Learning Framework for Continuous Intraday Market Bidding [69.37299910149981]
A key component for the successful integration of renewable energy sources is the use of energy storage.
We propose a novel modelling framework for the strategic participation of energy storage in the European continuous intraday market.
A distributed version of the fitted Q iteration algorithm is chosen to solve this problem due to its sample efficiency; a bare-bones fitted Q iteration loop is sketched after this entry.
Results indicate that the agent converges to a policy that achieves, on average, higher total revenues than the benchmark strategy.
arXiv Detail & Related papers (2020-04-13T13:50:13Z)
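Fitted Q iteration, referenced in the entry above, repeatedly regresses Q-values onto one-step bootstrapped targets computed from a fixed batch of transitions. The sketch below is a bare-bones, non-distributed version on a random toy batch; the random-forest regressor, the two-action setting, and the 1-D state are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fitted_q_iteration(transitions, n_actions: int, gamma: float = 0.95, iters: int = 20):
    """transitions: list of (state, action, reward, next_state) with vector-valued states."""
    states = np.array([s for s, a, r, s2 in transitions])
    actions = np.array([a for s, a, r, s2 in transitions])
    rewards = np.array([r for s, a, r, s2 in transitions])
    next_states = np.array([s2 for s, a, r, s2 in transitions])

    X = np.column_stack([states, actions])
    model = None
    for _ in range(iters):
        if model is None:
            targets = rewards  # first pass: Q is approximated by the immediate reward
        else:
            # Bootstrapped target: r + gamma * max_a' Q(s', a')
            q_next = np.column_stack([
                model.predict(np.column_stack([next_states,
                                               np.full(len(next_states), a)]))
                for a in range(n_actions)
            ])
            targets = rewards + gamma * q_next.max(axis=1)
        model = RandomForestRegressor(n_estimators=50).fit(X, targets)
    return model

# Toy batch: 1-D state, 2 actions, random rewards (purely illustrative).
rng = np.random.default_rng(0)
batch = [(rng.normal(size=1), int(rng.integers(2)), rng.normal(), rng.normal(size=1))
         for _ in range(200)]
q_model = fitted_q_iteration(batch, n_actions=2)
```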
- Risk-Aware Energy Scheduling for Edge Computing with Microgrid: A Multi-Agent Deep Reinforcement Learning Approach [82.6692222294594]
We study a risk-aware energy scheduling problem for a microgrid-powered MEC network.
We derive the solution by applying a multi-agent deep reinforcement learning (MADRL)-based asynchronous advantage actor-critic (A3C) algorithm with shared neural networks.
arXiv Detail & Related papers (2020-02-21T02:14:38Z)
- Multi-Agent Meta-Reinforcement Learning for Self-Powered and Sustainable Edge Computing Systems [87.4519172058185]
An effective energy dispatch mechanism for self-powered wireless networks with edge computing capabilities is studied.
A novel multi-agent meta-reinforcement learning (MAMRL) framework is proposed to solve the formulated problem.
Experimental results show that the proposed MAMRL model can reduce non-renewable energy usage by up to 11% and energy cost by 22.4%.
arXiv Detail & Related papers (2020-02-20T04:58:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.