A Memetic NSGA-III for Green Flexible Production with Real-Time Energy Costs & Emissions
- URL: http://arxiv.org/abs/2405.14339v2
- Date: Thu, 06 Feb 2025 10:10:27 GMT
- Title: A Memetic NSGA-III for Green Flexible Production with Real-Time Energy Costs & Emissions
- Authors: Sascha C. Burmeister
- Abstract summary: This study focuses on industries that adapt production to real-time energy markets, offering flexible consumption to the grid.
We present a novel memetic NSGA-III to minimize makespan, energy cost, and emissions, integrating real energy market data.
- Abstract: The use of renewable energies strengthens decarbonization strategies. To integrate volatile renewable sources, energy systems require grid expansion, storage capabilities, or flexible consumption. This study focuses on industries that adapt production to real-time energy markets, offering flexible consumption to the grid. Flexible production considers not only traditional goals like minimizing production time, but also minimizing energy costs and emissions, thereby enhancing the sustainability of businesses. However, existing research focuses on single goals, neglects the combination of makespan, energy costs, and emissions, or assumes constant or periodic tariffs instead of a dynamic energy market. We present a novel memetic NSGA-III to minimize makespan, energy cost, and emissions, integrating real energy market data, and allowing manufacturers to adapt energy consumption to current grid conditions. Evaluating it with benchmark instances from literature and real energy market data, we explore the trade-offs between objectives, showcasing potential savings in energy costs and emissions on estimated Pareto fronts.
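To make the trade-off described in the abstract concrete, the sketch below shows one way the three objectives (makespan, energy cost, and emissions) could be evaluated for a candidate schedule against hourly market prices and grid emission factors. It is a minimal illustration only: the data, the hourly time grid, and the assumption that each operation draws constant power are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass

# Hypothetical sketch: evaluate the three objectives from the abstract
# (makespan, energy cost, emissions) for one candidate schedule.
# Prices and emission factors are given per hour; operations are assumed
# to run without interruption at constant power. None of this data or
# these modelling choices come from the paper itself.

@dataclass
class Operation:
    start: int        # start hour of the operation
    duration: int     # processing time in hours
    power_kw: float   # assumed constant power draw of the assigned machine


def evaluate(schedule, price_eur_per_kwh, emission_kg_per_kwh):
    """Return (makespan, energy cost, emissions) for a schedule."""
    makespan = max(op.start + op.duration for op in schedule)
    cost = 0.0
    emissions = 0.0
    for op in schedule:
        for hour in range(op.start, op.start + op.duration):
            energy_kwh = op.power_kw * 1.0  # one kWh per kW per hourly slot
            cost += energy_kwh * price_eur_per_kwh[hour]
            emissions += energy_kwh * emission_kg_per_kwh[hour]
    return makespan, cost, emissions


# Toy 6-hour price and emission profile (hypothetical values).
prices = [0.30, 0.25, 0.10, 0.08, 0.20, 0.35]             # EUR/kWh
emission_factors = [0.40, 0.38, 0.15, 0.12, 0.30, 0.45]    # kg CO2/kWh
schedule = [Operation(start=2, duration=2, power_kw=50.0),
            Operation(start=0, duration=1, power_kw=20.0)]
print(evaluate(schedule, prices, emission_factors))
```

Shifting load into cheap, low-emission hours lowers cost and emissions but can lengthen the makespan; the memetic NSGA-III presented in the paper searches for schedules that balance exactly this kind of trade-off along an estimated Pareto front.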
Related papers
- A Machine Learning Framework to Deconstruct the Primary Drivers for Electricity Market Price Events [0.8192907805418581]
Power grids are moving towards bulk power systems supplied entirely by renewable energy sources.
Traditional root-cause analysis and statistical approaches are no longer adequate for analyzing and inferring the main drivers behind price formation.
We propose a machine learning-based analysis framework to deconstruct the primary drivers for price spike events in modern electricity markets with high renewable energy penetration.
arXiv Detail & Related papers (2023-09-12T09:24:21Z)
- Predicting Short Term Energy Demand in Smart Grid: A Deep Learning Approach for Integrating Renewable Energy Sources in Line with SDGs 7, 9, and 13 [0.0]
We propose a deep learning model for predicting energy demand in a smart power grid.
We use long short-term memory networks to capture complex patterns and dependencies in energy demand data.
The proposed model can accurately predict energy demand with a mean absolute error of 1.4%.
arXiv Detail & Related papers (2023-04-08T12:30:59Z)
- Proximal Policy Optimization Based Reinforcement Learning for Joint Bidding in Energy and Frequency Regulation Markets [6.175137568373435]
Energy arbitrage can be a significant source of revenue for a battery energy storage system (BESS).
It is crucial for the BESS to carefully decide how much capacity to assign to each market to maximize the total profit under uncertain market conditions.
This paper formulates the bidding problem of the BESS as a Markov Decision Process, which enables the BESS to participate in both the spot market and the FCAS market to maximize profit.
arXiv Detail & Related papers (2022-12-13T13:07:31Z)
- Distributed Energy Management and Demand Response in Smart Grids: A Multi-Agent Deep Reinforcement Learning Framework [53.97223237572147]
This paper presents a multi-agent Deep Reinforcement Learning (DRL) framework for autonomous control and integration of renewable energy resources into smart power grid systems.
In particular, the proposed framework jointly considers demand response (DR) and distributed energy management (DEM) for residential end-users.
arXiv Detail & Related papers (2022-11-29T01:18:58Z)
- Battery and Hydrogen Energy Storage Control in a Smart Energy Network with Flexible Energy Demand using Deep Reinforcement Learning [2.5666730153464465]
We introduce a hybrid energy storage system composed of battery and hydrogen energy storage.
We propose a deep reinforcement learning-based control strategy to optimise the scheduling of the hybrid energy storage system and energy demand in real-time.
arXiv Detail & Related papers (2022-08-26T16:47:48Z)
- Sustainability using Renewable Electricity (SuRE) towards NetZero Emissions [0.0]
Growth in energy demand poses a serious threat to the environment.
Most energy sources are non-renewable and based on fossil fuels, which leads to the emission of harmful greenhouse gases.
We present a scalable AI-based solution that organizations can use to increase the share of renewable electricity in their total energy consumption.
arXiv Detail & Related papers (2022-02-26T10:04:26Z)
- Modelling the transition to a low-carbon energy supply [91.3755431537592]
A transition to a low-carbon electricity supply is crucial to limit the impacts of climate change.
Reducing carbon emissions could help prevent the world from reaching a tipping point, where runaway emissions are likely.
Runaway emissions could lead to extreme weather conditions around the world.
arXiv Detail & Related papers (2021-09-25T12:37:05Z)
- Exploring market power using deep reinforcement learning for intelligent bidding strategies [69.3939291118954]
We find that capacity has an impact on the average electricity price in a single year.
These values of approximately 25% and 11% may vary between market structures and countries.
We observe that a market cap of approximately double the average market price significantly reduces this effect and maintains a competitive market.
arXiv Detail & Related papers (2020-11-08T21:07:42Z)
- A Multi-Agent Deep Reinforcement Learning Approach for a Distributed Energy Marketplace in Smart Grids [58.666456917115056]
This paper presents a Reinforcement Learning-based energy market for a prosumer-dominated microgrid.
The proposed market model facilitates a real-time and demand-dependent dynamic pricing environment, which reduces grid costs and improves the economic benefits for prosumers.
arXiv Detail & Related papers (2020-09-23T02:17:51Z)
- A Deep Reinforcement Learning Framework for Continuous Intraday Market Bidding [69.37299910149981]
A key component for the successful integration of renewable energy sources is the use of energy storage.
We propose a novel modelling framework for the strategic participation of energy storage in the European continuous intraday market.
A distributed version of the fitted Q algorithm is chosen for solving this problem due to its sample efficiency.
Results indicate that the agent converges to a policy that achieves, on average, higher total revenues than the benchmark strategy.
arXiv Detail & Related papers (2020-04-13T13:50:13Z)
- Risk-Aware Energy Scheduling for Edge Computing with Microgrid: A Multi-Agent Deep Reinforcement Learning Approach [82.6692222294594]
We study a risk-aware energy scheduling problem for a microgrid-powered MEC network.
We derive the solution by applying a multi-agent deep reinforcement learning (MADRL)-based asynchronous advantage actor-critic (A3C) algorithm with shared neural networks.
arXiv Detail & Related papers (2020-02-21T02:14:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.