Surrogate-assisted level-based learning evolutionary search for heat
extraction optimization of enhanced geothermal system
- URL: http://arxiv.org/abs/2212.07666v3
- Date: Mon, 19 Dec 2022 01:42:59 GMT
- Title: Surrogate-assisted level-based learning evolutionary search for heat
extraction optimization of enhanced geothermal system
- Authors: Guodong Chen, Xin Luo, Chuanyin Jiang, Jiu Jimmy Jiao
- Abstract summary: Enhanced geothermal systems are essential for providing sustainable, long-term geothermal energy supplies and reducing carbon emissions.
A new surrogate-assisted level-based learning evolutionary search algorithm (SLLES) is proposed for heat extraction optimization of enhanced geothermal systems.
- Score: 3.012067935276772
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Enhanced geothermal systems are essential for providing sustainable,
long-term geothermal energy supplies and reducing carbon emissions. An optimal
well-control scheme for effective heat extraction and improved heat-sweep
efficiency plays a significant role in geothermal development. However, the
performance of most existing optimization algorithms deteriorates as the
problem dimension increases. To address this issue, a novel surrogate-assisted
level-based learning evolutionary search algorithm (SLLES) is proposed for heat
extraction optimization of enhanced geothermal systems. SLLES consists of a
classifier-assisted level-based learning pre-screening part and a local
evolutionary search part. The cooperation of the two parts balances exploration
and exploitation during the optimization process. Through iterative sampling
from the design space, the robustness and effectiveness of the algorithm are
shown to improve significantly. To the best of our knowledge, the proposed
algorithm provides a state-of-the-art framework for simulation-based
optimization. Comparative experiments have been conducted on benchmark
functions, a two-dimensional fractured reservoir and a three-dimensional
enhanced geothermal system. The proposed algorithm outperforms five other
state-of-the-art surrogate-assisted algorithms on all selected benchmark
functions. The results on the two heat extraction cases also demonstrate that
SLLES achieves superior optimization performance compared with a traditional
evolutionary algorithm and other surrogate-assisted algorithms. This work lays
a solid basis for efficient heat extraction from enhanced geothermal systems
and sheds light on model-management strategies for data-driven optimization in
energy exploitation.
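As a rough illustration of the two cooperating parts described above (a classifier-assisted pre-screen over fitness levels and a surrogate-driven local search), a minimal Python sketch is given below. The function names, the choice of a k-NN classifier and Gaussian-process surrogate, and the median-based level split are assumptions made for illustration, not the published SLLES implementation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.gaussian_process import GaussianProcessRegressor

def slles_like_optimize(objective, bounds, n_init=50, generations=100, pop_size=40, rng=None):
    """Illustrative surrogate-assisted evolutionary loop (not the published SLLES).

    objective : expensive black-box function mapping a 1-D array to a scalar (minimized)
    bounds    : NumPy array of shape (dim, 2) with lower/upper limits per variable
    """
    rng = np.random.default_rng(rng)
    dim = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]

    # Archive of all truly evaluated designs (the expensive simulations).
    X = lo + (hi - lo) * rng.random((n_init, dim))
    y = np.array([objective(x) for x in X])

    for _ in range(generations):
        # --- Level-based learning pre-screen (illustrative) -----------------
        # Split the archive into "good" and "bad" levels by the median fitness
        # and train a cheap classifier to pre-screen offspring.
        levels = (y <= np.median(y)).astype(int)          # 1 = promising level
        clf = KNeighborsClassifier(n_neighbors=min(5, len(X))).fit(X, levels)

        # Generate candidate offspring by perturbing the current best designs.
        elite = X[np.argsort(y)[:pop_size]]
        cand = elite + 0.1 * (hi - lo) * rng.standard_normal(elite.shape)
        cand = np.clip(cand, lo, hi)
        promising = cand[clf.predict(cand) == 1]
        if len(promising) == 0:
            promising = cand

        # --- Local evolutionary search on a surrogate ------------------------
        # Fit a regressor on the archive and spend one true (expensive)
        # evaluation on the candidate with the best predicted objective.
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        best_cand = promising[np.argmin(gp.predict(promising))]

        X = np.vstack([X, best_cand])
        y = np.append(y, objective(best_cand))

    best = np.argmin(y)
    return X[best], y[best]
```

For example, `slles_like_optimize(lambda x: float(np.sum(x**2)), np.array([[-5.0, 5.0]] * 10))` would run the loop on a 10-dimensional sphere function; in the paper's setting the objective would instead wrap a reservoir simulation of the well-control scheme.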
Related papers
- Machine Learning-Accelerated Multi-Objective Design of Fractured Geothermal Systems [17.040963667188525]
We report an Active Learning enhanced Evolutionary Multi-objective Optimization algorithm, integrated with hydrothermal simulations in fractured media.
Results demonstrate that the ALEMO approach achieves a remarkable reduction in required simulations, running 1-2 orders of magnitude (10-100 times) faster than traditional evolutionary methods.
arXiv Detail & Related papers (2024-11-01T10:39:23Z) - Discovering Preference Optimization Algorithms with and for Large Language Models [50.843710797024805]
Offline preference optimization is a key method for enhancing and controlling the quality of Large Language Model (LLM) outputs.
We perform objective discovery to automatically discover new state-of-the-art preference optimization algorithms without (expert) human intervention.
Experiments demonstrate the state-of-the-art performance of DiscoPOP, a novel algorithm that adaptively blends logistic and exponential losses.
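For intuition, a small sketch of a preference loss that blends logistic and exponential terms on the pairwise log-ratio margin is given below; the gating weight used here is purely illustrative and is not the actual loss discovered by DiscoPOP.

```python
import numpy as np

def blended_preference_loss(margin, beta=0.05, tau=1.0):
    """Illustrative blend of logistic and exponential pairwise losses.

    margin : array of log-ratio differences between chosen and rejected
             responses (as in DPO-style objectives)
    tau    : controls how the two losses are mixed (purely illustrative)
    """
    m = beta * np.asarray(margin)
    logistic = np.log1p(np.exp(-m))      # log-sigmoid loss
    exponential = np.exp(-m)             # exponential loss
    # A sigmoid gate on the margin decides how much of each loss to use;
    # the real DiscoPOP weighting differs from this toy choice.
    w = 1.0 / (1.0 + np.exp(-m / tau))
    return np.mean(w * logistic + (1.0 - w) * exponential)
```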
arXiv Detail & Related papers (2024-06-12T16:58:41Z) - The Firefighter Algorithm: A Hybrid Metaheuristic for Optimization Problems [3.2432648012273346]
The Firefighter Optimization (FFO) algorithm is a new hybrid metaheuristic for optimization problems.
To evaluate the performance of FFO, extensive experiments were conducted, wherein the FFO was examined against 13 commonly used optimization algorithms.
The results demonstrate that FFO achieves comparable performance and, in some scenarios, outperforms commonly adopted optimization algorithms in terms of the obtained fitness, time taken for execution, and search space covered per unit of time.
arXiv Detail & Related papers (2024-06-01T18:38:59Z) - Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z) - Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z) - Machine learning based surrogate models for microchannel heat sink
optimization [0.0]
In this paper, microchannel designs with secondary channels and with ribs are investigated using computational fluid dynamics.
A workflow that combines Latin hypercube sampling, machine learning-based surrogate modeling and multi-objective optimization is proposed.
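A minimal sketch of such a workflow (Latin hypercube sampling, per-objective surrogate fitting, then cheap multi-objective search on the surrogates) is shown below; the two analytic objectives stand in for the expensive CFD evaluations, and the simple Pareto filter stands in for a full multi-objective optimizer.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor

# 1. Latin hypercube sample of the design space (placeholder: 2 design variables).
sampler = qmc.LatinHypercube(d=2, seed=0)
designs = qmc.scale(sampler.random(n=40), l_bounds=[0.1, 0.1], u_bounds=[1.0, 1.0])

# 2. Placeholder objectives standing in for expensive CFD results
#    (e.g. thermal resistance vs. pressure drop).
f1 = np.sum((designs - 0.3) ** 2, axis=1)          # "thermal resistance"
f2 = np.sum((designs - 0.8) ** 2, axis=1)          # "pressure drop"

# 3. Fit one surrogate per objective.
surr1 = GaussianProcessRegressor(normalize_y=True).fit(designs, f1)
surr2 = GaussianProcessRegressor(normalize_y=True).fit(designs, f2)

# 4. Optimize on the cheap surrogates: dense candidate sweep + Pareto filter.
cand = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(n=2000),
                 l_bounds=[0.1, 0.1], u_bounds=[1.0, 1.0])
p1, p2 = surr1.predict(cand), surr2.predict(cand)
pareto = [i for i in range(len(cand))
          if not np.any((p1 <= p1[i]) & (p2 <= p2[i]) & ((p1 < p1[i]) | (p2 < p2[i])))]
print(f"{len(pareto)} non-dominated candidate designs found on the surrogates")
```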
arXiv Detail & Related papers (2022-08-20T13:49:11Z) - On the Convergence of Distributed Stochastic Bilevel Optimization
Algorithms over a Network [55.56019538079826]
Bilevel optimization has been applied to a wide variety of machine learning models.
Most existing algorithms are restricted to the single-machine setting and are therefore incapable of handling distributed data.
We develop novel decentralized bilevel optimization algorithms based on a gradient tracking communication mechanism and two different gradient estimators.
arXiv Detail & Related papers (2022-06-30T05:29:52Z) - Deep learning based closed-loop optimization of geothermal reservoir
production [0.0]
We propose a closed-loop optimization framework, based on deep learning surrogates, for the well control optimization of geothermal reservoirs.
We construct a hybrid convolution-recurrent neural network surrogate, which combines a convolutional neural network (CNN) and a long short-term memory (LSTM) recurrent network.
We show that the proposed framework can achieve efficient and effective real-time optimization and data assimilation in the geothermal reservoir production process.
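A minimal sketch of a hybrid CNN-LSTM surrogate of this kind is shown below; the tensor shapes, layer sizes, and the way well controls are fed to the recurrent part are assumptions for illustration, not the published architecture.

```python
import torch
import torch.nn as nn

class CNNLSTMSurrogate(nn.Module):
    """Illustrative CNN-LSTM surrogate: a 2-D reservoir-property map plus a
    sequence of well controls -> a predicted production time series."""

    def __init__(self, n_controls=4, hidden=64, n_outputs=2):
        super().__init__()
        # CNN encoder for the static reservoir-property map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 4 * 4, hidden),
        )
        # LSTM rolls out the dynamics driven by the time-varying well controls.
        self.lstm = nn.LSTM(n_controls + hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_outputs)  # e.g. production temperature / rate

    def forward(self, perm_map, controls):
        # perm_map: (batch, 1, H, W); controls: (batch, T, n_controls)
        static = self.encoder(perm_map)                               # (batch, hidden)
        static_seq = static.unsqueeze(1).expand(-1, controls.size(1), -1)
        out, _ = self.lstm(torch.cat([controls, static_seq], dim=-1))
        return self.head(out)                                         # (batch, T, n_outputs)

# Smoke test with random data of plausible shape.
model = CNNLSTMSurrogate()
pred = model(torch.randn(8, 1, 32, 32), torch.randn(8, 20, 4))
print(pred.shape)  # torch.Size([8, 20, 2])
```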
arXiv Detail & Related papers (2022-04-15T14:37:28Z) - Optimization and benchmarking of the thermal cycling algorithm [0.5879782260984691]
Most optimization problems have inordinately complex structures that render finding their optimal solutions a daunting task.
In this paper we benchmark and improve the thermal cycling algorithm, which is designed to overcome energy barriers in non-convex optimization problems by temperature cycling.
We demonstrate that it competes closely with other state-of-the-art algorithms such as parallel tempering with isoenergetic cluster moves.
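For orientation, a generic sketch of temperature cycling (alternating a high-temperature perturbation phase with a greedy quench while keeping a pool of the best solutions) is given below for a toy binary problem; it illustrates the general idea only, not the benchmarked implementation.

```python
import numpy as np

def thermal_cycling(energy, n_vars, cycles=50, pool_size=10, t_high=2.0, rng=None):
    """Generic temperature-cycling sketch for a binary optimization problem.

    energy : function mapping a {0,1} vector to a scalar energy (minimized)
    """
    rng = np.random.default_rng(rng)
    pool = [rng.integers(0, 2, n_vars) for _ in range(pool_size)]

    def quench(x):
        # Greedy descent: flip bits while any single flip lowers the energy.
        improved = True
        while improved:
            improved = False
            for i in rng.permutation(n_vars):
                trial = x.copy(); trial[i] ^= 1
                if energy(trial) < energy(x):
                    x, improved = trial, True
        return x

    pool = [quench(x) for x in pool]
    for _ in range(cycles):
        for k, x in enumerate(pool):
            # Heating: accept random flips with a Metropolis rule at t_high,
            # which lets the search hop over energy barriers.
            warm = x.copy()
            for i in rng.integers(0, n_vars, size=max(1, n_vars // 2)):
                trial = warm.copy(); trial[i] ^= 1
                d = energy(trial) - energy(warm)
                if d < 0 or rng.random() < np.exp(-d / t_high):
                    warm = trial
            # Cooling: quench back to a local minimum; keep it if it is better.
            cooled = quench(warm)
            if energy(cooled) < energy(x):
                pool[k] = cooled
    return min(pool, key=energy)
```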
arXiv Detail & Related papers (2020-12-17T18:07:04Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose stocBiO, a novel algorithm featuring a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - Large Batch Training Does Not Need Warmup [111.07680619360528]
Training deep neural networks using a large batch size has shown promising results and benefits many real-world applications.
In this paper, we propose a novel Complete Layer-wise Adaptive Rate Scaling (CLARS) algorithm for large-batch training.
Based on our analysis, we bridge the gap and illustrate the theoretical insights for three popular large-batch training techniques.
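As a rough illustration of layer-wise adaptive rate scaling in this spirit, a toy update step is sketched below; the actual CLARS rule and its analysis differ from this LARS-style simplification.

```python
import torch

def layerwise_adaptive_step(model, base_lr=1.0, trust=0.01):
    """Toy layer-wise adaptive rate scaling step (LARS/CLARS-style, illustrative).

    Each parameter tensor gets its own effective learning rate, proportional to
    ||w|| / ||grad w||, so layers with small gradients are not over-stepped
    when the batch size (and hence the global learning rate) is large.
    Call after loss.backward(); gradients are applied in place.
    """
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            w_norm, g_norm = p.norm(), p.grad.norm()
            local_lr = 1.0
            if w_norm > 0 and g_norm > 0:
                local_lr = float(trust * w_norm / g_norm)
            p.add_(p.grad, alpha=-base_lr * local_lr)
```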
arXiv Detail & Related papers (2020-02-04T23:03:12Z)