TunaOil: A Tuning Algorithm Strategy for Reservoir Simulation Workloads
- URL: http://arxiv.org/abs/2208.02606v1
- Date: Thu, 4 Aug 2022 12:11:13 GMT
- Title: TunaOil: A Tuning Algorithm Strategy for Reservoir Simulation Workloads
- Authors: Felipe Albuquerque Portella, David Buchaca Prats, José Roberto Pereira Rodrigues, Josep Lluís Berral
- Abstract summary: TunaOil is a new methodology to enhance the search for optimal numerical parameters of reservoir flow simulations.
We leverage ensembles of models in different oracles to extract information from each simulation and optimize the numerical parameters in their subsequent runs.
Our experiments show that the predictions can improve the overall workload runtime on average by 31%.
- Score: 0.9940728137241215
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Reservoir simulations for petroleum fields and seismic imaging are known as
the most demanding workloads for high-performance computing (HPC) in the oil
and gas (O&G) industry. The optimization of the simulator numerical parameters
plays a vital role, as it could save considerable computational effort.
State-of-the-art optimization techniques are based on running numerous
simulations, run specifically for that purpose, to find good parameter candidates.
However, using such an approach is highly costly in terms of time and computing
resources. This work presents TunaOil, a new methodology to enhance the search
for optimal numerical parameters of reservoir flow simulations using a
performance model. In the O&G industry, it is common to use ensembles of models
in different workflows to reduce the uncertainty associated with forecasting
O&G production. We leverage the runs of those ensembles in such workflows to
extract information from each simulation and optimize the numerical parameters
in their subsequent runs.
To validate the methodology, we implemented it in a history matching (HM)
process that uses a Kalman filter algorithm to adjust an ensemble of reservoir
models to match the observed data from the real field. We mine past execution
logs from many simulations with different numerical configurations and build a
machine learning model based on extracted features from the data. These
features range from properties of the reservoir models themselves, such as the
number of active cells, to statistics of the simulation's behavior, such as the
number of iterations of the linear solver. A sampling technique is used to
query the oracle to find the numerical parameters that can reduce the elapsed
time without significantly impacting the quality of the results. Our
experiments show that the predictions can improve the overall HM workflow
runtime on average by 31%.
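To make the oracle-and-sampling idea from the abstract more concrete, the sketch below fits a runtime model on features mined from past simulation logs and then samples numerical-parameter candidates, keeping the one with the lowest predicted elapsed time. This is a minimal illustration only: the library (scikit-learn), the feature names, the parameter ranges, and the use of plain random sampling are assumptions for demonstration, not details taken from the paper.

```python
# Illustrative sketch of a runtime oracle plus parameter sampling.
# Library choice, feature names, and ranges are hypothetical, not TunaOil's implementation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical data mined from past execution logs:
# each row = [active_cells, linear_solver_iters, max_timestep, solver_tolerance]
X_logs = rng.uniform(
    low=[1e4, 10, 1.0, 1e-6],
    high=[1e6, 500, 365.0, 1e-3],
    size=(200, 4),
)
elapsed_time = rng.uniform(100.0, 5000.0, size=200)  # placeholder targets (seconds)

# Oracle: a performance model predicting elapsed time from the extracted features.
oracle = RandomForestRegressor(n_estimators=200, random_state=0)
oracle.fit(X_logs, elapsed_time)

def suggest_parameters(model_features, n_samples=1000):
    """Sample numerical-parameter candidates and return the one with the
    lowest predicted elapsed time for the given reservoir-model features."""
    # Candidate (max_timestep, solver_tolerance) pairs, sampled uniformly.
    candidates = rng.uniform(
        low=[1.0, 1e-6], high=[365.0, 1e-3], size=(n_samples, 2)
    )
    # Combine the fixed model features (e.g. active cells, solver iterations)
    # with each sampled candidate before querying the oracle.
    queries = np.hstack([np.tile(model_features, (n_samples, 1)), candidates])
    predicted_runtime = oracle.predict(queries)
    best = np.argmin(predicted_runtime)
    return candidates[best], predicted_runtime[best]

best_params, eta = suggest_parameters(np.array([2.5e5, 120.0]))
print(f"suggested (max_timestep, tolerance): {best_params}, predicted runtime: {eta:.0f}s")
```

In the workflow described by the paper, such a suggestion step would run between ensemble iterations, so each subsequent simulation starts from numerical parameters predicted to be faster; here the training targets are synthetic placeholders rather than real log data.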
Related papers
- Machine learning surrogates for efficient hydrologic modeling: Insights from stochastic simulations of managed aquifer recharge [0.0]
We propose a hybrid modeling workflow for process-based hydrologic models and machine learning surrogate models.
As a case study, we apply this workflow to simulations of variably saturated groundwater flow at a prospective managed aquifer recharge site.
Our results demonstrate that ML surrogate models can achieve under 10% mean absolute percentage error and yield order-of-magnitude runtime savings.
arXiv Detail & Related papers (2024-07-30T15:24:27Z) - Neural Operator-Based Proxy for Reservoir Simulations Considering Varying Well Settings, Locations, and Permeability Fields [0.0]
We present a single Fourier Neural Operator (FNO) surrogate that outperforms traditional reservoir simulators.
The maximum-mean relative error of 95% of pressure and saturation predictions is less than 5%.
The model can accelerate history matching and reservoir characterization procedures by several orders of magnitude.
arXiv Detail & Related papers (2024-07-13T00:26:14Z) - Machine Learning Optimized Approach for Parameter Selection in MESHFREE Simulations [0.0]
Meshfree simulation methods are emerging as compelling alternatives to conventional mesh-based approaches.
We provide a comprehensive overview of our research combining Machine Learning (ML) and Fraunhofer's MESHFREE software.
We introduce a novel ML-optimized approach, using active learning, regression trees, and visualization on MESHFREE simulation data.
arXiv Detail & Related papers (2024-03-20T15:29:59Z) - Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z) - Lamarr: LHCb ultra-fast simulation based on machine learning models deployed within Gauss [0.0]
We discuss Lamarr, a framework to speed-up the simulation production parameterizing both the detector response and the reconstruction algorithms of the LHCb experiment.
Deep Generative Models powered by several algorithms and strategies are employed to effectively parameterize the high-level response of the single components of the LHCb detector.
arXiv Detail & Related papers (2023-03-20T20:18:04Z) - Sparse high-dimensional linear regression with a partitioned empirical
Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are used through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z) - Adaptive LASSO estimation for functional hidden dynamic geostatistical
model [69.10717733870575]
We propose a novel model selection algorithm based on a penalized maximum likelihood estimator (PMLE) for functional hidden dynamic geostatistical models (f-HDGM).
The algorithm is based on iterative optimisation and uses an adaptive least absolute shrinkage and selection operator (LASSO) penalty function, wherein the weights are obtained from the unpenalised f-HDGM maximum-likelihood estimators.
arXiv Detail & Related papers (2022-08-10T19:17:45Z) - Learning Large-scale Subsurface Simulations with a Hybrid Graph Network
Simulator [57.57321628587564]
We introduce Hybrid Graph Network Simulator (HGNS) for learning reservoir simulations of 3D subsurface fluid flows.
HGNS consists of a subsurface graph neural network (SGNN) to model the evolution of fluid flows, and a 3D-U-Net to model the evolution of pressure.
Using an industry-standard subsurface flow dataset (SPE-10) with 1.1 million cells, we demonstrate that HGNS is able to reduce the inference time up to 18 times compared to standard subsurface simulators.
arXiv Detail & Related papers (2022-06-15T17:29:57Z) - Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep Bayesian active learning framework for stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP based models.
The results demonstrate that STNP outperforms the baselines in the learning setting and that LIG achieves state-of-the-art performance for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z) - Scalable nonparametric Bayesian learning for heterogeneous and dynamic
velocity fields [8.744017403796406]
We develop a model for learning heterogeneous and dynamic patterns of velocity field data.
We show the effectiveness of our techniques on the NGSIM dataset of complex multi-vehicle interactions.
arXiv Detail & Related papers (2021-02-15T17:45:46Z) - A User's Guide to Calibrating Robotics Simulators [54.85241102329546]
This paper proposes a set of benchmarks and a framework for the study of various algorithms aimed to transfer models and policies learnt in simulation to the real world.
We conduct experiments on a wide range of well known simulated environments to characterize and offer insights into the performance of different algorithms.
Our analysis can be useful for practitioners working in this area and can help make informed choices about the behavior and main properties of sim-to-real algorithms.
arXiv Detail & Related papers (2020-11-17T22:24:26Z)