Graph Network Surrogate Model for Subsurface Flow Optimization
- URL: http://arxiv.org/abs/2312.08625v2
- Date: Wed, 15 May 2024 03:58:12 GMT
- Title: Graph Network Surrogate Model for Subsurface Flow Optimization
- Authors: Haoyu Tang, Louis J. Durlofsky
- Abstract summary: The optimization of well locations and controls is an important step in the design of subsurface flow operations.
We propose a graph network surrogate model (GNSM) for optimizing well placement and controls.
GNSM transforms the flow model into a computational graph that involves an encoding-processing-decoding architecture.
- Score: 2.5782420501870296
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The optimization of well locations and controls is an important step in the design of subsurface flow operations such as oil production or geological CO2 storage. These optimization problems can be computationally expensive, however, as many potential candidate solutions must be evaluated. In this study, we propose a graph network surrogate model (GNSM) for optimizing well placement and controls. The GNSM transforms the flow model into a computational graph that involves an encoding-processing-decoding architecture. Separate networks are constructed to provide global predictions for the pressure and saturation state variables. Model performance is enhanced through the inclusion of the single-phase steady-state pressure solution as a feature. A multistage multistep strategy is used for training. The trained GNSM is applied to predict flow responses in a 2D unstructured model of a channelized reservoir. Results are presented for a large set of test cases, in which five injection wells and five production wells are placed randomly throughout the model, with a random control variable (bottom-hole pressure) assigned to each well. Median relative error in pressure and saturation for 300 such test cases is 1-2%. The ability of the trained GNSM to provide accurate predictions for a new (geologically similar) permeability realization is demonstrated. Finally, the trained GNSM is used to optimize well locations and controls with a differential evolution algorithm. GNSM-based optimization results are comparable to those from simulation-based optimization, with a runtime speedup of a factor of 36. Much larger speedups are expected if the method is used for robust optimization, in which each candidate solution is evaluated on multiple geological models.
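To make the encoding-processing-decoding idea concrete, the following PyTorch sketch shows one way such a graph network surrogate could be structured, with the cells of the unstructured grid as graph nodes and cell-to-cell connections as edges. All names, layer sizes, and feature choices here are illustrative assumptions, not the authors' implementation.

```python
# Minimal encode-process-decode graph-network sketch (illustrative only).
import torch
import torch.nn as nn

def mlp(in_dim, out_dim, hidden=64):
    # Small two-layer MLP used for encoders, processor blocks, and the decoder.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))

class GNSMSketch(nn.Module):
    def __init__(self, node_feats, edge_feats, latent=64, steps=10):
        super().__init__()
        self.node_enc = mlp(node_feats, latent)   # encoder: per-cell features -> latent
        self.edge_enc = mlp(edge_feats, latent)   # encoder: per-connection features -> latent
        # Processor: repeated message-passing blocks over the grid graph.
        self.edge_blocks = nn.ModuleList([mlp(3 * latent, latent) for _ in range(steps)])
        self.node_blocks = nn.ModuleList([mlp(2 * latent, latent) for _ in range(steps)])
        self.decoder = mlp(latent, 1)             # decoder: latent -> state (pressure or saturation)

    def forward(self, x, edge_index, edge_attr):
        # x: [n_cells, node_feats], e.g., permeability, well indicator, BHP control,
        #    current state, and the single-phase steady-state pressure feature.
        # edge_index: [2, n_edges] cell connectivity (long tensor); edge_attr: [n_edges, edge_feats].
        src, dst = edge_index
        h = self.node_enc(x)
        e = self.edge_enc(edge_attr)
        for edge_mlp, node_mlp in zip(self.edge_blocks, self.node_blocks):
            e = e + edge_mlp(torch.cat([e, h[src], h[dst]], dim=-1))  # update edge messages
            agg = torch.zeros_like(h).index_add_(0, dst, e)           # aggregate incoming messages
            h = h + node_mlp(torch.cat([h, agg], dim=-1))             # update cell latents
        return self.decoder(h)                                        # per-cell prediction
```

In the setting described above, one such network would be trained to predict pressure and a separate one to predict saturation, and the trained surrogates would then stand in for the flow simulator when candidate well locations and controls are evaluated inside the differential evolution loop.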
Related papers
- Diffusion Models as Network Optimizers: Explorations and Analysis [71.69869025878856]
Generative diffusion models (GDMs) have emerged as a promising new approach to network optimization.
In this study, we first explore the intrinsic characteristics of generative models.
We provide a concise theoretical and intuitive demonstration of the advantages of generative models over discriminative network optimization.
arXiv Detail & Related papers (2024-11-01T09:05:47Z)
- OPUS: Occupancy Prediction Using a Sparse Set [64.60854562502523]
We present a framework to simultaneously predict occupied locations and classes using a set of learnable queries.
OPUS incorporates a suite of non-trivial strategies to enhance model performance.
Our lightest model achieves superior RayIoU on the Occ3D-nuScenes dataset at near 2x FPS, while our heaviest model surpasses previous best results by 6.1 RayIoU.
arXiv Detail & Related papers (2024-09-14T07:44:22Z)
- Benchmarking Optimizers for Qumode State Preparation with Variational Quantum Algorithms [10.941053143198092]
There has been a growing interest in qumodes due to advancements in the field and their potential applications.
This paper aims to bridge this gap by providing performance benchmarks of various parameters used in state preparation with Variational Quantum Algorithms.
arXiv Detail & Related papers (2024-05-07T17:15:58Z)
- Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes conditional stochastic optimization algorithms for the federated learning setting.
arXiv Detail & Related papers (2023-10-04T01:47:37Z)
- Sample-Efficient and Surrogate-Based Design Optimization of Underwater Vehicle Hulls [0.4543820534430522]
We show that the BO-LCB algorithm is the most sample-efficient optimization framework and has the best convergence behavior of those considered.
We also show that our DNN-based surrogate model predicts drag force on test data in tight agreement with CFD simulations, with a mean absolute percentage error (MAPE) of 1.85%.
We demonstrate a two-orders-of-magnitude speedup for the design optimization process when the surrogate model is used.
arXiv Detail & Related papers (2023-04-24T19:52:42Z)
- Data-driven evolutionary algorithm for oil reservoir well-placement and control optimization [3.012067935276772]
A generalized data-driven evolutionary algorithm (GDDE) is proposed to reduce the number of simulation runs in well-placement and control optimization problems.
A probabilistic neural network (PNN) is adopted as the classifier to select informative and promising candidates.
arXiv Detail & Related papers (2022-06-07T09:07:49Z)
- Convolutional-Recurrent Neural Network Proxy for Robust Optimization and Closed-Loop Reservoir Management [0.0]
A convolutional-recurrent neural network (CNN-RNN) proxy model is developed to predict well-by-well oil and water rates.
This capability enables the estimation of the objective function and nonlinear constraint values required for robust optimization.
arXiv Detail & Related papers (2022-03-14T22:11:17Z)
- A Graph Neural Network Framework for Grid-Based Simulation [0.9137554315375922]
We propose a graph neural network (GNN) framework to build a surrogate feed-forward model which replaces simulation runs to accelerate the optimization process.
Our GNN framework shows great potential for well-related subsurface optimization applications, including oil and gas as well as carbon capture and sequestration (CCS).
arXiv Detail & Related papers (2022-02-05T22:48:16Z)
- A Primer on Zeroth-Order Optimization in Signal Processing and Machine Learning [95.85269649177336]
ZO optimization iteratively performs three major steps: gradient estimation, descent direction computation, and solution update (a minimal sketch of this loop is given after the related-papers list).
We demonstrate promising applications of ZO optimization, such as evaluating and generating explanations from black-box deep learning models, and efficient online sensor management.
arXiv Detail & Related papers (2020-06-11T06:50:35Z)
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to a significantly smaller and computationally cheaper sub-solver for lower bounding.
In total, the proposed method reduces convergence time by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
- Self-Directed Online Machine Learning for Topology Optimization [58.920693413667216]
Self-directed Online Learning Optimization integrates a Deep Neural Network (DNN) with Finite Element Method (FEM) calculations.
Our algorithm was tested by four types of problems including compliance minimization, fluid-structure optimization, heat transfer enhancement and truss optimization.
It reduced the computational time by 2 to 5 orders of magnitude compared with directly using heuristic methods, and outperformed all state-of-the-art algorithms tested in our experiments.
arXiv Detail & Related papers (2020-02-04T20:00:28Z)
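The sketch referenced in the zeroth-order optimization entry above: a minimal two-point random-gradient loop illustrating the three steps of gradient estimation, descent direction computation, and solution update. The function and parameter names are illustrative assumptions and are not taken from the cited primer.

```python
import numpy as np

def zo_minimize(f, x0, steps=200, mu=1e-2, lr=1e-1, seed=0):
    # Illustrative zeroth-order optimization loop (not the primer's implementation).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        u = rng.standard_normal(x.shape)                      # random probe direction
        g = (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u  # step 1: gradient estimation
        d = -g                                                # step 2: descent direction
        x = x + lr * d                                        # step 3: solution update
    return x

# Example: minimize a simple quadratic using only function evaluations.
x_star = zo_minimize(lambda x: np.sum((x - 3.0) ** 2), x0=np.zeros(5))
```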