Real-Time Well Log Prediction From Drilling Data Using Deep Learning
- URL: http://arxiv.org/abs/2001.10156v1
- Date: Tue, 28 Jan 2020 03:57:31 GMT
- Title: Real-Time Well Log Prediction From Drilling Data Using Deep Learning
- Authors: Rayan Kanfar, Obai Shaikh, Mehrdad Yousefzadeh, Tapan Mukerji
- Abstract summary: We present a workflow for data augmentation and feature engineering using Distance-based Global Sensitivity Analysis.
We propose an Inception-based Convolutional Neural Network combined with a Temporal Convolutional Network as the deep learning model.
12 wells from the Equinor dataset for the Volve field in the North Sea are used for learning.
- Score: 2.064612766965483
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The objective is to study the feasibility of predicting subsurface rock
properties in wells from real-time drilling data. Geophysical logs, namely,
density, porosity and sonic logs are of paramount importance for subsurface
resource estimation and exploitation. These wireline petro-physical
measurements are selectively deployed as they are expensive to acquire;
meanwhile, drilling information is recorded in every drilled well. Hence a
predictive tool for wireline log prediction from drilling data can help
management make decisions about data acquisition, especially for delineation
and production wells. This problem is non-linear with strong interactions
between drilling parameters; hence the potential for deep learning to address
this problem is explored. We present a workflow for data augmentation and
feature engineering using Distance-based Global Sensitivity Analysis. We
propose an Inception-based Convolutional Neural Network combined with a
Temporal Convolutional Network as the deep learning model. The model is
designed to learn both low and high frequency content of the data. 12 wells
from the Equinor dataset for the Volve field in the North Sea are used for
learning. The model predictions not only capture trends but are also physically
consistent across density, porosity, and sonic logs. On the test data, the mean
square error reaches a low value of 0.04 but the correlation coefficient
plateaus around 0.6. The model is, however, able to differentiate between
different types of rocks such as cemented sandstone, unconsolidated sands, and
shale.
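The abstract describes a model that combines multi-scale (Inception-style) convolutions with a Temporal Convolutional Network so that both low- and high-frequency content of the drilling signals is captured. The following is a minimal PyTorch sketch of that idea; it is not the authors' exact network, and the layer widths, kernel sizes, and the number of input drilling parameters are illustrative assumptions.

```python
# Minimal sketch (not the authors' exact architecture): an Inception-style
# 1D convolution block feeding a small temporal convolutional network (TCN)
# that regresses density, porosity, and sonic logs from drilling parameters.
import torch
import torch.nn as nn


class Inception1d(nn.Module):
    """Parallel convolutions with different kernel sizes capture
    both low- and high-frequency content of the drilling signals."""
    def __init__(self, in_ch, branch_ch=16):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv1d(in_ch, branch_ch, kernel_size=k, padding=k // 2)
            for k in (1, 3, 7, 15)
        ])
        self.act = nn.ReLU()

    def forward(self, x):                      # x: (batch, in_ch, depth_steps)
        return self.act(torch.cat([b(x) for b in self.branches], dim=1))


class TCNBlock(nn.Module):
    """Dilated causal convolution with a residual connection."""
    def __init__(self, ch, dilation):
        super().__init__()
        self.pad = (3 - 1) * dilation          # left-pad so the output stays causal
        self.conv = nn.Conv1d(ch, ch, kernel_size=3, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):
        y = self.conv(nn.functional.pad(x, (self.pad, 0)))
        return self.act(y + x)                 # residual connection


class DrillingToLogs(nn.Module):
    def __init__(self, n_drilling_params=8, n_logs=3):
        super().__init__()
        self.inception = Inception1d(n_drilling_params)           # -> 64 channels
        self.tcn = nn.Sequential(*[TCNBlock(64, d) for d in (1, 2, 4, 8)])
        self.head = nn.Conv1d(64, n_logs, kernel_size=1)          # density, porosity, sonic

    def forward(self, x):
        return self.head(self.tcn(self.inception(x)))


if __name__ == "__main__":
    model = DrillingToLogs()
    drilling = torch.randn(4, 8, 256)            # 4 windows, 8 parameters, 256 depth steps
    logs = model(drilling)                        # (4, 3, 256): one value per depth step
    print(logs.shape)
```

The parallel kernel sizes in the Inception block act as band-limited filters of different widths, while the increasing dilations in the TCN widen the receptive field along depth; the 1x1 head produces one prediction per depth step for each of the three logs.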
Related papers
- Learning with Noisy Foundation Models [95.50968225050012]
This paper is the first work to comprehensively understand and analyze the nature of noise in pre-training datasets.
We propose a tuning method (NMTune) that applies an affine transformation to the feature space to mitigate the malignant effect of noise and improve generalization.
arXiv Detail & Related papers (2024-03-11T16:22:41Z)
- Data-driven prediction of tool wear using Bayesian-regularized artificial neural networks [8.21266434543609]
The prediction of tool wear helps minimize costs and enhance product quality in manufacturing.
We propose a new data-driven model that uses Bayesian Regularized Artificial Neural Networks (BRANNs) to precisely predict milling tool wear.
arXiv Detail & Related papers (2023-11-30T15:22:20Z)
- Uncertainty and Explainable Analysis of Machine Learning Model for Reconstruction of Sonic Slowness Logs [5.815454346817298]
We use data from the 2020 machine learning competition of the SPWLA to predict the missing compressional wave slowness and shear wave slowness logs.
We employ the NGBoost algorithm to construct an Ensemble Learning model that can predict the results as well as their uncertainty.
Our findings reveal that the NGBoost model tends to predict higher slowness values when the neutron porosity and gamma ray are large.
arXiv Detail & Related papers (2023-08-24T08:03:15Z)
- Exploring the Effectiveness of Dataset Synthesis: An application of Apple Detection in Orchards [68.95806641664713]
We explore the usability of Stable Diffusion 2.1-base for generating synthetic datasets of apple trees for object detection.
We train a YOLOv5m object detection model to predict apples in a real-world apple detection dataset.
Results demonstrate that the model trained on generated data is slightly underperforming compared to a baseline model trained on real-world images.
arXiv Detail & Related papers (2023-06-20T09:46:01Z)
- Investigation of the Robustness of Neural Density Fields [7.67602635520562]
This work investigates neural density fields and their relative errors in the context of robustness to external factors like noise or constraints during training.
It is found that both models trained on a polyhedral and mascon ground truth perform similarly, indicating that the ground truth is not the accuracy bottleneck.
arXiv Detail & Related papers (2023-05-31T09:43:49Z)
- Deep Active Learning with Noise Stability [24.54974925491753]
Uncertainty estimation for unlabeled data is crucial to active learning.
We propose a novel algorithm that leverages noise stability to estimate data uncertainty.
Our method is generally applicable in various tasks, including computer vision, natural language processing, and structural data analysis.
arXiv Detail & Related papers (2022-05-26T13:21:01Z)
- Distributionally Robust Semi-Supervised Learning Over Graphs [68.29280230284712]
Semi-supervised learning (SSL) over graph-structured data emerges in many network science applications.
To efficiently manage learning over graphs, variants of graph neural networks (GNNs) have been developed recently.
Despite their success in practice, most of existing methods are unable to handle graphs with uncertain nodal attributes.
Challenges also arise due to distributional uncertainties associated with data acquired by noisy measurements.
A distributionally robust learning framework is developed, where the objective is to train models that exhibit quantifiable robustness against perturbations.
arXiv Detail & Related papers (2021-10-20T14:23:54Z)
- DeepSatData: Building large scale datasets of satellite images for training machine learning models [77.17638664503215]
This report presents design considerations for automatically generating satellite imagery datasets for training machine learning models.
We discuss issues faced from the point of view of deep neural network training and evaluation.
arXiv Detail & Related papers (2021-04-28T15:13:12Z)
- HYDRA: Hypergradient Data Relevance Analysis for Interpreting Deep Neural Networks [51.143054943431665]
We propose Hypergradient Data Relevance Analysis, or HYDRA, which interprets predictions made by deep neural networks (DNNs) as effects of their training data.
HYDRA assesses the contribution of training data toward test data points throughout the training trajectory.
In addition, we quantitatively demonstrate that HYDRA outperforms influence functions in accurately estimating data contribution and detecting noisy data labels.
arXiv Detail & Related papers (2021-02-04T10:00:13Z)
- Machine Learning for Gas and Oil Exploration [0.0]
Well logs contain various characteristics of the rock around the borehole, which allow petrophysicists to determine the expected amount of hydrocarbon.
These logs are often incomplete and, as a consequence, the subsequent analyses cannot exploit the full potential of the well logs.
In this paper we demonstrate that Machine Learning can be applied to fill in the gaps and estimate missing values.
We then explore the models' predictions both quantitatively, tracking the prediction error, and qualitatively, capturing the evolution of the measured and predicted values for a given property with depth.
arXiv Detail & Related papers (2020-10-04T11:03:17Z)
- Graph Embedding with Data Uncertainty [113.39838145450007]
Spectral-based subspace learning is a common data preprocessing step in many machine learning pipelines.
Most subspace learning methods do not take into consideration possible measurement inaccuracies or artifacts that can lead to data with high uncertainty.
arXiv Detail & Related papers (2020-09-01T15:08:23Z)