Neural Operators for Mathematical Modeling of Transient Fluid Flow in Subsurface Reservoir Systems
- URL: http://arxiv.org/abs/2509.21485v1
- Date: Thu, 25 Sep 2025 19:45:07 GMT
- Title: Neural Operators for Mathematical Modeling of Transient Fluid Flow in Subsurface Reservoir Systems
- Authors: Daniil D. Sirota, Sergey A. Khan, Sergey L. Kostikov, Kirill A. Butov
- Abstract summary: This paper presents a method for modeling transient fluid flow in subsurface reservoir systems based on the developed neural operator architecture (TFNO-opt). The proposed architecture is based on Fourier neural operators, which allow approximating PDE solutions in infinite-dimensional function spaces. The effectiveness of the proposed improvements is confirmed by computational experiments.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a method for modeling transient fluid flow in subsurface reservoir systems based on the developed neural operator architecture (TFNO-opt). Reservoir systems are complex dynamic objects with distributed parameters described by systems of partial differential equations (PDEs). Traditional numerical methods for modeling such systems, despite their high accuracy, carry significant computational cost, which limits their applicability in control and decision-support problems. The proposed architecture (TFNO-opt) is based on Fourier neural operators, which approximate PDE solutions in infinite-dimensional function spaces, providing invariance to discretization and the ability to generalize across different realizations of the equations. The developed modifications aim to increase the accuracy and stability of the trained neural operator, which is especially important for control problems. These include an adjustable internal time resolution of the integral Fourier operator, tensor decomposition of parameters in the spectral domain, use of the Sobolev norm in the error function, and separation of approximation error from reconstruction of initial conditions for more accurate reproduction of physical processes. The effectiveness of the proposed improvements is confirmed by computational experiments; the practical significance is demonstrated on the problem of hydrodynamic modeling of an underground gas storage (UGS), where a speedup of six orders of magnitude over traditional methods was achieved. This opens up new opportunities for the effective control of complex reservoir systems.
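The TFNO-opt implementation itself is not reproduced here, but two of the ingredients named in the abstract, a learnable filter applied in the spectral (Fourier) domain and a Sobolev-norm error function, can be illustrated in a minimal numpy sketch. The function names, the 1-D setting, and the H^1-style frequency weighting are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def spectral_conv_1d(u, core, modes):
    """Toy 1-D Fourier layer: multiply the lowest `modes` frequencies of u
    by learnable complex weights `core`, zero out the rest, transform back.
    (In an FNO, `core` would be a trained, possibly tensor-factorized, weight.)"""
    u_hat = np.fft.rfft(u)                           # to the spectral domain
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = core[:modes] * u_hat[:modes]   # truncated spectral multiply
    return np.fft.irfft(out_hat, n=u.size)           # back to physical space

def sobolev_h1_loss(pred, target, length=1.0):
    """H^1-type loss evaluated in Fourier space: the (1 + |k|^2) weight
    penalizes errors in the function and its first derivative, so high
    frequencies (sharp features) are weighted more than in a plain L2 loss."""
    n = pred.size
    k = 2.0 * np.pi * np.fft.rfftfreq(n, d=length / n)  # angular wavenumbers
    e_hat = np.fft.rfft(pred - target) / n               # normalized error spectrum
    return float(np.sum((1.0 + k**2) * np.abs(e_hat) ** 2))
```

With `core` set to ones and all modes kept, the spectral layer reduces to the identity, which is a convenient sanity check before training.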
Related papers
- Physics-Informed Chebyshev Polynomial Neural Operator for Parametric Partial Differential Equations [17.758049557300826]
We introduce the Physics-Informed Chebyshev Polynomial Neural Operator (CPNO). CPNO replaces unstable monomial expansions with a numerically stable Chebyshev spectral basis. Experiments on benchmark parameterized PDEs show that CPNO achieves superior accuracy, faster convergence, and enhanced robustness to hyperparameters.
arXiv Detail & Related papers (2026-02-02T07:19:56Z) - Physics-informed low-rank neural operators with application to parametric elliptic PDEs [0.0]
We present PILNO, a neural operator framework for approximating solution operators of partial differential equations (PDEs) on point cloud data. PILNO combines low-rank kernel approximations with an encoder-decoder architecture, enabling fast, continuous one-shot predictions while remaining independent of specific discretizations. We demonstrate its effectiveness on diverse problems, including function fitting, the Poisson equation, the screened Poisson equation with variable coefficients, and parameterized Darcy flow.
arXiv Detail & Related papers (2025-09-09T12:54:06Z) - Gaussian process surrogate with physical law-corrected prior for multi-coupled PDEs defined on irregular geometry [3.3798563347021093]
Parametric partial differential equations (PDEs) are fundamental mathematical tools for modeling complex physical systems. We propose a novel physical law-corrected prior Gaussian process (LC-prior GP) surrogate modeling framework.
arXiv Detail & Related papers (2025-09-01T02:40:32Z) - Efficient Transformed Gaussian Process State-Space Models for Non-Stationary High-Dimensional Dynamical Systems [49.819436680336786]
We propose an efficient transformed Gaussian process state-space model (ETGPSSM) for scalable and flexible modeling of high-dimensional, non-stationary dynamical systems. Specifically, our ETGPSSM integrates a single shared GP with input-dependent normalizing flows, yielding an expressive implicit process prior that captures complex, non-stationary transition dynamics. Our ETGPSSM outperforms existing GPSSMs and neural network-based SSMs in terms of computational efficiency and accuracy.
arXiv Detail & Related papers (2025-03-24T03:19:45Z) - EFiGP: Eigen-Fourier Physics-Informed Gaussian Process for Inference of Dynamic Systems [0.9361474110798144]
Parameter estimation and trajectory reconstruction for data-driven dynamical systems governed by ordinary differential equations (ODEs) are essential tasks in fields such as biology, engineering, and physics. We propose the Eigen-Fourier Physics-Informed Gaussian Process (EFiGP), an algorithm that integrates Fourier transformation and eigen-decomposition into a physics-informed Gaussian process framework.
arXiv Detail & Related papers (2025-01-23T21:35:02Z) - Diffusion models as probabilistic neural operators for recovering unobserved states of dynamical systems [49.2319247825857]
We show that diffusion-based generative models exhibit many properties favourable for neural operators. We propose to train a single model adaptable to multiple tasks, by alternating between the tasks during training.
arXiv Detail & Related papers (2024-05-11T21:23:55Z) - Differentiable DG with Neural Operator Source Term Correction [0.0]
We introduce an end-to-end differentiable framework for solving the compressible Navier-Stokes equations. This integrated approach combines a differentiable discontinuous Galerkin solver with a neural network source term. We demonstrate the performance of the proposed framework through two examples.
arXiv Detail & Related papers (2023-10-29T04:26:23Z) - Score-based Diffusion Models in Function Space [137.70916238028306]
Diffusion models have recently emerged as a powerful framework for generative modeling. This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space. We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers. We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles. Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Reduced order modeling of parametrized systems through autoencoders and SINDy approach: continuation of periodic solutions [0.0]
This work presents a data-driven, non-intrusive framework which combines ROM construction with reduced dynamics identification.
The proposed approach leverages autoencoder neural networks with parametric sparse identification of nonlinear dynamics (SINDy) to construct a low-dimensional dynamical model.
These aim to track the evolution of periodic steady-state responses as functions of system parameters, avoiding computation of the transient phase and enabling the detection of instabilities and bifurcations.
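The SINDy step of the approach above, regressing observed time derivatives onto a library of candidate terms and keeping only the sparse significant ones, can be sketched in a few lines of numpy. The logistic ODE, the cubic candidate library, and the single thresholding pass are illustrative choices, not details from the paper.

```python
import numpy as np

def sindy_fit(x, dx, threshold=0.05):
    """Sparse identification of dynamics: regress dx onto a library of
    candidate terms [1, x, x^2, x^3], then zero out small coefficients."""
    theta = np.column_stack([np.ones_like(x), x, x**2, x**3])  # candidate library
    coef, *_ = np.linalg.lstsq(theta, dx, rcond=None)
    coef[np.abs(coef) < threshold] = 0.0   # one thresholding pass for sparsity
    return coef

# Trajectory of the logistic ODE x' = x - x^2, using its closed-form solution.
# Derivatives are taken analytically here for clarity; in practice they are
# estimated from data (e.g. by finite differences).
t = np.linspace(0.0, 5.0, 200)
x = 0.1 * np.exp(t) / (1.0 - 0.1 + 0.1 * np.exp(t))
dx = x - x**2
coef = sindy_fit(x, dx)   # recovers the terms x and -x^2
```

Production SINDy implementations iterate the threshold-and-refit step and estimate derivatives robustly, but the library-plus-sparse-regression structure is the same.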
arXiv Detail & Related papers (2022-11-13T01:57:18Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.