Deep operator network for surrogate modeling of poroelasticity with random permeability fields
- URL: http://arxiv.org/abs/2509.11966v1
- Date: Mon, 15 Sep 2025 14:18:49 GMT
- Title: Deep operator network for surrogate modeling of poroelasticity with random permeability fields
- Authors: Sangjoon Park, Yeonjong Shin, Jinhyun Choo
- Abstract summary: Poroelasticity -- coupled fluid flow and elastic deformation in porous media -- often involves spatially variable permeability. In this study, we propose a surrogate modeling framework based on the deep operator network (DeepONet), a neural architecture designed to learn mappings between infinite-dimensional function spaces. To enhance predictive accuracy and stability, we integrate three strategies: nondimensionalization of the governing equations, input dimensionality reduction via Karhunen-Loève expansion, and a two-step training procedure that decouples the optimization of branch and trunk networks.
- Score: 3.7214007898390196
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Poroelasticity -- coupled fluid flow and elastic deformation in porous media -- often involves spatially variable permeability, especially in subsurface systems. In such cases, simulations with random permeability fields are widely used for probabilistic analysis, uncertainty quantification, and inverse problems. These simulations require repeated forward solves that are often prohibitively expensive, motivating the development of efficient surrogate models. However, efficient surrogate modeling techniques for poroelasticity with random permeability fields remain scarce. In this study, we propose a surrogate modeling framework based on the deep operator network (DeepONet), a neural architecture designed to learn mappings between infinite-dimensional function spaces. The proposed surrogate model approximates the solution operator that maps random permeability fields to transient poroelastic responses. To enhance predictive accuracy and stability, we integrate three strategies: nondimensionalization of the governing equations, input dimensionality reduction via Karhunen-Loève expansion, and a two-step training procedure that decouples the optimization of branch and trunk networks. The methodology is evaluated on two benchmark problems in poroelasticity: soil consolidation and ground subsidence induced by groundwater extraction. In both cases, the DeepONet achieves substantial speedup in inference while maintaining high predictive accuracy across a wide range of permeability statistics. These results highlight the potential of the proposed approach as a scalable and efficient surrogate modeling technique for poroelastic systems with random permeability fields.
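The branch/trunk structure and Karhunen-Loève (KL) input reduction described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the layer sizes, the number of KL modes, and all helper names (`kl_reduce`, `deeponet`, etc.) are illustrative assumptions, and the KL step is approximated here by PCA on sampled fields.

```python
# Minimal DeepONet sketch: maps a random log-permeability field, reduced to a
# few Karhunen-Loève coefficients, to a field u(x, t) queried at coordinates.
# All shapes, sizes, and names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def kl_reduce(log_k_samples, n_modes):
    """Project centered log-permeability samples onto their leading KL modes
    (computed here by SVD/PCA of the sample ensemble)."""
    mean = log_k_samples.mean(axis=0)
    centered = log_k_samples - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    modes = vt[:n_modes]                 # (n_modes, n_grid)
    coeffs = centered @ modes.T          # (n_samples, n_modes)
    return coeffs, modes, mean

def init_mlp(sizes):
    return [(rng.normal(scale=0.3, size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(x, weights):
    """Plain tanh MLP; `weights` is a list of (W, b) pairs."""
    for w, b in weights[:-1]:
        x = np.tanh(x @ w + b)
    w, b = weights[-1]
    return x @ w + b

n_modes, width, p = 8, 32, 16            # p = latent size shared by both nets
branch = init_mlp([n_modes, width, p])   # input: KL coefficients of the field
trunk = init_mlp([2, width, p])          # input: query coordinates (x, t)

def deeponet(kl_coeffs, xt):
    """u(x, t) ~ <branch(field), trunk(x, t)> -- the core DeepONet form."""
    b = mlp(kl_coeffs, branch)           # (p,)
    t = mlp(xt, trunk)                   # (n_points, p)
    return t @ b                         # (n_points,)

# Toy usage: 100 random log-permeability fields on a 64-point grid,
# queried at 50 spatial points at a fixed time t = 0.5 (untrained forward pass).
fields = rng.normal(size=(100, 64))
coeffs, modes, mean = kl_reduce(fields, n_modes)
xt = np.stack([np.linspace(0, 1, 50), np.full(50, 0.5)], axis=1)
u = deeponet(coeffs[0], xt)              # shape (50,)
```

The two-step training the abstract mentions would fit this split naturally: the trunk basis can be fitted first, then the branch network optimized against it, but the details above are only a structural sketch.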
Related papers
- Spatially-informed transformers: Injecting geostatistical covariance biases into self-attention for spatio-temporal forecasting [0.0]
We propose a hybrid architecture that injects a geostatistical inductive bias directly into the self-attention mechanism via a learnable covariance kernel. We demonstrate the phenomenon of "Deep Variography", where the network successfully recovers the true spatial parameters of the underlying process end-to-end via backpropagation.
arXiv Detail & Related papers (2025-12-19T15:32:24Z) - Wasserstein Regression as a Variational Approximation of Probabilistic Trajectories through the Bernstein Basis [41.99844472131922]
Existing approaches often ignore the geometry of the probability space or are computationally expensive. A new method is proposed that combines the parameterization of probability trajectories using a Bernstein basis with the minimization of the Wasserstein distance between distributions. The developed approach combines geometric accuracy, computational practicality, and interpretability.
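The two ingredients named in this summary can be sketched concretely: a Bernstein-basis parameterization of a trajectory of distributions, and a Wasserstein distance as the fitting criterion. The restriction to 1-D Gaussians (where the 2-Wasserstein distance has a closed form) and all names below are illustrative assumptions, not the paper's method.

```python
# Sketch: Bernstein-basis trajectory of 1-D Gaussians + closed-form
# 2-Wasserstein distance between Gaussians as a candidate fitting loss.
import numpy as np
from math import comb

def bernstein(t, n):
    """Bernstein basis B_{i,n}(t), i = 0..n, vectorized over t."""
    t = np.asarray(t, dtype=float)[:, None]
    i = np.arange(n + 1)
    c = np.array([comb(n, k) for k in i], dtype=float)
    return c * t**i * (1 - t)**(n - i)          # shape (len(t), n+1)

def w2_gaussian(mu1, s1, mu2, s2):
    """2-Wasserstein distance between N(mu1, s1^2) and N(mu2, s2^2)."""
    return np.sqrt((mu1 - mu2)**2 + (s1 - s2)**2)

# Control points (mean, std) define a smooth probabilistic trajectory:
ctrl_mu = np.array([0.0, 1.0, 0.5, 2.0])
ctrl_s = np.array([1.0, 0.5, 0.8, 1.2])
t = np.linspace(0, 1, 11)
B = bernstein(t, len(ctrl_mu) - 1)              # (11, 4)
mu_t, s_t = B @ ctrl_mu, B @ ctrl_s             # interpolated mean/std over t

# A regression fit would then minimize, over the control points,
# sum_j w2_gaussian(mu_t[j], s_t[j], mu_obs[j], s_obs[j])**2.
```

Because Bernstein bases interpolate their endpoint control points and sum to one, the trajectory starts and ends exactly at the first and last (mean, std) pairs.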
arXiv Detail & Related papers (2025-10-30T15:36:39Z) - Reframing Generative Models for Physical Systems using Stochastic Interpolants [45.16806809746592]
Generative models have emerged as powerful surrogates for physical systems, demonstrating increased accuracy, stability, and/or statistical fidelity. Most approaches rely on iteratively denoising a Gaussian, a choice that may not be the most effective for autoregressive prediction tasks in PDEs and dynamical systems such as climate. In this work, we benchmark generative models across diverse physical domains and tasks, and highlight the role of interpolants.
arXiv Detail & Related papers (2025-09-30T14:02:00Z) - Stochastic and Non-local Closure Modeling for Nonlinear Dynamical Systems via Latent Score-based Generative Models [0.0]
We propose a latent score-based generative AI framework for learning stochastic, non-local closure models and laws in nonlinear dynamical systems. This work addresses a key challenge of modeling complex multiscale dynamical systems without a clear scale separation.
arXiv Detail & Related papers (2025-06-25T19:04:02Z) - Preconditioned Inexact Stochastic ADMM for Deep Model [35.37705488695026]
This paper develops an algorithm, PISA, which enables scalable parallel computing and supports various preconditioners. It converges under the sole assumption of Lipschitz continuity of the gradient on a bounded region, removing the need for other conditions commonly imposed by stochastic methods. It demonstrates superior numerical performance compared to various state-of-the-art optimizers.
arXiv Detail & Related papers (2025-02-15T12:28:51Z) - Using Parametric PINNs for Predicting Internal and External Turbulent Flows [6.387263468033964]
We build upon the previously proposed RANS-PINN framework, which only focused on predicting flow over a cylinder.
We investigate its accuracy in predicting relevant turbulent flow variables for both internal and external flows.
arXiv Detail & Related papers (2024-10-24T17:08:20Z) - Distributionally Robust Model-based Reinforcement Learning with Large State Spaces [55.14361269378122]
Three major challenges in reinforcement learning are complex dynamical systems with large state spaces, costly data acquisition processes, and the deviation of real-world dynamics from the training environment at deployment.
We study distributionally robust Markov decision processes with continuous state spaces under the widely used Kullback-Leibler, chi-square, and total variation uncertainty sets.
We propose a model-based approach that utilizes Gaussian Processes and the maximum variance reduction algorithm to efficiently learn multi-output nominal transition dynamics.
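The maximum-variance acquisition idea mentioned here has a simple generic form: fit a Gaussian process to observed points and query next where the posterior variance is largest. The following toy sketch uses an RBF kernel and invented names; it illustrates the selection rule only, not the paper's multi-output method.

```python
# Toy sketch of maximum-variance point selection with a Gaussian process.
# Kernel, length scale, and names are illustrative assumptions.
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls)**2)

def gp_posterior_var(x_train, x_query, noise=1e-4):
    """Posterior variance: k(x,x) - k(x,X) (K + noise*I)^{-1} k(X,x)."""
    k_tt = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_qt = rbf(x_query, x_train)
    solved = np.linalg.solve(k_tt, k_qt.T)
    return 1.0 - np.sum(k_qt * solved.T, axis=1)

x_train = np.array([0.1, 0.5, 0.9])          # already-observed states
x_query = np.linspace(0, 1, 101)             # candidate next queries
var = gp_posterior_var(x_train, x_query)
x_next = x_query[np.argmax(var)]             # query where uncertainty is largest
```

Variance collapses near observed points and stays high far from them, so repeated application of this rule spreads samples toward the least-explored regions.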
arXiv Detail & Related papers (2023-09-05T13:42:11Z) - Dynamic Kernel-Based Adaptive Spatial Aggregation for Learned Image Compression [63.56922682378755]
We focus on extending spatial aggregation capability and propose a dynamic kernel-based transform coding.
The proposed adaptive aggregation generates kernel offsets to capture valid information in the content-conditioned range to help transform.
Experimental results demonstrate that our method achieves superior rate-distortion performance on three benchmarks compared to the state-of-the-art learning-based methods.
arXiv Detail & Related papers (2023-08-17T01:34:51Z) - Physics-informed UNets for Discovering Hidden Elasticity in Heterogeneous Materials [0.0]
We develop a novel UNet-based neural network model for inversion in elasticity (El-UNet).
We show superior performance, both in terms of accuracy and computational cost, by El-UNet compared to fully-connected physics-informed neural networks.
arXiv Detail & Related papers (2023-06-01T23:35:03Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Variational Hierarchical Mixtures for Probabilistic Learning of Inverse Dynamics [20.953728061894044]
Well-calibrated probabilistic regression models are a crucial learning component in robotics applications as datasets grow rapidly and tasks become more complex.
We consider a probabilistic hierarchical modeling paradigm that combines the benefits of both worlds to deliver computationally efficient representations with inherent complexity regularization.
We derive two efficient variational inference techniques to learn these representations and highlight the advantages of hierarchical infinite local regression models.
arXiv Detail & Related papers (2022-11-02T13:54:07Z) - A Variational Infinite Mixture for Probabilistic Inverse Dynamics
Learning [34.90240171916858]
We develop an efficient variational Bayes inference technique for infinite mixtures of probabilistic local models.
We highlight the model's power in combining data-driven adaptation, fast prediction and the ability to deal with discontinuous functions and heteroscedastic noise.
We use the learned models for online dynamics control of a Barrett-WAM manipulator, significantly improving the trajectory tracking performance.
arXiv Detail & Related papers (2020-11-10T16:15:13Z) - SODEN: A Scalable Continuous-Time Survival Model through Ordinary
Differential Equation Networks [14.564168076456822]
We propose a flexible model for survival analysis using neural networks along with scalable optimization algorithms.
We demonstrate the effectiveness of the proposed method in comparison to existing state-of-the-art deep learning survival analysis models.
arXiv Detail & Related papers (2020-08-19T19:11:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences of their use.