Deep Learning of Dynamic Subsurface Flow via Theory-guided Generative
Adversarial Network
- URL: http://arxiv.org/abs/2006.13305v1
- Date: Tue, 2 Jun 2020 02:53:26 GMT
- Title: Deep Learning of Dynamic Subsurface Flow via Theory-guided Generative
Adversarial Network
- Authors: Tianhao He and Dongxiao Zhang
- Abstract summary: Theory-guided generative adversarial network (TgGAN) is proposed to solve dynamic partial differential equations (PDEs).
TgGAN is proposed for dynamic subsurface flow with heterogeneous model parameters.
Numerical results demonstrate that the TgGAN model is robust and reliable for deep learning of dynamic PDEs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative adversarial network (GAN) has been shown to be useful in various
applications, such as image recognition, text processing and scientific
computing, due to its strong ability to learn complex data distributions. In this
study, a theory-guided generative adversarial network (TgGAN) is proposed to
solve dynamic partial differential equations (PDEs). Unlike standard GANs,
training is performed not on the true data and the generated data themselves,
but on their residuals. In addition, theories such as governing equations,
other physical constraints, and engineering controls are encoded into the loss
function of the generator to ensure that the prediction not only honors the
training data, but also obeys these theories. TgGAN is proposed for dynamic
subsurface flow with heterogeneous model parameters, and the data at each time
step are treated as a two-dimensional image. In this study, several numerical
cases are introduced to test the performance of the TgGAN. Predicting the
future response, label-free learning and learning from noisy data can be
realized easily by the TgGAN model. The effects of the number of training data
and collocation points are also discussed. To improve the efficiency of TgGAN,
a transfer learning algorithm is also employed.
Numerical results demonstrate that the TgGAN model is robust and reliable for
deep learning of dynamic PDEs.
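To make the loss construction concrete, the following is a minimal sketch, not the paper's implementation: it assumes a simple 1-D diffusion equation u_t = D u_xx as the governing PDE, and the names (Generator, pde_residual, disc, the lambda weights) are illustrative. The generator loss combines an adversarial term, a data-mismatch term on observed points, and a PDE-residual term evaluated at collocation points, which is the general structure described in the abstract.

```python
# Minimal sketch of a theory-guided generator loss (not the paper's code).
# Assumes a 1-D diffusion equation u_t = D * u_xx as the governing PDE;
# Generator, pde_residual, disc, and the lambda weights are illustrative names.
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Maps space-time coordinates (x, t) to the predicted state u(x, t)."""

    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))


def pde_residual(gen, x, t, diffusivity=1.0):
    """Residual u_t - D * u_xx computed with automatic differentiation."""
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    u = gen(x, t)
    ones = torch.ones_like(u)
    u_t = torch.autograd.grad(u, t, ones, create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, ones, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t - diffusivity * u_xx


def generator_loss(gen, disc, x_obs, t_obs, u_obs, x_col, t_col,
                   lam_data=1.0, lam_pde=1.0):
    """Adversarial term + data mismatch + PDE residual at collocation points."""
    u_pred = gen(x_obs, t_obs)
    adv = -disc(u_pred).mean()                           # fool the discriminator
    data = ((u_pred - u_obs) ** 2).mean()                # honor the training data
    pde = (pde_residual(gen, x_col, t_col) ** 2).mean()  # obey the governing PDE
    return adv + lam_data * data + lam_pde * pde
```

In the paper's setting the state at each time step is a two-dimensional image and the adversarial training reportedly operates on residuals rather than the raw data, so the discriminator and the differential operator would differ; only the overall structure of the weighted composite loss carries over.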
Related papers
- DFA-GNN: Forward Learning of Graph Neural Networks by Direct Feedback Alignment [57.62885438406724]
Graph neural networks are recognized for their strong performance across various applications.
Backpropagation (BP) has limitations that challenge its biological plausibility and affect the efficiency, scalability, and parallelism of training neural networks for graph-based tasks.
We propose DFA-GNN, a novel forward learning framework tailored for GNNs with a case study of semi-supervised learning.
arXiv Detail & Related papers (2024-06-04T07:24:51Z) - Incorporating Domain Differential Equations into Graph Convolutional Networks to Lower Generalization Discrepancy [30.249981848630256]
We show how to incorporate domain differential equations into Graph Convolutional Networks (GCNs).
We propose two domain-differential-equation-informed networks, the Reaction-Diffusion Graph Convolutional Network (RDGCN) and the Susceptible-Infectious-Recovered Graph Convolutional Network (SIRGCN).
We experimentally show that RDGCN and SIRGCN are more robust with mismatched testing data than the state-of-the-art deep learning methods.
arXiv Detail & Related papers (2024-04-01T16:17:11Z) - Diffusion-Based Neural Network Weights Generation [80.89706112736353]
D2NWG is a diffusion-based neural network weights generation technique that efficiently produces high-performing weights for transfer learning.
Our method extends generative hyper-representation learning to recast the latent diffusion paradigm for neural network weights generation.
Our approach is scalable to large architectures such as large language models (LLMs), overcoming the limitations of current parameter generation techniques.
arXiv Detail & Related papers (2024-02-28T08:34:23Z) - Recurrent neural networks and transfer learning for elasto-plasticity in
woven composites [0.0]
This article presents Recurrent Neural Network (RNN) models as a surrogate for computationally intensive meso-scale simulation of woven composites.
A mean-field model generates a comprehensive data set representing elasto-plastic behavior.
In simulations, arbitrary six-dimensional strain histories are used to predict stresses, with random walking as the source task and cyclic loading conditions as the target task.
arXiv Detail & Related papers (2023-11-22T14:47:54Z) - Tipping Points of Evolving Epidemiological Networks: Machine
Learning-Assisted, Data-Driven Effective Modeling [0.0]
We study the tipping point collective dynamics of an adaptive susceptible-infected-susceptible (SIS) epidemiological network in a data-driven, machine learning-assisted manner.
We identify a complex effective stochastic differential equation (eSDE) in terms of physically meaningful coarse mean-field variables.
We study the statistics of rare events both through repeated brute force simulations and by using established mathematical/computational tools.
arXiv Detail & Related papers (2023-11-01T19:33:03Z) - Training Deep Surrogate Models with Large Scale Online Learning [48.7576911714538]
Deep learning algorithms have emerged as a viable alternative for obtaining fast solutions for PDEs.
Models are usually trained on synthetic data generated by solvers, stored on disk, and read back for training.
This paper proposes an open source online training framework for deep surrogate models.
arXiv Detail & Related papers (2023-06-28T12:02:27Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs can suffer training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Convolutional generative adversarial imputation networks for
spatio-temporal missing data in storm surge simulations [86.5302150777089]
Generative Adversarial Imputation Nets (GAIN) and GAN-based techniques have attracted attention as unsupervised machine learning methods.
We name our proposed method Convolutional Generative Adversarial Imputation Nets (Conv-GAIN).
arXiv Detail & Related papers (2021-11-03T03:50:48Z) - Understanding Self-supervised Learning with Dual Deep Networks [74.92916579635336]
We propose a novel framework to understand contrastive self-supervised learning (SSL) methods that employ dual pairs of deep ReLU networks.
We prove that in each SGD update of SimCLR with various loss functions, the weights at each layer are updated by a covariance operator.
To further study what role the covariance operator plays and which features are learned in such a process, we model data generation and augmentation processes through a hierarchical latent tree model (HLTM).
arXiv Detail & Related papers (2020-10-01T17:51:49Z) - Turbulence Enrichment using Physics-informed Generative Adversarial
Networks [0.0]
We develop methods for generative enrichment of turbulence.
We incorporate a physics-informed learning approach by a modification to the loss function.
We show that using physics-informed learning can also significantly improve the model's ability to generate data that satisfies the physical governing equations.
arXiv Detail & Related papers (2020-03-04T06:14:11Z)