Accelerated Solutions of Coupled Phase-Field Problems using Generative
Adversarial Networks
- URL: http://arxiv.org/abs/2211.12084v2
- Date: Wed, 23 Nov 2022 05:42:58 GMT
- Title: Accelerated Solutions of Coupled Phase-Field Problems using Generative
Adversarial Networks
- Authors: Vir Karan, A. Maruthi Indresh, Saswata Bhattacharyya
- Abstract summary: We develop a new neural network-based framework that uses encoder-decoder based conditional Generative Adversarial Networks with ConvLSTM layers to solve a coupled system of Cahn-Hilliard equations governing microstructural evolution.
We show that the trained models are mesh- and scale-independent, thereby warranting application as effective neural operators.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multiphysics problems such as multicomponent diffusion, phase transformations
in multiphase systems and alloy solidification involve numerical solution of a
coupled system of nonlinear partial differential equations (PDEs). Numerical
solutions of these PDEs using mesh-based methods require spatiotemporal
discretization of these equations. Hence, the numerical solutions are often
sensitive to discretization parameters and may have inaccuracies (resulting
from grid-based approximations). Moreover, the choice of a finer mesh for higher
accuracy makes these methods computationally expensive. Neural network-based PDE
solvers are emerging as robust alternatives to conventional numerical methods
because these use machine learnable structures that are grid-independent, fast
and accurate. However, neural network-based solvers require large amounts of
training data, limiting their generalizability and scalability. These
concerns become more acute for coupled systems of time-dependent PDEs. To
address these issues, we develop a new neural network based framework that uses
encoder-decoder based conditional Generative Adversarial Networks with ConvLSTM
layers to solve a system of Cahn-Hilliard equations. These equations govern
microstructural evolution of a ternary alloy undergoing spinodal decomposition
when quenched inside a three-phase miscibility gap. We show that the trained
models are mesh- and scale-independent, thereby warranting application as
effective neural operators.
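As context for the mesh-based solvers the abstract contrasts with, a minimal explicit finite-difference update for a single-component Cahn-Hilliard equation, ∂c/∂t = ∇·(M ∇μ) with μ = f'(c) − κ ∇²c, can be sketched as follows. This is a binary (not ternary) toy version with an assumed double-well free energy f(c) = c²(1−c)²; the paper's actual ternary model and parameters are not given in this summary.

```python
import numpy as np

def lap(a, dx=1.0):
    # 5-point Laplacian with periodic boundary conditions
    return (np.roll(a, 1, axis=0) + np.roll(a, -1, axis=0) +
            np.roll(a, 1, axis=1) + np.roll(a, -1, axis=1) - 4.0 * a) / dx**2

def cahn_hilliard_step(c, dt=0.01, M=1.0, kappa=1.0):
    # Chemical potential mu = f'(c) - kappa * lap(c),
    # with assumed double-well free energy f(c) = c^2 (1 - c)^2
    mu = 2.0 * c * (1.0 - c) * (1.0 - 2.0 * c) - kappa * lap(c)
    # Conservative update: dc/dt = div(M grad mu)
    return c + dt * M * lap(mu)

# Quench near the spinodal composition and evolve
rng = np.random.default_rng(0)
c = 0.5 + 0.01 * rng.standard_normal((32, 32))
mass0 = c.sum()
for _ in range(200):
    c = cahn_hilliard_step(c)
# The divergence form conserves total composition up to round-off
```

Note the stability restriction of such explicit schemes (time step scaling roughly with dx⁴ for this fourth-order operator), which is one reason learned surrogates such as the ConvLSTM-GAN described above, which map early microstructure frames directly to later ones, can offer large speedups.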
Related papers
- Solving partial differential equations with sampled neural networks [1.8590821261905535]
Approximation of solutions to partial differential equations (PDE) is an important problem in computational science and engineering.
We discuss how sampling the hidden weights and biases of the ansatz network from data-agnostic and data-dependent probability distributions allows us to progress on both challenges.
arXiv Detail & Related papers (2024-05-31T14:24:39Z)
- Physics-Informed Generator-Encoder Adversarial Networks with Latent Space Matching for Stochastic Differential Equations [14.999611448900822]
We propose a new class of physics-informed neural networks to address the challenges posed by forward, inverse, and mixed problems in differential equations.
Our model consists of two key components: the generator and the encoder, both updated alternately by gradient descent.
In contrast to previous approaches, we employ an indirect matching that operates within the lower-dimensional latent feature space.
arXiv Detail & Related papers (2023-11-03T04:29:49Z)
- Slow Invariant Manifolds of Singularly Perturbed Systems via Physics-Informed Machine Learning [0.0]
We present a physics-informed machine-learning (PIML) approach for the approximation of slow invariant manifolds (SIMs) of singularly perturbed systems.
We show that the proposed PIML scheme provides approximations of equivalent or even higher accuracy than those provided by other traditional GSPT-based methods.
A comparison of the computational costs between symbolic, automatic and numerical approximation of the required derivatives in the learning process is also provided.
arXiv Detail & Related papers (2023-09-14T14:10:22Z)
- Deep Learning-based surrogate models for parametrized PDEs: handling geometric variability through graph neural networks [0.0]
This work explores the potential usage of graph neural networks (GNNs) for the simulation of time-dependent PDEs.
We propose a systematic strategy to build surrogate models based on a data-driven time-stepping scheme.
We show that GNNs can provide a valid alternative to traditional surrogate models in terms of computational efficiency and generalization to new scenarios.
arXiv Detail & Related papers (2023-08-03T08:14:28Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver which prevents the network from getting ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
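Several of the graph-based solvers listed above (e.g. Message Passing Neural PDE Solvers and the multipole graph framework) share a common message-passing primitive: each node updates its state from learned messages aggregated over its neighbors. A generic sketch follows; this is not the exact architecture of any of these papers, and the weight shapes and nonlinearities are illustrative assumptions.

```python
import numpy as np

def message_passing_step(h, edges, W_msg, W_upd):
    """One generic message-passing update.

    h     : (N, d) array of node states (e.g. PDE field values at mesh nodes)
    edges : iterable of (src, dst) index pairs
    W_msg : (d, 2*d) message weights (illustrative assumption)
    W_upd : (d, d) node-update weights (illustrative assumption)
    """
    agg = np.zeros_like(h)
    for u, v in edges:
        # Message from u to v, computed from both endpoint states
        m = np.maximum(0.0, W_msg @ np.concatenate([h[u], h[v]]))  # ReLU
        agg[v] += m  # permutation-invariant sum aggregation
    # Node update combines each node's own state with its aggregated messages
    return np.tanh(h @ W_upd.T + agg)

# Toy usage: 4 nodes on a bidirectional ring, 2 features per node
rng = np.random.default_rng(0)
h = rng.standard_normal((4, 2))
edges = [(i, (i + 1) % 4) for i in range(4)] + [((i + 1) % 4, i) for i in range(4)]
W_msg = 0.1 * rng.standard_normal((2, 4))
W_upd = 0.1 * rng.standard_normal((2, 2))
h_next = message_passing_step(h, edges, W_msg, W_upd)
```

Because the same message function is applied to every edge, such updates are independent of mesh size and connectivity, which is what lets these models generalize across discretizations.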
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences arising from its use.