Physics informed neural networks for continuum micromechanics
- URL: http://arxiv.org/abs/2110.07374v1
- Date: Thu, 14 Oct 2021 14:05:19 GMT
- Title: Physics informed neural networks for continuum micromechanics
- Authors: Alexander Henkes, Henning Wessels, Rolf Mahnken
- Abstract summary: Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to the global approximation, physics informed neural networks have difficulty resolving localized effects and strongly non-linear solutions via optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT-scans.
- Score: 68.8204255655161
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, physics informed neural networks have successfully been applied to
a broad variety of problems in applied mathematics and engineering. The
principal idea is to use a neural network as a global ansatz function for
partial differential equations. Due to the global approximation, physics
informed neural networks have difficulty resolving localized effects and
strongly non-linear solutions via optimization. In this work we consider material
non-linearities invoked by material inhomogeneities with sharp phase
interfaces. This constitutes a challenging problem for a method relying on a
global ansatz. To overcome convergence issues, adaptive training strategies and
domain decomposition are studied. It is shown that the domain decomposition
approach is able to accurately resolve nonlinear stress, displacement and
energy fields in heterogeneous microstructures obtained from real-world
$\mu$CT-scans.
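The principle stated in the abstract, a global ansatz whose PDE residual is minimized at collocation points by optimization, can be illustrated in miniature. The sketch below is a hedged toy example, not the paper's method: the neural network is replaced by a two-term sine ansatz, derivatives are written out by hand instead of via automatic differentiation, and the PDE u''(x) = -pi^2 sin(pi x) is an illustrative choice.

```python
import math

# Toy "PINN": minimize the PDE residual of a global ansatz at collocation
# points by gradient descent. Problem (illustrative, not from the paper):
#   u''(x) = -pi^2 * sin(pi * x) on (0, 1), u(0) = u(1) = 0,
# with exact solution u(x) = sin(pi * x).
xs = [i / 51 for i in range(1, 51)]            # interior collocation points
rhs = [-math.pi ** 2 * math.sin(math.pi * x) for x in xs]

# Global ansatz u(x) = a*sin(pi*x) + b*sin(2*pi*x); the boundary conditions
# hold by construction. A real PINN would use a neural network here and
# obtain u'' via automatic differentiation.
def d2u(a, b, x):
    return (-a * math.pi ** 2 * math.sin(math.pi * x)
            - b * 4.0 * math.pi ** 2 * math.sin(2.0 * math.pi * x))

a, b, lr = 0.0, 0.0, 1e-5                      # small step: residuals are O(pi^2)
for _ in range(2000):
    grad_a = grad_b = 0.0
    for x, f in zip(xs, rhs):
        r = d2u(a, b, x) - f                   # pointwise PDE residual
        grad_a += 2.0 * r * (-math.pi ** 2 * math.sin(math.pi * x))
        grad_b += 2.0 * r * (-4.0 * math.pi ** 2 * math.sin(2.0 * math.pi * x))
    a -= lr * grad_a
    b -= lr * grad_b
# Gradient descent drives the residual toward zero, recovering a ≈ 1, b ≈ 0,
# i.e. u(x) ≈ sin(pi * x).
```

The same loop with a neural-network ansatz, automatic differentiation, and one network per subdomain stitched together by interface losses is, in essence, the domain decomposition setup the abstract refers to.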
Related papers
- An Analysis of Physics-Informed Neural Networks [0.0]
We present a new approach to approximating the solutions of physical systems: physics-informed neural networks.
The concept of artificial neural networks is introduced, the objective function is defined, and optimisation strategies are discussed.
The partial differential equation is then included as a constraint in the loss function for the problem, giving the network access to knowledge of the dynamics of the physical system it is modelling.
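Concretely, including the differential equation as a constraint in the loss function means adding a mean squared residual term to the data/boundary misfit. The following is a minimal, hedged sketch of that composite loss, using the toy ODE u'(x) = u(x) with u(0) = 1 and a hand-written derivative in place of automatic differentiation; the names, the ODE, and the weighting are illustrative assumptions, not taken from the paper.

```python
import math

def u_hat(x, theta):
    # Candidate solution: u_hat(x) = theta * exp(x); exact when theta = u(0).
    return theta * math.exp(x)

def du_hat(x, theta):
    # Derivative of the ansatz (supplied by automatic differentiation in practice).
    return theta * math.exp(x)

def pinn_loss(theta, u0=1.0, weight=1.0):
    # Data/boundary misfit: enforce the initial condition u(0) = u0.
    boundary = (u_hat(0.0, theta) - u0) ** 2
    # Physics term: mean squared residual of u'(x) - u(x) at collocation points.
    xs = [i / 10 for i in range(11)]
    residual = sum((du_hat(x, theta) - u_hat(x, theta)) ** 2 for x in xs) / len(xs)
    return boundary + weight * residual        # composite PINN loss

# For this ansatz the residual vanishes identically, so the loss is minimized
# exactly when the boundary condition holds, i.e. theta = 1.
```

In practice the ansatz is a neural network, the derivative comes from automatic differentiation, and the weight balances the data and physics terms during training.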
arXiv Detail & Related papers (2023-03-06T04:45:53Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
PINNs suffer training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Understanding the Difficulty of Training Physics-Informed Neural
Networks on Dynamical Systems [5.878411350387833]
Physics-informed neural networks (PINNs) seamlessly integrate data and physical constraints into the solving of problems governed by differential equations.
We study the physics loss function in the vicinity of fixed points of dynamical systems.
We find that reducing the computational domain lowers the optimization complexity and the chance of getting trapped in nonphysical solutions.
arXiv Detail & Related papers (2022-03-25T13:50:14Z) - Physics-informed ConvNet: Learning Physical Field from a Shallow Neural
Network [0.180476943513092]
Modelling and forecasting multi-physical systems remain a challenge due to unavoidable data scarcity and noise.
A new framework named physics-informed convolutional network (PICN) is proposed from a CNN perspective.
PICN may become an alternative neural network solver in physics-informed machine learning.
arXiv Detail & Related papers (2022-01-26T14:35:58Z) - DeepPhysics: a physics aware deep learning framework for real-time
simulation [0.0]
We propose a solution to simulate hyper-elastic materials using a data-driven approach.
A neural network is trained to learn the non-linear relationship between boundary conditions and the resulting displacement field.
The results show that our network architecture trained with a limited amount of data can predict the displacement field in less than a millisecond.
arXiv Detail & Related papers (2021-09-17T12:15:47Z) - Conditional physics informed neural networks [85.48030573849712]
We introduce conditional PINNs (physics informed neural networks) for estimating the solution of classes of eigenvalue problems.
We show that a single deep neural network can learn the solution of partial differential equations for an entire class of problems.
arXiv Detail & Related papers (2021-04-06T18:29:14Z) - Analysis of three dimensional potential problems in non-homogeneous
media with physics-informed deep collocation method using material transfer
learning and sensitivity analysis [1.5749416770494704]
This work utilizes a physics informed neural network with material transfer learning reducing the solution of the nonhomogeneous partial differential equations to an optimization problem.
A material transfer learning technique is utilised for nonhomogeneous media with different material gradations and parameters, which enhances the generality and robustness of the proposed method.
arXiv Detail & Related papers (2020-10-03T20:29:25Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - Generalization bound of globally optimal non-convex neural network
training: Transportation map estimation by infinite dimensional Langevin
dynamics [50.83356836818667]
We introduce a new theoretical framework to analyze deep learning optimization with connection to its generalization error.
Existing frameworks, such as mean-field theory and neural tangent kernel theory for neural network optimization analysis, typically require taking the limit of infinite network width to show global convergence.
arXiv Detail & Related papers (2020-07-11T18:19:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.