Solving Forward and Inverse Problems of Contact Mechanics using
Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2308.12716v1
- Date: Thu, 24 Aug 2023 11:31:24 GMT
- Title: Solving Forward and Inverse Problems of Contact Mechanics using
Physics-Informed Neural Networks
- Authors: T. Sahin, M. von Danwitz, A. Popp
- Abstract summary: We deploy PINNs in a mixed-variable formulation enhanced by output transformation to enforce hard and soft constraints.
We show that PINNs can serve as pure partial differential equation (PDE) solver, as data-enhanced forward model, and as fast-to-evaluate surrogate model.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: This paper explores the ability of physics-informed neural networks (PINNs)
to solve forward and inverse problems of contact mechanics for small
deformation elasticity. We deploy PINNs in a mixed-variable formulation
enhanced by output transformation to enforce Dirichlet and Neumann boundary
conditions as hard constraints. Inequality constraints of contact problems,
namely Karush-Kuhn-Tucker (KKT) type conditions, are enforced as soft
constraints by incorporating them into the loss function during network
training. To formulate the loss function contribution of KKT constraints,
existing approaches applied to elastoplasticity problems are investigated and
we explore a nonlinear complementarity problem (NCP) function, namely
Fischer-Burmeister, which possesses advantageous characteristics in terms of
optimization. Based on the Hertzian contact problem, we show that PINNs can
serve as pure partial differential equation (PDE) solver, as data-enhanced
forward model, as inverse solver for parameter identification, and as
fast-to-evaluate surrogate model. Furthermore, we demonstrate the importance of
choosing proper hyperparameters, e.g. loss weights, and a combination of Adam
and L-BFGS-B optimizers aiming for better results in terms of accuracy and
training time.
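The two constraint-handling ideas in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's code: the function names are hypothetical, and a real PINN would apply the same expressions to autodiff-framework tensors rather than NumPy arrays.

```python
import numpy as np

def fischer_burmeister(a, b):
    # NCP function: phi(a, b) = a + b - sqrt(a^2 + b^2).
    # phi(a, b) = 0 exactly when a >= 0, b >= 0 and a * b = 0,
    # i.e. when the KKT-type complementarity conditions hold.
    return a + b - np.sqrt(a**2 + b**2)

def kkt_contact_loss(gap, pressure):
    # Soft-constraint loss for the contact conditions (non-penetration
    # gap >= 0, compressive contact pressure >= 0, gap * pressure = 0),
    # averaged over collocation points on the potential contact boundary.
    return np.mean(fischer_burmeister(gap, pressure) ** 2)

def hard_bc_output(g, d, net_out):
    # Output transformation enforcing Dirichlet conditions as hard
    # constraints: u = g + d * N, where g matches the prescribed boundary
    # values and d vanishes on the Dirichlet boundary, so u satisfies the
    # boundary condition by construction for any network output N.
    return g + d * net_out
```

The loss term vanishes exactly on point sets that satisfy the KKT conditions and is smooth almost everywhere, which is the optimization-friendly property the abstract attributes to the Fischer-Burmeister function.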
Related papers
- DeltaPhi: Learning Physical Trajectory Residual for PDE Solving [54.13671100638092]
We propose and formulate Physical Trajectory Residual Learning (DeltaPhi).
We learn the surrogate model for the residual operator mapping based on existing neural operator networks.
We conclude that, compared to direct learning, physical residual learning is preferred for PDE solving.
arXiv Detail & Related papers (2024-06-14T07:45:07Z)
- Physics-aware deep learning framework for linear elasticity [0.0]
The paper presents an efficient and robust data-driven deep learning (DL) computational framework for linear continuum elasticity problems.
For an accurate representation of the field variables, a multi-objective loss function is proposed.
Several benchmark problems, including the Airy solution to elasticity and the Kirchhoff-Love plate problem, are solved.
arXiv Detail & Related papers (2023-02-19T20:33:32Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Physics-Informed Neural Network Method for Parabolic Differential Equations with Sharply Perturbed Initial Conditions [68.8204255655161]
We develop a physics-informed neural network (PINN) model for parabolic problems with a sharply perturbed initial condition.
Localized large gradients in the advection-dispersion equation (ADE) solution make the Latin hypercube sampling of the equation's residual, common in PINNs, highly inefficient.
We propose criteria for weights in the loss function that produce a more accurate PINN solution than those obtained with the weights selected via other methods.
arXiv Detail & Related papers (2022-08-18T05:00:24Z)
- Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z)
- Mitigating Learning Complexity in Physics and Equality Constrained Artificial Neural Networks [0.9137554315375919]
Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs).
In PINNs, the residual form of the PDE of interest and its boundary conditions are lumped into a composite objective function as soft penalties.
Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach when applied to different kinds of PDEs.
arXiv Detail & Related papers (2022-06-19T04:12:01Z)
- Lagrangian PINNs: A causality-conforming solution to failure modes of physics-informed neural networks [5.8010446129208155]
Physics-informed neural networks (PINNs) leverage neural-networks to find the solutions of partial differential equation (PDE)-constrained optimization problems.
We show that the challenge of training persists even when the boundary conditions are strictly enforced.
We propose reformulating PINNs on a Lagrangian frame of reference, i.e., LPINNs, as a PDE-informed solution.
arXiv Detail & Related papers (2022-05-05T19:48:05Z)
- Enhanced Physics-Informed Neural Networks with Augmented Lagrangian Relaxation Method (AL-PINNs) [1.7403133838762446]
Physics-Informed Neural Networks (PINNs) are powerful approximators of solutions to nonlinear partial differential equations (PDEs).
We propose an Augmented Lagrangian relaxation method for PINNs (AL-PINNs).
We demonstrate through various numerical experiments that AL-PINNs yield a much smaller relative error compared with that of state-of-the-art adaptive loss-balancing algorithms.
arXiv Detail & Related papers (2022-04-29T08:33:11Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
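The Stein's-identity trick summarized above can be illustrated with a small Monte Carlo sketch. This is a plausible reconstruction of the estimator, not the paper's actual implementation; the function name and sampling parameters are assumptions.

```python
import numpy as np

def smoothed_second_derivative(f, x, sigma=0.1, n_samples=200_000, seed=0):
    # Second derivative of the Gaussian-smoothed function
    # f_sigma(x) = E[f(x + sigma * eps)], eps ~ N(0, 1), via Stein's identity:
    #   f_sigma''(x) = E[(f(x + sigma * eps) - f(x)) * (eps**2 - 1)] / sigma**2.
    # Subtracting f(x) exploits E[eps**2 - 1] = 0 as a variance-reducing
    # baseline. Only forward evaluations of f are needed, so no stacked
    # back-propagation through the network is required.
    eps = np.random.default_rng(seed).standard_normal(n_samples)
    return np.mean((f(x + sigma * eps) - f(x)) * (eps**2 - 1)) / sigma**2
```

For f(x) = x**2 the smoothed second derivative is exactly 2 everywhere, which the estimator recovers up to Monte Carlo noise.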
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Physics-Informed Neural Operator for Learning Partial Differential Equations [55.406540167010014]
PINO is the first hybrid approach incorporating data and PDE constraints at different resolutions to learn the operator.
The resulting PINO model can accurately approximate the ground-truth solution operator for many popular PDE families.
arXiv Detail & Related papers (2021-11-06T03:41:34Z)
- Physics and Equality Constrained Artificial Neural Networks: Application to Partial Differential Equations [1.370633147306388]
Physics-informed neural networks (PINNs) have been proposed to learn the solution of partial differential equations (PDEs).
Here, we show that this specific way of formulating the objective function is the source of severe limitations in the PINN approach.
We propose a versatile framework that can tackle both inverse and forward problems.
arXiv Detail & Related papers (2021-09-30T05:55:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.