A deep learning framework for solution and discovery in solid mechanics
- URL: http://arxiv.org/abs/2003.02751v2
- Date: Wed, 6 May 2020 20:42:15 GMT
- Title: A deep learning framework for solution and discovery in solid mechanics
- Authors: Ehsan Haghighat, Maziar Raissi, Adrian Moure, Hector Gomez, Ruben
Juanes
- Abstract summary: We present the application of a class of deep learning methods, known as Physics Informed Neural Networks (PINN), to learning and discovery in solid mechanics.
We explain how to incorporate the momentum balance and constitutive relations into PINN, and explore in detail the application to linear elasticity.
- Score: 1.4699455652461721
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present the application of a class of deep learning methods, known as Physics Informed Neural Networks (PINN), to learning and discovery in solid mechanics.
We explain how to incorporate the momentum balance and constitutive relations
into PINN, and explore in detail the application to linear elasticity, and
illustrate its extension to nonlinear problems through an example that showcases von Mises elastoplasticity. While common PINN algorithms are based on training one deep neural network (DNN), we propose a multi-network model that results in a more accurate representation of the field variables. To validate the
model, we test the framework on synthetic data generated from analytical and
numerical reference solutions. We study convergence of the PINN model, and show
that Isogeometric Analysis (IGA) results in superior accuracy and convergence
characteristics compared with classic low-order Finite Element Method (FEM). We
also show the applicability of the framework for transfer learning, and find
vastly accelerated convergence during network re-training. Finally, we find that honoring the physics leads to improved robustness: when trained only on a few parameters, the PINN model can accurately predict the solution for a wide range of parameters new to the network, thus pointing to an important application of this framework to sensitivity analysis and surrogate modeling.
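To make the construction described in the abstract concrete, below is a minimal sketch of how the momentum balance and constitutive relations can enter a multi-network PINN loss for 2D plane-strain linear elasticity. This is an illustration under stated assumptions, not the authors' implementation: PyTorch is assumed, one small network is used per field component, the Lamé parameters lam and mu are kept trainable to illustrate the parameter-identification ("discovery") mode, and all names (mlp, pinn_loss, etc.) are hypothetical.

```python
# Illustrative sketch (assumed PyTorch; not the authors' code): multi-network
# PINN loss for 2D plane-strain linear elasticity, with trainable Lame parameters.
import torch
import torch.nn as nn

def mlp(width=50, depth=4):
    """Small fully connected network mapping (x, y) -> one field component."""
    layers, in_dim = [], 2
    for _ in range(depth):
        layers += [nn.Linear(in_dim, width), nn.Tanh()]
        in_dim = width
    return nn.Sequential(*layers, nn.Linear(in_dim, 1))

# One network per field variable, as in a multi-network PINN.
nets = nn.ModuleDict({k: mlp() for k in ["ux", "uy", "sxx", "syy", "sxy"]})

# Lame parameters as trainable scalars (pass them to the optimizer together
# with nets.parameters() when identifying material properties from data).
lam = nn.Parameter(torch.tensor(1.0))
mu = nn.Parameter(torch.tensor(1.0))

def grad(f, x):
    """Batched d f / d x, keeping the graph so residuals stay differentiable."""
    return torch.autograd.grad(f, x, grad_outputs=torch.ones_like(f),
                               create_graph=True)[0]

def pinn_loss(xy, u_data, fx, fy):
    """xy: (N,2) collocation/data points; u_data: (N,2) measured displacements;
    fx, fy: body-force components at xy (often zero)."""
    xy = xy.requires_grad_(True)
    ux, uy = nets["ux"](xy), nets["uy"](xy)
    sxx, syy, sxy = nets["sxx"](xy), nets["syy"](xy), nets["sxy"](xy)

    dux, duy = grad(ux, xy), grad(uy, xy)          # each (N, 2)
    exx, eyy = dux[:, :1], duy[:, 1:]              # normal strains
    exy = 0.5 * (dux[:, 1:] + duy[:, :1])          # shear strain

    # Constitutive residual: sigma - (lam * tr(eps) * I + 2 * mu * eps)
    tr = exx + eyy
    r_con = ((sxx - (lam * tr + 2 * mu * exx)) ** 2 +
             (syy - (lam * tr + 2 * mu * eyy)) ** 2 +
             (sxy - 2 * mu * exy) ** 2).mean()

    # Momentum-balance residual: div(sigma) + f = 0
    r_mom = ((grad(sxx, xy)[:, :1] + grad(sxy, xy)[:, 1:] + fx) ** 2 +
             (grad(sxy, xy)[:, :1] + grad(syy, xy)[:, 1:] + fy) ** 2).mean()

    # Data mismatch on the displacement field
    r_dat = ((ux - u_data[:, :1]) ** 2 + (uy - u_data[:, 1:]) ** 2).mean()
    return r_con + r_mom + r_dat
```

In practice the residual terms would be weighted, boundary conditions added as further loss terms, and the whole loss minimized with a standard optimizer; none of those details are prescribed here.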
Related papers
- Adapting Physics-Informed Neural Networks for Bifurcation Detection in Ecological Migration Models [0.16442870218029523]
In this study, we explore the application of Physics-Informed Neural Networks (PINNs) to the analysis of bifurcation phenomena in ecological migration models.
By integrating the fundamental principles of diffusion-advection-reaction equations with deep learning techniques, we address the complexities of species migration dynamics.
arXiv Detail & Related papers (2024-09-01T08:00:31Z)
- An Efficient Approach to Regression Problems with Tensor Neural Networks [5.345144592056051]
This paper introduces a tensor neural network (TNN) to address nonparametric regression problems.
The TNN demonstrates superior performance compared to conventional Feed-Forward Networks (FFN) and Radial Basis Function Networks (RBN).
A significant innovation in our approach is the integration of statistical regression and numerical integration within the TNN framework.
arXiv Detail & Related papers (2024-06-14T03:38:40Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Physics-aware deep learning framework for linear elasticity [0.0]
The paper presents an efficient and robust data-driven deep learning (DL) computational framework for linear continuum elasticity problems.
For an accurate representation of the field variables, a multi-objective loss function is proposed.
Several benchmark problems, including the Airy solution to elasticity and the Kirchhoff-Love plate problem, are solved.
arXiv Detail & Related papers (2023-02-19T20:33:32Z)
- ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z)
- Learning to Learn with Generative Models of Neural Network Checkpoints [71.06722933442956]
We construct a dataset of neural network checkpoints and train a generative model on the parameters.
We find that our approach successfully generates parameters for a wide range of loss prompts.
We apply our method to different neural network architectures and tasks in supervised and reinforcement learning.
arXiv Detail & Related papers (2022-09-26T17:59:58Z)
- Improving Parametric Neural Networks for High-Energy Physics (and Beyond) [0.0]
We aim at deepening the understanding of Parametric Neural Networks (pNNs) in light of real-world usage.
We propose an alternative parametrization scheme, resulting in a new parametrized neural network architecture: the AffinePNN.
We extensively evaluate our models on the HEPMASS dataset, along with its imbalanced version (called HEPMASS-IMB).
arXiv Detail & Related papers (2022-02-01T14:18:43Z)
- Fusing the Old with the New: Learning Relative Camera Pose with Geometry-Guided Uncertainty [91.0564497403256]
We present a novel framework that involves probabilistic fusion between the two families of predictions during network training.
Our network features a self-attention graph neural network, which drives the learning by enforcing strong interactions between different correspondences.
We propose motion parameterizations suitable for learning and show that our method achieves state-of-the-art performance on the challenging DeMoN and ScanNet datasets.
arXiv Detail & Related papers (2021-04-16T17:59:06Z)
- Stochastic analysis of heterogeneous porous material with modified neural architecture search (NAS) based physics-informed neural networks using transfer learning [0.0]
A modified neural architecture search (NAS) based physics-informed deep learning model is presented.
A three dimensional flow model is built to provide a benchmark to the simulation of groundwater flow in highly heterogeneous aquifers.
arXiv Detail & Related papers (2020-10-03T19:57:54Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent (a generic sketch of such a min-max setup is given after this list).
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
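As referenced in the last entry above, here is a generic, hedged sketch of that kind of min-max training: a structural function f and a test (critic) function u, both small neural networks, trained by alternating gradient steps on one standard adversarial reformulation of a conditional moment restriction E[Y - f(X) | Z] = 0. The objective, network sizes, and names (f_net, u_net, game_value) are illustrative assumptions, not the paper's exact formulation.

```python
# Generic, illustrative min-max sketch (assumed PyTorch; not the paper's exact
# objective): estimate a structural function f under E[Y - f(X) | Z] = 0
# using two neural-network "players" trained by alternating gradient steps.
import torch
import torch.nn as nn

def mlp(in_dim):
    return nn.Sequential(nn.Linear(in_dim, 64), nn.Tanh(),
                         nn.Linear(64, 64), nn.Tanh(),
                         nn.Linear(64, 1))

f_net = mlp(1)   # structural function f(X): the minimizing player
u_net = mlp(1)   # test/critic function u(Z): the maximizing player

opt_f = torch.optim.Adam(f_net.parameters(), lr=1e-3)
opt_u = torch.optim.Adam(u_net.parameters(), lr=1e-3)

def game_value(x, y, z):
    # One standard zero-sum reformulation of the moment condition:
    #   min_f max_u  E[(Y - f(X)) u(Z)] - 0.5 E[u(Z)^2]
    resid = y - f_net(x)
    u = u_net(z)
    return (resid * u - 0.5 * u ** 2).mean()

def train_step(x, y, z):
    # Ascent step on the critic u ...
    opt_u.zero_grad()
    (-game_value(x, y, z)).backward()
    opt_u.step()
    # ... then a descent step on the structural function f.
    opt_f.zero_grad()
    game_value(x, y, z).backward()
    opt_f.step()
```

In a real application one would typically take several critic steps per descent step and regularize the critic; those choices are left open here.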