A physics-informed deep neural network for surrogate modeling in classical elasto-plasticity
- URL: http://arxiv.org/abs/2204.12088v1
- Date: Tue, 26 Apr 2022 05:58:13 GMT
- Title: A physics-informed deep neural network for surrogate modeling in classical elasto-plasticity
- Authors: Mahdad Eghbalian, Mehdi Pouragha, Richard Wan
- Abstract summary: We present a deep neural network architecture that can efficiently approximate classical elasto-plastic relations.
The network is enriched with crucial physics aspects of classical elasto-plasticity, including additive decomposition of strains into elastic and plastic parts.
We show that embedding these physics into the architecture of the neural network facilitates a more efficient training of the network with less training data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we present a deep neural network architecture that can
efficiently approximate classical elasto-plastic constitutive relations. The
network is enriched with crucial physics aspects of classical
elasto-plasticity, including additive decomposition of strains into elastic and
plastic parts, and nonlinear incremental elasticity. This leads to a
Physics-Informed Neural Network (PINN) surrogate model named here as
Elasto-Plastic Neural Network (EPNN). Detailed analyses show that embedding
these physics into the architecture of the neural network facilitates a more
efficient training of the network with less training data, while also enhancing
the extrapolation capability for loading regimes outside the training data. The
architecture of EPNN is model and material-independent, i.e. it can be adapted
to a wide range of elasto-plastic material types, including geomaterials and
metals; and experimental data can potentially be directly used in training the
network. To demonstrate the robustness of the proposed architecture, we adapt
its general framework to the elasto-plastic behavior of sands. We use synthetic
data generated from material point simulations based on a relatively advanced
dilatancy-based constitutive model for granular materials to train the neural
network. The superiority of EPNN over regular neural network architectures is
explored through predicting unseen strain-controlled loading paths for sands
with different initial densities.
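To make the physics embedding described in the abstract concrete, the following minimal sketch, written here purely for illustration (not the authors' released code; all sub-network names, layer sizes, and input choices are assumptions), hard-wires the additive decomposition of the strain increment into elastic and plastic parts: one hypothetical sub-network predicts the plastic increment, and the elastic remainder drives a state-dependent incremental stiffness.

```python
# Illustrative EPNN-style surrogate sketch (assumed architecture, not the paper's code).
# The split d_eps = d_eps_e + d_eps_p is enforced by construction rather than learned.
import torch
import torch.nn as nn


class EPNNSketch(nn.Module):
    def __init__(self, n_strain=6, hidden=64):
        super().__init__()
        # predicts the plastic part of the strain increment from the current state
        self.plastic_net = nn.Sequential(
            nn.Linear(3 * n_strain, hidden), nn.Tanh(),
            nn.Linear(hidden, n_strain),
        )
        # predicts a state-dependent tangent stiffness (flattened n_strain x n_strain)
        self.stiffness_net = nn.Sequential(
            nn.Linear(2 * n_strain, hidden), nn.Tanh(),
            nn.Linear(hidden, n_strain * n_strain),
        )

    def forward(self, stress, strain, d_strain):
        state = torch.cat([stress, strain], dim=-1)
        # plastic strain increment inferred from the state and the total strain increment
        d_eps_p = self.plastic_net(torch.cat([state, d_strain], dim=-1))
        # additive decomposition enforced architecturally
        d_eps_e = d_strain - d_eps_p
        # nonlinear incremental elasticity: d_sigma = D(state) : d_eps_e
        D = self.stiffness_net(state).view(-1, d_strain.shape[-1], d_strain.shape[-1])
        d_stress = torch.einsum("bij,bj->bi", D, d_eps_e)
        return stress + d_stress, d_eps_p


# usage: a batch of 8 material points with 6 Voigt strain/stress components
model = EPNNSketch()
sigma, eps = torch.zeros(8, 6), torch.zeros(8, 6)
d_eps = 1e-3 * torch.randn(8, 6)
sigma_new, d_eps_p = model(sigma, eps, d_eps)
```

The design choice mirrored here is that the decomposition is built into the forward pass rather than penalized in the loss, which is the kind of inductive bias the abstract credits for better data efficiency and extrapolation.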
Related papers
- Accounting for plasticity: An extension of inelastic Constitutive Artificial Neural Networks [0.0]
We present the extension and application of an iCANN to the inelastic phenomena of plasticity.
We learn four feed-forward networks in combination with a recurrent neural network and use the second Piola-Kirchhoff stress measure for training.
We observe satisfactory results when training on a single load case, and highly precise agreement when the number of load cases is increased.
arXiv Detail & Related papers (2024-07-27T19:19:42Z)
- ElastoGen: 4D Generative Elastodynamics [59.20029207991106]
ElastoGen is a knowledge-driven AI model that generates physically accurate 4D elastodynamics.
Because of its alignment with actual physical procedures, ElastoGen efficiently generates accurate dynamics for a wide range of hyperelastic materials.
arXiv Detail & Related papers (2024-05-23T21:09:36Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
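The entry above appends non-trainable layers, derived from KKT conditions, so that outputs satisfy linear equalities exactly. The sketch below illustrates the underlying idea with the standard orthogonal projection onto the affine set {y : A y = b}; it is an illustration constructed here under that assumption, not the paper's implementation.

```python
# Hard linear equality constraints via an exact, non-trainable projection layer
# (illustrative sketch only, not the KKT-hPINN code).
import torch


def project_onto_affine(y_hat, A, b):
    """Solve min ||y - y_hat||^2 subject to A y = b (closed-form KKT solution)."""
    # y = y_hat - A^T (A A^T)^{-1} (A y_hat - b)
    residual = y_hat @ A.T - b                                  # (batch, m)
    correction = torch.linalg.solve(A @ A.T, residual.T).T @ A  # (batch, n)
    return y_hat - correction


# usage: constrain 3 outputs so that their sum equals 1 (one linear equality)
A = torch.tensor([[1.0, 1.0, 1.0]])
b = torch.tensor([1.0])
y_hat = torch.randn(4, 3)            # raw network predictions
y = project_onto_affine(y_hat, A, b)
print(y.sum(dim=-1))                 # each row sums to 1 up to round-off
```

Because the projection is exact and differentiable, it can sit after the final layer and be trained through, which is the sense in which the constraint is "hard" rather than penalized in the loss.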
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- A new family of Constitutive Artificial Neural Networks towards automated model discovery [0.0]
Neural Networks are powerful approximators that can learn function relations from large data without any knowledge of the underlying physics.
We show that Constitutive Artificial Neural Networks have the potential to shift the paradigm from user-defined model selection to automated model discovery.
arXiv Detail & Related papers (2022-09-15T18:33:37Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressivity afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
- Hyperbolic Neural Networks++ [66.16106727715061]
We generalize the fundamental components of neural networks in a single hyperbolic geometry model, namely, the Poincaré ball model.
Experiments show the superior parameter efficiency of our methods compared to conventional hyperbolic components, as well as greater stability and better performance than their Euclidean counterparts.
arXiv Detail & Related papers (2020-06-15T08:23:20Z)
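For the Poincaré ball model mentioned in the entry above, the basic replacement for vector addition is Möbius addition; the sketch below shows the standard formula for curvature -1 and is an illustration written here, not the paper's code.

```python
# Moebius addition on the Poincare ball (curvature -1), the operation that replaces
# ordinary vector addition in hyperbolic layers. Illustrative sketch only.
import torch


def mobius_add(x, y, eps=1e-7):
    """x (+) y = ((1 + 2<x,y> + |y|^2) x + (1 - |x|^2) y) / (1 + 2<x,y> + |x|^2 |y|^2)."""
    xy = (x * y).sum(dim=-1, keepdim=True)
    x2 = (x * x).sum(dim=-1, keepdim=True)
    y2 = (y * y).sum(dim=-1, keepdim=True)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den.clamp_min(eps)


# usage: two batches of points inside the unit ball
x = 0.3 * torch.randn(5, 4).tanh()
y = 0.3 * torch.randn(5, 4).tanh()
z = mobius_add(x, y)
print(z.norm(dim=-1))   # results remain inside the unit ball
```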
- Thermodynamics-based Artificial Neural Networks for constitutive modeling [0.0]
We propose a new class of data-driven, physics-based neural networks for modeling strain rate-independent processes at the material point level.
The two basic principles of thermodynamics are encoded in the network's architecture by taking advantage of automatic differentiation.
We demonstrate the wide applicability of TANNs for modeling elasto-plastic materials with both strain hardening and strain softening.
arXiv Detail & Related papers (2020-05-25T15:56:34Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- A deep learning framework for solution and discovery in solid mechanics [1.4699455652461721]
We present the application of a class of deep learning methods, known as Physics-Informed Neural Networks (PINN), to learning and discovery in solid mechanics.
We explain how to incorporate the momentum balance and elasticity relations into PINN, and explore in detail the application to linear elasticity.
arXiv Detail & Related papers (2020-02-14T08:24:53Z)
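For the last entry, the relations that such a PINN penalizes as residuals in the linear-elastic case are standard; a compact statement in index notation (small strains) is given below for reference, as a sketch rather than the paper's exact loss formulation.

```latex
% Momentum balance, isotropic constitutive law, and strain-displacement relation
% whose residuals a PINN for linear elasticity can penalize in its loss.
\begin{aligned}
  \sigma_{ij,j} + f_i &= 0, \\
  \sigma_{ij} &= \lambda\,\varepsilon_{kk}\,\delta_{ij} + 2\mu\,\varepsilon_{ij}, \\
  \varepsilon_{ij} &= \tfrac{1}{2}\left(u_{i,j} + u_{j,i}\right).
\end{aligned}
```

Here u is the displacement field, λ and μ are the Lamé parameters, and f is the body force; a PINN of this kind typically parameterizes the displacement (and possibly the stress) with neural networks and sums the squared residuals of these equations together with boundary terms.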
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information above and is not responsible for any consequences of its use.