Geometric deep learning for computational mechanics Part I: Anisotropic
Hyperelasticity
- URL: http://arxiv.org/abs/2001.04292v1
- Date: Wed, 8 Jan 2020 02:07:39 GMT
- Title: Geometric deep learning for computational mechanics Part I: Anisotropic
Hyperelasticity
- Authors: Nikolaos Vlassis, Ran Ma, WaiChing Sun
- Abstract summary: This paper is the first attempt to use geometric deep learning and Sobolev training to incorporate non-Euclidean microstructural data such that anisotropic hyperelastic material machine learning models can be trained in the finite deformation range.
- Score: 1.8606313462183062
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper is the first attempt to use geometric deep learning and Sobolev
training to incorporate non-Euclidean microstructural data such that
anisotropic hyperelastic material machine learning models can be trained in the
finite deformation range. While traditional hyperelasticity models often
incorporate homogenized measures of microstructural attributes, such as
porosity or the averaged orientation of constituents, these measures cannot reflect the
topological structures of the attributes. We fill this knowledge gap by
introducing the concept of the weighted graph as a new means to store topological
information, such as the connectivity of anisotropic grains in an assembly. Then,
by leveraging a graph convolutional deep neural network architecture in the
spectral domain, we introduce a mechanism to incorporate these non-Euclidean
weighted graph data directly as input for training and for predicting the
elastic responses of materials with complex microstructures. To ensure
smoothness and prevent non-convexity of the trained stored energy functional,
we introduce a Sobolev training technique for neural networks such that the stress
measure is obtained implicitly by taking directional derivatives of the
trained energy functional. By optimizing the neural network to approximate both
the energy functional output and the stress measure, we introduce a training
procedure that improves efficiency and generalizes the learned energy functional
for different microstructures. The trained hybrid neural network model is then
used to generate new stored energy functionals for unseen microstructures in a
parametric study to predict the influence of elastic anisotropy on the
nucleation and propagation of fracture in the brittle regime.
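To make the two mechanisms above concrete, the sketch below shows, in the spirit of the abstract, (1) a first-order spectral graph convolution that pools a weighted microstructure graph into a fixed-size descriptor and (2) a Sobolev-style loss in which the stress is obtained by differentiating the predicted energy with respect to the strain input via automatic differentiation. This is a minimal, hypothetical PyTorch illustration, not the authors' implementation: the layer widths, the Voigt strain vector, the Softplus activations, the single-hop spectral filter, and the loss weight `beta` are assumptions made for brevity.

```python
import torch
import torch.nn as nn


def normalized_laplacian(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalized graph Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(dim=-1)
    d_inv_sqrt = torch.diag(deg.clamp(min=1e-12).rsqrt())
    return torch.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt


class SpectralGraphEncoder(nn.Module):
    """Two first-order spectral filters, mean-pooled to a graph-level descriptor."""

    def __init__(self, in_feats: int, hidden: int, out_feats: int):
        super().__init__()
        self.w1 = nn.Linear(in_feats, hidden, bias=False)
        self.w2 = nn.Linear(hidden, out_feats, bias=False)

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Propagation operator I - L = D^{-1/2} A D^{-1/2}.
        prop = torch.eye(adj.shape[0]) - normalized_laplacian(adj)
        h = torch.relu(prop @ self.w1(node_feats))
        h = torch.relu(prop @ self.w2(h))
        return h.mean(dim=0)


class EnergyNet(nn.Module):
    """Predicts the stored energy psi from a strain vector and a graph descriptor."""

    def __init__(self, graph_dim: int, strain_dim: int = 6):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(graph_dim + strain_dim, 64), nn.Softplus(),
            nn.Linear(64, 64), nn.Softplus(),
            nn.Linear(64, 1),
        )

    def forward(self, strain: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        g = g.expand(strain.shape[0], -1)  # broadcast descriptor over the batch
        return self.mlp(torch.cat([strain, g], dim=-1)).squeeze(-1)


def sobolev_loss(energy_net, encoder, strain, adj, node_feats,
                 psi_true, stress_true, beta=1.0):
    """H1-type loss: match both the energy and its strain derivative (the stress)."""
    g = encoder(node_feats, adj)
    strain = strain.clone().detach().requires_grad_(True)
    psi = energy_net(strain, g)
    # Stress recovered implicitly as the derivative of the predicted energy.
    stress = torch.autograd.grad(psi.sum(), strain, create_graph=True)[0]
    return ((psi - psi_true) ** 2).mean() + beta * ((stress - stress_true) ** 2).mean()
```

The Softplus activations are used here only so the predicted energy is smooth enough to differentiate twice; they do not by themselves guarantee convexity of the stored energy, which would require additional structure (for example, an input-convex network) beyond this sketch.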
Related papers
- Symmetry-enforcing neural networks with applications to constitutive modeling [0.0]
We show how to combine state-of-the-art micromechanical modeling and advanced machine learning techniques to homogenize complex microstructures exhibiting non-linear and history dependent behaviors.
The resulting homogenized model, termed smart constitutive law (SCL), enables the adoption of microstructurally informed constitutive laws into finite element solvers at a fraction of the computational cost required by traditional concurrent multiscale approaches.
In this work, the capabilities of SCLs are expanded via the introduction of a novel methodology that enforces material symmetries at the neuron level.
arXiv Detail & Related papers (2023-12-21T01:12:44Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- NN-EUCLID: deep-learning hyperelasticity without stress data [0.0]
We propose a new approach for unsupervised learning of hyperelastic laws with physics-consistent deep neural networks.
In contrast to supervised learning, which assumes the availability of stress-strain data, the approach only uses realistically measurable full-field displacement and global reaction-force data.
arXiv Detail & Related papers (2022-05-04T13:54:54Z)
- MD-inferred neural network monoclinic finite-strain hyperelasticity models for $\beta$-HMX: Sobolev training and validation against physical constraints [2.3816618027381438]
We train and validate neural networks to predict the anisotropic elastic response of the monoclinic organic molecular crystal $\beta$-HMX.
We compare the neural networks' training efficiency under different Sobolev constraints and assess the models' accuracy and robustness against MD benchmarks for $\beta$-HMX.
arXiv Detail & Related papers (2021-11-29T23:38:31Z)
- A deep learning driven pseudospectral PCE based FFT homogenization algorithm for complex microstructures [68.8204255655161]
It is shown that the proposed method is able to predict the central moments of interest while being orders of magnitude faster to evaluate than traditional approaches.
arXiv Detail & Related papers (2021-10-26T07:02:14Z)
- Physics informed neural networks for continuum micromechanics [68.8204255655161]
Recently, physics informed neural networks have successfully been applied to a broad variety of problems in applied mathematics and engineering.
Due to the global approximation, physics informed neural networks have difficulties in displaying localized effects and strong non-linear solutions by optimization.
It is shown that the domain decomposition approach is able to accurately resolve nonlinear stress, displacement, and energy fields in heterogeneous microstructures obtained from real-world $\mu$CT scans.
arXiv Detail & Related papers (2021-10-14T14:05:19Z)
- Thermodynamics-based Artificial Neural Networks (TANN) for multiscale modeling of materials with inelastic microstructure [0.0]
Multiscale homogenization approaches are often used for performing reliable, accurate predictions of the macroscopic mechanical behavior of inelastic materials.
Data-driven approaches based on deep learning have risen as a promising alternative to replace ad-hoc laws and speed-up numerical methods.
Here, we propose Thermodynamics-based Artificial Neural Networks (TANN) for the modeling of mechanical materials with inelastic and complex microstructure.
arXiv Detail & Related papers (2021-08-30T11:50:38Z)
- Gradient Starvation: A Learning Proclivity in Neural Networks [97.02382916372594]
Gradient Starvation arises when cross-entropy loss is minimized by capturing only a subset of features relevant for the task.
This work provides a theoretical explanation for the emergence of such feature imbalance in neural networks.
arXiv Detail & Related papers (2020-11-18T18:52:08Z)
- Hyperbolic Neural Networks++ [66.16106727715061]
We generalize the fundamental components of neural networks in a single hyperbolic geometry model, namely, the Poincaré ball model.
Experiments show the superior parameter efficiency of our methods compared to conventional hyperbolic components, and stability and outperformance over their Euclidean counterparts.
arXiv Detail & Related papers (2020-06-15T08:23:20Z)
- Understanding Generalization in Deep Learning via Tensor Methods [53.808840694241]
We advance the understanding of the relations between the network's architecture and its generalizability from the compression perspective.
We propose a series of intuitive, data-dependent and easily-measurable properties that tightly characterize the compressibility and generalizability of neural networks.
arXiv Detail & Related papers (2020-01-14T22:26:57Z)