Deep Convolutional Neural Networks Predict Elasticity Tensors and their
Bounds in Homogenization
- URL: http://arxiv.org/abs/2109.03020v1
- Date: Sat, 4 Sep 2021 15:46:12 GMT
- Title: Deep Convolutional Neural Networks Predict Elasticity Tensors and their
Bounds in Homogenization
- Authors: Bernhard Eidel
- Abstract summary: 3D convolutional neural networks (CNNs) are trained to link random heterogeneous, two-phase materials to their elastic macroscale stiffness.
The CNNs demonstrate predictive accuracy not only on the standard test set but also on samples of a real, two-phase microstructure of a diamond-based coating.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In the present work, 3D convolutional neural networks (CNNs) are trained to
link random heterogeneous, two-phase materials of arbitrary phase fractions to
their elastic macroscale stiffness, thus replacing explicit homogenization
simulations. To reduce the uncertainty about the true stiffness of the
synthetic composites that stems from unknown boundary conditions (BCs), the CNNs predict,
beyond the stiffness for periodic BC, the upper bound obtained with kinematically
uniform BC and the lower bound obtained with stress-uniform BC. This work describes
the workflow of the homogenization CNN, from microstructure generation through the
CNN design, the operations of convolution, nonlinear activation and pooling as
well as training and validation with backpropagation, up to performance
measurements in tests. The CNNs demonstrate predictive accuracy not
only on the standard test set but also on samples of a real, two-phase
microstructure of a diamond-based coating. A single CNN that covers all three
boundary types is virtually as accurate as treating them separately in three
different nets. Through the stiffness bounds, the CNNs of this contribution provide
an indicator of the proper RVE size for individual snapshot samples. Moreover,
they enable statistical analyses of the effective elastic stiffness on
ensembles of synthetic microstructures without costly simulations.
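The listing contains no code; as a purely illustrative sketch (not the authors' implementation), the following PyTorch snippet shows how a 3D CNN of the kind described in the abstract might map a voxelized two-phase microstructure to the 21 independent Voigt components of the macroscale stiffness matrix, with one regression head per boundary-condition type (kinematically uniform BC for the upper bound, periodic BC, stress-uniform BC for the lower bound). All layer sizes, names, and the 64^3 input resolution are assumptions, not taken from the paper.

```python
# Hypothetical sketch: voxelized two-phase microstructure -> macroscale stiffness.
# Layer sizes, the 64^3 resolution, and all names are assumptions.
import torch
import torch.nn as nn


class HomogenizationCNN3D(nn.Module):
    """Maps a binary voxel grid (1 channel, e.g. 64x64x64) to the 21 independent
    Voigt components of the 6x6 stiffness matrix, with one head per
    boundary-condition type: KUBC (upper bound), PBC, SUBC (lower bound)."""

    def __init__(self, channels=(8, 16, 32, 64)):
        super().__init__()
        layers, in_ch = [], 1
        for out_ch in channels:
            layers += [
                nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.MaxPool3d(kernel_size=2),  # halves each spatial dimension
            ]
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.pool = nn.AdaptiveAvgPool3d(1)   # -> (batch, in_ch, 1, 1, 1)
        self.heads = nn.ModuleDict(
            {bc: nn.Linear(in_ch, 21) for bc in ("kubc", "pbc", "subc")}
        )

    def forward(self, voxels):
        x = self.pool(self.features(voxels)).flatten(1)
        return {bc: head(x) for bc, head in self.heads.items()}


def voigt_matrix(c21):
    """Assemble a symmetric 6x6 stiffness matrix from its 21 upper-triangle
    components (row-major order)."""
    C = torch.zeros(6, 6)
    iu = torch.triu_indices(6, 6)
    C[iu[0], iu[1]] = c21
    return C + C.triu(1).T


if __name__ == "__main__":
    net = HomogenizationCNN3D()
    # Two random two-phase microstructures, 64^3 voxels, phase indicator in {0, 1}.
    voxels = (torch.rand(2, 1, 64, 64, 64) > 0.5).float()
    pred = net(voxels)
    C_upper = voigt_matrix(pred["kubc"][0])  # kinematically uniform BC
    C_lower = voigt_matrix(pred["subc"][0])  # stress-uniform BC
    # A large gap between the bounds suggests the snapshot is smaller than
    # a proper representative volume element (RVE).
    gap = (C_upper - C_lower).norm() / C_lower.norm().clamp_min(1e-8)
    print("relative bound gap:", float(gap))
```

The final lines only illustrate the bound-based RVE check mentioned in the abstract: in the paper the stiffness targets come from homogenization simulations under the three BC types, whereas here random voxel grids and an untrained network are used solely to show the data flow.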
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Towards Continual Learning Desiderata via HSIC-Bottleneck Orthogonalization and Equiangular Embedding [55.107555305760954]
We propose a conceptually simple yet effective method that attributes forgetting to layer-wise parameter overwriting and the resulting decision boundary distortion.
Our method achieves competitive accuracy, with the advantage of a zero exemplar buffer and only 1.02x the base model size.
arXiv Detail & Related papers (2024-01-17T09:01:29Z)
- A Neural Network Transformer Model for Composite Microstructure Homogenization [1.2277343096128712]
Homogenization methods, such as the Mori-Tanaka method, offer rapid homogenization for a wide range of constituent properties.
This paper illustrates a transformer neural network architecture that captures the knowledge of various microstructures.
The network predicts the history-dependent, non-linear, and homogenized stress-strain response.
arXiv Detail & Related papers (2023-04-16T19:57:52Z)
- Three-dimensional microstructure generation using generative adversarial neural networks in the context of continuum micromechanics [77.34726150561087]
This work proposes a generative adversarial network tailored towards three-dimensional microstructure generation.
The lightweight algorithm is able to learn the underlying properties of the material from a single microCT-scan without the need for explicit descriptors.
arXiv Detail & Related papers (2022-05-31T13:26:51Z)
- MD-inferred neural network monoclinic finite-strain hyperelasticity models for $\beta$-HMX: Sobolev training and validation against physical constraints [2.3816618027381438]
We train and validate neural networks to predict the anisotropic elastic response of the monoclinic organic molecular crystal $\beta$-HMX.
We compare the neural networks' training efficiency under different Sobolev constraints and assess the models' accuracy and robustness against MD benchmarks for $\beta$-HMX (a minimal Sobolev-loss sketch is given after this list).
arXiv Detail & Related papers (2021-11-29T23:38:31Z)
- A deep learning driven pseudospectral PCE based FFT homogenization algorithm for complex microstructures [68.8204255655161]
It is shown that the proposed method is able to predict the central moments of interest while being orders of magnitude faster to evaluate than traditional approaches.
arXiv Detail & Related papers (2021-10-26T07:02:14Z)
- Mesh convolutional neural networks for wall shear stress estimation in 3D artery models [7.7393800633675465]
We propose to use mesh convolutional neural networks that directly operate on the same finite-element surface mesh as used in CFD.
We show that our flexible deep learning model can accurately predict 3D wall shear stress vectors on this surface mesh.
arXiv Detail & Related papers (2021-09-10T11:32:05Z)
- Estimating permeability of 3D micro-CT images by physics-informed CNNs based on DNS [1.6274397329511197]
This paper presents a novel methodology for permeability prediction from micro-CT scans of geological rock samples.
The training data set for CNNs dedicated to permeability prediction consists of permeability labels that are typically generated by classical lattice Boltzmann methods (LBM)
We instead perform direct numerical simulation (DNS) by solving the stationary Stokes equation in an efficient and distributed-parallel manner.
arXiv Detail & Related papers (2021-09-04T08:43:19Z)
- Robust Implicit Networks via Non-Euclidean Contractions [63.91638306025768]
Implicit neural networks show improved accuracy and significant reduction in memory consumption.
They can suffer from ill-posedness and convergence instability.
This paper provides a new framework to design well-posed and robust implicit neural networks.
arXiv Detail & Related papers (2021-06-06T18:05:02Z)
- ACDC: Weight Sharing in Atom-Coefficient Decomposed Convolution [57.635467829558664]
We introduce a structural regularization across convolutional kernels in a CNN.
We show that CNNs maintain performance with a dramatic reduction in parameters and computations.
arXiv Detail & Related papers (2020-09-04T20:41:47Z)
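The MD-inferred hyperelasticity entry above mentions Sobolev training. As a hypothetical, minimal sketch (not taken from that paper), a Sobolev loss fits the predicted strain-energy density and, via automatic differentiation, its strain derivative (the stress) to reference data at the same time; the tiny MLP, the Voigt strain input, and all names are assumptions.

```python
# Illustrative Sobolev-training loss, not from the paper above: the network
# predicts a scalar strain-energy density W(E); the loss matches both W and
# its derivative dW/dE (the stress) against reference data, e.g. from MD.
import torch
import torch.nn as nn

# Tiny MLP mapping a Voigt strain vector to a scalar energy (assumed architecture).
energy_net = nn.Sequential(nn.Linear(6, 64), nn.Tanh(), nn.Linear(64, 1))


def sobolev_loss(strain, energy_ref, stress_ref, w_energy=1.0, w_stress=1.0):
    """Match the predicted energy and its strain derivative (the stress,
    obtained by automatic differentiation) to reference values."""
    strain = strain.clone().requires_grad_(True)          # (batch, 6)
    energy = energy_net(strain).squeeze(-1)               # (batch,)
    stress = torch.autograd.grad(energy.sum(), strain, create_graph=True)[0]
    return (w_energy * nn.functional.mse_loss(energy, energy_ref)
            + w_stress * nn.functional.mse_loss(stress, stress_ref))


# Dummy tensors only to show the call signature; real targets would come
# from MD simulations or homogenization runs.
strain = torch.randn(32, 6)
loss = sobolev_loss(strain, torch.randn(32), torch.randn(32, 6))
loss.backward()
```

Varying which derivative terms are included, and their relative weights, is one way different Sobolev constraints can be imposed during training.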