MD-inferred neural network monoclinic finite-strain hyperelasticity
models for $\beta$-HMX: Sobolev training and validation against physical
constraints
- URL: http://arxiv.org/abs/2112.02077v1
- Date: Mon, 29 Nov 2021 23:38:31 GMT
- Title: MD-inferred neural network monoclinic finite-strain hyperelasticity
models for $\beta$-HMX: Sobolev training and validation against physical
constraints
- Authors: Nikolaos N. Vlassis, Puhan Zhao, Ran Ma, Tommy Sewell, WaiChing Sun
- Abstract summary: We train and validate neural networks to predict the anisotropic elastic response of the monoclinic organic molecular crystal $\beta$-HMX.
We compare the neural networks' training efficiency under different Sobolev constraints and assess the models' accuracy and robustness against MD benchmarks for $\beta$-HMX.
- Score: 2.3816618027381438
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a machine learning framework to train and validate neural networks
to predict the anisotropic elastic response of the monoclinic organic molecular
crystal $\beta$-HMX in the geometrically nonlinear regime. A filtered molecular
dynamics (MD) simulation database is used to train the neural networks with a
Sobolev norm that uses the stress measure and a reference configuration to
deduce the elastic stored energy functional. To improve the accuracy of the
elasticity tangent predictions originating from the learned stored energy, a
transfer learning technique is used to introduce additional tangential
constraints from the data while necessary conditions (e.g. strong ellipticity,
crystallographic symmetry) for the correctness of the model are either
introduced as additional physical constraints or incorporated in the validation
tests. Assessment of the neural networks is based on (1) the accuracy with
which they reproduce the bottom-line constitutive responses predicted by MD,
(2) detailed examination of their stability and uniqueness, and (3)
admissibility of the predicted responses with respect to continuum mechanics
theory in the finite-deformation regime. We compare the neural networks'
training efficiency under different Sobolev constraints and assess the models'
accuracy and robustness against MD benchmarks for $\beta$-HMX.
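The core ingredient described above, deducing the stress (and, in a later stage, the elasticity tangent) by differentiating a learned stored-energy functional and penalizing its mismatch against the MD database, can be sketched with automatic differentiation. The snippet below is a minimal illustration only, assuming an energy network $\psi(E)$ of the Green-Lagrange strain and MD triplets $(E, \psi, S)$; the names (`psi_net`, `init_params`), layer widths, and loss weights are placeholders, not the authors' implementation.

```python
# Hedged sketch of Sobolev-style training of a neural stored-energy model.
# Assumes the MD database supplies (E, psi, S) triplets, with E the 3x3
# Green-Lagrange strain and S the second Piola-Kirchhoff stress.
import jax
import jax.numpy as jnp


def init_params(key, sizes=(9, 64, 64, 1)):
    """Random dense-layer parameters; the layer widths are illustrative."""
    keys = jax.random.split(key, len(sizes) - 1)
    return [(0.1 * jax.random.normal(k, (n_out, n_in)), jnp.zeros(n_out))
            for k, n_in, n_out in zip(keys, sizes[:-1], sizes[1:])]


def psi_net(params, E):
    """Feed-forward stored-energy density psi(E); E is a 3x3 strain tensor."""
    x = E.reshape(-1)                      # flatten the strain to a feature vector
    for W, b in params[:-1]:
        x = jnp.tanh(W @ x + b)
    W, b = params[-1]
    return (W @ x + b)[0]                  # scalar energy density


# Stress and tangent follow from automatic differentiation of the energy:
# S = d(psi)/dE and C_tan = d^2(psi)/dE dE.
stress_fn = jax.grad(psi_net, argnums=1)
tangent_fn = jax.hessian(psi_net, argnums=1)


def sobolev_loss(params, E_batch, psi_md, S_md, w_energy=1.0, w_stress=1.0):
    """H^1-type Sobolev loss: match MD energies and their strain derivatives."""
    psi_pred = jax.vmap(lambda E: psi_net(params, E))(E_batch)
    S_pred = jax.vmap(lambda E: stress_fn(params, E))(E_batch)
    return (w_energy * jnp.mean((psi_pred - psi_md) ** 2)
            + w_stress * jnp.mean((S_pred - S_md) ** 2))
```

In a second, transfer-learning stage along the lines sketched in the abstract, a tangent term built from `tangent_fn` could be appended to the same loss while reusing the weights trained under the $H^1$ loss as the starting point; how exactly the paper weights and schedules these terms is not specified here.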
Related papers
- EPi-cKANs: Elasto-Plasticity Informed Kolmogorov-Arnold Networks Using Chebyshev Polynomials [0.0]
We present an elasto-plasticity informed Chebyshev-based network (EPi-cKAN).
EPi-cKAN provides superior accuracy in predicting stress components and in predicting sand elasto-plastic behavior under blind triaxial axisymmetric strain-controlled loading paths.
arXiv Detail & Related papers (2024-10-12T16:01:38Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens the way towards practical utilization of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance the prediction accuracy.
arXiv Detail & Related papers (2024-02-11T17:40:26Z) - Kalman Filter for Online Classification of Non-Stationary Data [101.26838049872651]
In Online Continual Learning (OCL) a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In experiments in multi-class classification we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
arXiv Detail & Related papers (2023-06-14T11:41:42Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Neural network enhanced measurement efficiency for molecular
groundstates [63.36515347329037]
We adapt common neural network models to learn complex groundstate wavefunctions for several molecular qubit Hamiltonians.
We find that using a neural network model provides a robust improvement over using single-copy measurement outcomes alone to reconstruct observables.
arXiv Detail & Related papers (2022-06-30T17:45:05Z) - NN-EUCLID: deep-learning hyperelasticity without stress data [0.0]
We propose a new approach for unsupervised learning of hyperelastic laws with physics-consistent deep neural networks.
In contrast to supervised learning, which assumes the availability of stress-strain data, the approach only uses realistically measurable full-field displacement and global force data.
arXiv Detail & Related papers (2022-05-04T13:54:54Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - On feedforward control using physics-guided neural networks: Training
cost regularization and optimized initialization [0.0]
Performance of model-based feedforward controllers is typically limited by the accuracy of the inverse system dynamics model.
This paper proposes a regularization method via identified physical parameters.
It is validated on a real-life industrial linear motor, where it delivers better tracking accuracy and extrapolation.
arXiv Detail & Related papers (2022-01-28T12:51:25Z) - Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach is demonstrated by developing models for classical mechanics and to predict the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z) - Geometric deep learning for computational mechanics Part I: Anisotropic
Hyperelasticity [1.8606313462183062]
This paper is the first attempt to use geometric deep learning and Sobolev training to incorporate non-Euclidean microstructural data such that anisotropic hyperelastic material machine learning models can be trained in the finite deformation range.
arXiv Detail & Related papers (2020-01-08T02:07:39Z)