Automated Model Discovery for Tensional Homeostasis: Constitutive Machine Learning in Growth and Remodeling
- URL: http://arxiv.org/abs/2410.13645v1
- Date: Thu, 17 Oct 2024 15:12:55 GMT
- Title: Automated Model Discovery for Tensional Homeostasis: Constitutive Machine Learning in Growth and Remodeling
- Authors: Hagen Holthusen, Tim Brepols, Kevin Linka, Ellen Kuhl
- Abstract summary: We extend our inelastic Constitutive Artificial Neural Networks (iCANNs) by incorporating kinematic growth and homeostatic surfaces.
We evaluate the ability of the proposed network to learn from experimentally obtained tissue equivalent data at the material point level.
- Abstract: Soft biological tissues exhibit a tendency to maintain a preferred state of tensile stress, known as tensional homeostasis, which is restored even after external mechanical stimuli. This macroscopic behavior can be described using the theory of kinematic growth, where the deformation gradient is multiplicatively decomposed into an elastic part and a part related to growth and remodeling. Recently, the concept of homeostatic surfaces was introduced to define the state of homeostasis and the evolution equations for inelastic deformations. However, identifying the optimal model and material parameters to accurately capture the macroscopic behavior of inelastic materials can only be accomplished with significant expertise, is often time-consuming, and prone to error, regardless of the specific inelastic phenomenon. To address this challenge, built-in physics machine learning algorithms offer significant potential. In this work, we extend our inelastic Constitutive Artificial Neural Networks (iCANNs) by incorporating kinematic growth and homeostatic surfaces to discover the scalar model equations, namely the Helmholtz free energy and the pseudo potential. The latter describes the state of homeostasis in a smeared sense. We evaluate the ability of the proposed network to learn from experimentally obtained tissue equivalent data at the material point level, assess its predictive accuracy beyond the training regime, and discuss its current limitations when applied at the structural level. Our source code, data, examples, and an implementation of the corresponding material subroutine are made accessible to the public at https://doi.org/10.5281/zenodo.13946282.
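The multiplicative decomposition described in the abstract can be sketched numerically. The isotropic growth tensor and the specific numbers below are illustrative placeholders, not the paper's learned model:

```python
import numpy as np

# Kinematic growth: the total deformation gradient is decomposed
# multiplicatively, F = F_e @ F_g, where F_g captures growth/remodeling
# and the elastic part F_e carries the stress.
def elastic_part(F, theta_g):
    """Recover F_e from the total F, assuming an isotropic growth stretch theta_g."""
    F_g = theta_g * np.eye(3)       # isotropic growth tensor (toy choice)
    return F @ np.linalg.inv(F_g)   # F_e = F @ F_g^{-1}

F = np.diag([1.2, 1.0, 1.0])        # simple uniaxial stretch
F_e = elastic_part(F, theta_g=1.1)

# The decomposition reconstructs the total deformation gradient
assert np.allclose(F_e @ (1.1 * np.eye(3)), F)
```

In the paper's setting, the growth part evolves toward the homeostatic surface rather than being prescribed as a constant stretch as it is here.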
Related papers
- Accounting for plasticity: An extension of inelastic Constitutive Artificial Neural Networks [0.0]
We present the extension and application of an iCANN to the inelastic phenomenon of plasticity.
We learn four feed-forward networks in combination with a recurrent neural network and use the second Piola-Kirchhoff stress measure for training.
We observe satisfactory results when training on a single load case, while extremely precise agreement is found as the number of load cases increases.
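The second Piola-Kirchhoff stress mentioned above can be illustrated with a hand-written constitutive law. The compressible neo-Hookean model and parameters below are a stand-in for the networks the paper trains, not its actual learned model:

```python
import numpy as np

# Second Piola-Kirchhoff stress for a compressible neo-Hookean material:
# S = mu * (I - C^{-1}) + lam * ln(J) * C^{-1}, with C = F^T F and J = det(F).
# The parameters mu and lam are illustrative.
def second_pk_stress(F, mu=1.0, lam=1.0):
    C = F.T @ F                      # right Cauchy-Green tensor
    C_inv = np.linalg.inv(C)
    J = np.linalg.det(F)
    return mu * (np.eye(3) - C_inv) + lam * np.log(J) * C_inv

# The undeformed configuration is stress-free
S = second_pk_stress(np.eye(3))
assert np.allclose(S, np.zeros((3, 3)))
```

Training on S rather than the Cauchy stress is convenient because S is defined in the reference configuration, matching the strain measures the networks take as input.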
arXiv Detail & Related papers (2024-07-27T19:19:42Z)
- On the Dynamics Under the Unhinged Loss and Beyond [104.49565602940699]
We introduce the unhinged loss, a concise loss function that offers more mathematical opportunities to analyze closed-form dynamics.
The unhinged loss allows for considering more practical techniques, such as time-varying learning rates and feature normalization.
arXiv Detail & Related papers (2023-12-13T02:11:07Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Data-driven anisotropic finite viscoelasticity using neural ordinary differential equations [0.0]
We develop a fully data-driven model of anisotropic finite viscoelasticity using neural ordinary differential equations as building blocks.
We replace the Helmholtz free energy function and the dissipation potential with data-driven functions that satisfy physics-based constraints.
We train the model using stress-strain data from biological and synthetic materials, including human brain tissue, blood clots, natural rubber, and human myocardium.
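A minimal, hand-written analogue of the internal-variable evolution that such data-driven models replace can make the idea concrete. The linear relaxation kinetics and time constant here are illustrative, not the paper's learned dissipation potential:

```python
# Toy viscoelastic internal variable: standard-linear-solid relaxation,
# d(gamma)/dt = (eps - gamma) / tau, integrated with forward Euler.
# Data-driven models replace this hand-picked evolution law with a
# learned function subject to thermodynamic constraints.
def relax(eps, tau=1.0, dt=0.01, steps=500):
    gamma = 0.0
    for _ in range(steps):
        gamma += dt * (eps - gamma) / tau
    return gamma

# Under a held strain, the internal variable relaxes toward the applied strain
g = relax(eps=0.1)
assert abs(g - 0.1) < 1e-2
```

Replacing the right-hand side of such an ODE with a neural network is exactly the neural-ODE building block the summary refers to.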
arXiv Detail & Related papers (2023-01-11T17:03:46Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Automatically Polyconvex Strain Energy Functions using Neural Ordinary Differential Equations [0.0]
Deep neural networks are able to learn complex material behavior without the constraints of closed-form approximations.
The N-ODE material model is able to capture synthetic data generated from closed-form material models.
The framework can be used to model a large class of materials.
arXiv Detail & Related papers (2021-10-03T13:11:43Z)
- Thermodynamics-based Artificial Neural Networks (TANN) for multiscale modeling of materials with inelastic microstructure [0.0]
Multiscale homogenization approaches are often used to perform reliable, accurate predictions of the macroscopic mechanical behavior of inelastic materials.
Data-driven approaches based on deep learning have risen as a promising alternative to replace ad-hoc laws and speed-up numerical methods.
Here, we propose Thermodynamics-based Artificial Neural Networks (TANN) for the modeling of mechanical materials with inelastic and complex microstructure.
arXiv Detail & Related papers (2021-08-30T11:50:38Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- Gradient Starvation: A Learning Proclivity in Neural Networks [97.02382916372594]
Gradient Starvation arises when cross-entropy loss is minimized by capturing only a subset of features relevant for the task.
This work provides a theoretical explanation for the emergence of such feature imbalance in neural networks.
arXiv Detail & Related papers (2020-11-18T18:52:08Z)
- Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach is demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
- Geometric deep learning for computational mechanics Part I: Anisotropic Hyperelasticity [1.8606313462183062]
This paper is the first attempt to use geometric deep learning and Sobolev training to incorporate non-Euclidean microstructural data such that anisotropic hyperelastic material machine learning models can be trained in the finite deformation range.
arXiv Detail & Related papers (2020-01-08T02:07:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.