Accounting for plasticity: An extension of inelastic Constitutive Artificial Neural Networks
- URL: http://arxiv.org/abs/2407.19326v2
- Date: Sat, 07 Jun 2025 06:08:04 GMT
- Authors: Birte Boes, Jaan-Willem Simon, Hagen Holthusen
- Abstract summary: We extend the existing framework of inelastic Constitutive Artificial Neural Networks (iCANNs) by incorporating plasticity to increase their applicability to modeling more complex material behavior. Our framework captures both linear and nonlinear kinematic hardening behavior.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this work, we extend the existing framework of inelastic constitutive artificial neural networks (iCANNs) by incorporating plasticity to increase their applicability to modeling more complex material behavior. The proposed approach ensures objectivity, material symmetry, and thermodynamic consistency, providing a robust basis for the automatic discovery of constitutive equations at finite strains. The constitutive equations are discovered by representing the Helmholtz free energy and the plastic potentials for the yield function and the evolution equations as feed-forward networks. Our framework captures both linear and nonlinear kinematic hardening behavior. Evaluation of the model's predictions showed that the extended iCANNs successfully capture both linear and nonlinear kinematic hardening on experimental and artificially generated datasets, showcasing the promising capabilities of this framework. Nonetheless, challenges remain in discovering more complex yield criteria with tension-compression asymmetry and in addressing deviations from experimental data at larger strains. Despite these limitations, the proposed framework provides a promising basis for incorporating plasticity into iCANNs and a platform for advancing the field of automated model discovery.
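The ingredients named in the abstract can be illustrated with a minimal, hypothetical sketch (not the authors' implementation; function names, basis terms, and parameters below are assumptions): a parameterized function stands in for the feed-forward network representing the Helmholtz free energy in the elastic strain, stress follows by differentiation of that energy (the thermodynamic-consistency requirement), and a von Mises-type yield function with a back stress illustrates kinematic hardening.

```python
import numpy as np

def free_energy(eps_e, w):
    # Stand-in for a "network": a weighted sum of convex basis terms in
    # invariants of the elastic strain, so psi(0) = 0 and psi >= 0 for w >= 0.
    I1 = np.trace(eps_e)
    I2 = np.sum(eps_e * eps_e)
    return w[0] * I1**2 + w[1] * I2

def stress(eps_e, w, h=1e-6):
    # Thermodynamic consistency: stress is the derivative of the free
    # energy w.r.t. elastic strain, approximated here by central
    # finite differences (an autodiff framework would do this exactly).
    sig = np.zeros_like(eps_e)
    for i in range(3):
        for j in range(3):
            d = np.zeros((3, 3))
            d[i, j] = h
            sig[i, j] = (free_energy(eps_e + d, w)
                         - free_energy(eps_e - d, w)) / (2 * h)
    return sig

def yield_function(sig, back_stress, sigma_y):
    # Von Mises yield criterion; the back stress shifts the elastic
    # domain, which is the essence of kinematic hardening.
    xi = sig - back_stress
    dev = xi - np.trace(xi) / 3.0 * np.eye(3)
    return np.sqrt(1.5 * np.sum(dev * dev)) - sigma_y
```

In the discovery setting described by the paper, the hand-written invariant combination and yield surface would instead be feed-forward networks whose weights are fit to data, with the structure above guaranteeing objectivity and a non-negative dissipation.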
Related papers
- UniGenX: Unified Generation of Sequence and Structure with Autoregressive Diffusion [61.690978792873196]
Existing approaches rely on either autoregressive sequence models or diffusion models.
We propose UniGenX, a unified framework that combines autoregressive next-token prediction with conditional diffusion models.
We validate the effectiveness of UniGenX on material and small molecule generation tasks.
arXiv Detail & Related papers (2025-03-09T16:43:07Z) - A generalized dual potential for inelastic Constitutive Artificial Neural Networks: A JAX implementation at finite strains [0.0]
We present a methodology for designing a generalized dual potential, or pseudo potential, for inelastic Constitutive Artificial Neural Networks (iCANNs).
This potential inherently satisfies thermodynamic consistency for large deformations.
Our results indicate that the novel architecture robustly discovers interpretable models and parameters, while autonomously revealing the degree of inelasticity.
arXiv Detail & Related papers (2025-02-19T20:16:45Z) - Generalized Factor Neural Network Model for High-dimensional Regression [50.554377879576066]
We tackle the challenges of modeling high-dimensional data sets with latent low-dimensional structures hidden within complex, non-linear, and noisy relationships.
Our approach enables a seamless integration of concepts from non-parametric regression, factor models, and neural networks for high-dimensional regression.
arXiv Detail & Related papers (2025-02-16T23:13:55Z) - Training-Free Constrained Generation With Stable Diffusion Models [45.138721047543214]
Stable diffusion models represent the state-of-the-art in data synthesis across diverse domains.
This paper proposes a novel integration of stable diffusion models with constrained optimization frameworks.
The effectiveness of this approach is demonstrated through material design experiments requiring adherence to precise morphometric properties.
arXiv Detail & Related papers (2025-02-08T16:11:17Z) - Automated Model Discovery for Tensional Homeostasis: Constitutive Machine Learning in Growth and Remodeling [0.0]
We extend our inelastic Constitutive Artificial Neural Networks (iCANNs) by incorporating kinematic growth and homeostatic surfaces.
We evaluate the ability of the proposed network to learn from experimentally obtained tissue equivalent data at the material point level.
arXiv Detail & Related papers (2024-10-17T15:12:55Z) - Theory and implementation of inelastic Constitutive Artificial Neural Networks [0.0]
We extend the Constitutive Artificial Neural Networks (CANNs) to inelastic materials (iCANN).
We demonstrate that the iCANN is capable of autonomously discovering models for artificially generated data.
Our vision is that the iCANN will reveal to us new ways to find the various inelastic phenomena hidden in the data and to understand their interaction.
arXiv Detail & Related papers (2023-11-10T20:13:29Z) - Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - Extreme sparsification of physics-augmented neural networks for interpretable model discovery in mechanics [0.0]
We propose to train regularized physics-augmented neural network-based models utilizing a smoothed version of $L_0$-regularization.
We show that the method can reliably obtain interpretable and trustworthy models for compressible and incompressible hyperelasticity, yield functions, and hardening models for elastoplasticity.
arXiv Detail & Related papers (2023-10-05T16:28:58Z) - PLASTIC: Improving Input and Label Plasticity for Sample Efficient Reinforcement Learning [54.409634256153154]
In Reinforcement Learning (RL), enhancing sample efficiency is crucial.
In principle, off-policy RL algorithms can improve sample efficiency by allowing multiple updates per environment interaction.
Our study investigates the underlying causes of this phenomenon by dividing plasticity into two aspects.
arXiv Detail & Related papers (2023-06-19T06:14:51Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Viscoelastic Constitutive Artificial Neural Networks (vCANNs) $-$ a
framework for data-driven anisotropic nonlinear finite viscoelasticity [0.0]
We introduce viscoelastic Constitutive Artificial Neural Networks (vCANNs).
vCANNs are a novel physics-informed machine learning framework for anisotropic nonlinear viscoelasticity at finite strains.
We demonstrate that vCANNs can learn to capture the behavior of all these materials accurately and efficiently without human guidance.
arXiv Detail & Related papers (2023-03-21T19:45:59Z) - Neural Abstractions [72.42530499990028]
We present a novel method for the safety verification of nonlinear dynamical models that uses neural networks to represent abstractions of their dynamics.
We demonstrate that our approach performs comparably to the mature tool Flow* on existing benchmark nonlinear models.
arXiv Detail & Related papers (2023-01-27T12:38:09Z) - Learning Physical Dynamics with Subequivariant Graph Neural Networks [99.41677381754678]
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, which is a vital inductive bias accounting for model generalization.
Our model achieves on average over 3% enhancement in contact prediction accuracy across 8 scenarios on Physion and 2X lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z) - A physics-informed deep neural network for surrogate modeling in classical elasto-plasticity [0.0]
We present a deep neural network architecture that can efficiently approximate classical elasto-plastic relations.
The network is enriched with crucial physics aspects of classical elasto-plasticity, including additive decomposition of strains into elastic and plastic parts.
We show that embedding these physics into the architecture of the neural network facilitates a more efficient training of the network with less training data.
arXiv Detail & Related papers (2022-04-26T05:58:13Z) - EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models as well as the data-driven expressability afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z) - Physics-informed neural networks for modeling rate- and temperature-dependent plasticity [3.1861308132183384]
This work presents a physics-informed neural network based framework to model the strain-rate and temperature dependence of the deformation fields in elastic-viscoplastic solids.
arXiv Detail & Related papers (2022-01-20T18:49:27Z) - Sobolev training of thermodynamic-informed neural networks for smoothed elasto-plasticity models with level set hardening [0.0]
We introduce a deep learning framework designed to train smoothed elastoplasticity models with interpretable components.
By recasting the yield function as an evolving level set, we introduce a machine learning approach to predict the solutions of the Hamilton-Jacobi equation.
arXiv Detail & Related papers (2020-10-15T22:43:32Z) - Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach is demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.