Evolution TANN and the discovery of the internal variables and evolution
equations in solid mechanics
- URL: http://arxiv.org/abs/2209.13269v1
- Date: Tue, 27 Sep 2022 09:25:55 GMT
- Title: Evolution TANN and the discovery of the internal variables and evolution
equations in solid mechanics
- Authors: Filippo Masi, Ioannis Stefanou
- Abstract summary: We propose a new approach which, for the first time, decouples the material representation from the incremental formulation.
Inspired by the Thermodynamics-based Artificial Neural Networks (TANN) and the theory of internal variables, the evolution TANN (eTANN) are continuous-time.
A key feature of the proposed approach is the discovery of the evolution equations of the internal variables in the form of ordinary differential equations.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Data-driven and deep learning approaches have demonstrated the
potential to replace classical constitutive models for complex materials that
display path-dependency and possess multiple inherent scales. Yet the need to
structure constitutive models with an incremental formulation has given rise
to data-driven approaches in which physical quantities, e.g. deformation,
blend with artificial, non-physical ones, such as increments in deformation
and time. The resulting neural networks and constitutive models therefore
depend on the particular incremental formulation, fail to identify material
representations locally in time, and suffer from poor generalization. Here, we
propose a new approach which, for the first time, decouples the material
representation from the incremental formulation. Inspired by the
Thermodynamics-based Artificial Neural Networks (TANN) and the theory of
internal variables, the evolution TANN (eTANN) are continuous-time and thus
independent of the aforementioned artificial quantities. A key feature of the
proposed approach is the discovery of the evolution equations of the internal
variables in the form of ordinary differential equations, rather than in an
incremental, discrete-time form. In this work, we focus on showing how the
various general notions of solid mechanics are implemented in eTANN. The laws
of thermodynamics are hardwired into the structure of the network and
guarantee predictions that are always thermodynamically consistent. We propose
a methodology for discovering, from data and first principles, admissible sets
of internal variables from the microscopic fields in complex materials. The
capabilities and scalability of the proposed approach are demonstrated through
several applications spanning a broad spectrum of complex material behaviors,
from plasticity to damage and viscosity.
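The core idea of the abstract, evolving internal variables in continuous time via learned ordinary differential equations, can be illustrated with a minimal sketch. This is not the authors' implementation: the tiny network `f_theta`, its sizes, and the Euler integrator are illustrative assumptions, but they show how dz/dt = f_theta(eps, z) keeps the material representation free of strain or time increments.

```python
# Minimal sketch (not the authors' eTANN code): internal variables z evolve
# via a learned ODE  dz/dt = f_theta(eps, z), integrated in continuous time,
# so no artificial strain/time increments enter the representation.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network f_theta: (strain, internal variables) -> dz/dt.
W1 = rng.normal(scale=0.1, size=(8, 3))   # input: 1 strain + 2 internal vars
b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(2, 8))   # output: dz/dt for 2 internal vars

def f_theta(eps, z):
    h = np.tanh(W1 @ np.concatenate(([eps], z)) + b1)
    return W2 @ h

def evolve(eps_path, t, z0):
    """Integrate dz/dt = f_theta(eps(t), z) with explicit Euler."""
    z = np.array(z0, dtype=float)
    traj = [z.copy()]
    for k in range(1, len(t)):
        dt = t[k] - t[k - 1]
        z = z + dt * f_theta(eps_path[k - 1], z)
        traj.append(z.copy())
    return np.array(traj)

t = np.linspace(0.0, 1.0, 101)
eps_path = 0.02 * np.sin(2 * np.pi * t)   # imposed strain history
traj = evolve(eps_path, t, z0=[0.0, 0.0])
print(traj.shape)  # (101, 2)
```

Because the learned object is the rate f_theta rather than an update rule z_{k+1} = g(z_k, Δeps, Δt), any time integrator and step size can be swapped in after training, which is the decoupling the abstract describes.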
Related papers
- Learning Divergence Fields for Shift-Robust Graph Representations [73.11818515795761]
In this work, we propose a geometric diffusion model with learnable divergence fields for the challenging problem with interdependent data.
We derive a new learning objective through causal inference, which can guide the model to learn generalizable patterns of interdependence that are insensitive across domains.
arXiv Detail & Related papers (2024-06-07T14:29:21Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Predicting and explaining nonlinear material response using deep Physically Guided Neural Networks with Internal Variables [0.0]
We use the concept of Physically Guided Neural Networks with Internal Variables (PGNNIV) to discover laws.
PGNNIVs make a particular use of the physics of the problem to enforce constraints on specific hidden layers.
We demonstrate that PGNNIVs are capable of predicting both internal and external variables under unseen load scenarios.
arXiv Detail & Related papers (2023-08-07T21:20:24Z)
- DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion [66.21290235237808]
We introduce an energy constrained diffusion model which encodes a batch of instances from a dataset into evolutionary states.
We provide rigorous theory that implies closed-form optimal estimates for the pairwise diffusion strength among arbitrary instance pairs.
Experiments highlight the wide applicability of our model as a general-purpose encoder backbone with superior performance in various tasks.
arXiv Detail & Related papers (2023-01-23T15:18:54Z)
- Generalized Neural Closure Models with Interpretability [28.269731698116257]
We develop a novel and versatile methodology of unified neural partial delay differential equations.
We augment existing/low-fidelity dynamical models directly in their partial differential equation (PDE) forms with both Markovian and non-Markovian neural network (NN) closure parameterizations.
We demonstrate the new generalized neural closure models (gnCMs) framework using four sets of experiments based on advecting nonlinear waves, shocks, and ocean acidification models.
arXiv Detail & Related papers (2023-01-15T21:57:43Z)
- Scientific Machine Learning for Modeling and Simulating Complex Fluids [0.0]
Rheological equations relate internal stresses and deformations in complex fluids.
Data-driven models provide accessible alternatives to expensive first-principles models.
Development of similar models for complex fluids has lagged.
arXiv Detail & Related papers (2022-10-10T04:35:31Z)
- Equivariant Graph Mechanics Networks with Constraints [83.38709956935095]
We propose Graph Mechanics Network (GMN) which is efficient, equivariant and constraint-aware.
GMN represents, by generalized coordinates, the forward kinematics information (positions and velocities) of a structural object.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- A deep learning driven pseudospectral PCE based FFT homogenization algorithm for complex microstructures [68.8204255655161]
It is shown that the proposed method is able to predict central moments of interest while being magnitudes faster to evaluate than traditional approaches.
arXiv Detail & Related papers (2021-10-26T07:02:14Z)
- Thermodynamics-based Artificial Neural Networks (TANN) for multiscale modeling of materials with inelastic microstructure [0.0]
Multiscale, homogenization approaches are often used for performing reliable, accurate predictions of the macroscopic mechanical behavior of inelastic materials.
Data-driven approaches based on deep learning have risen as a promising alternative to replace ad-hoc laws and speed-up numerical methods.
Here, we propose Thermodynamics-based Artificial Neural Networks (TANN) for the modeling of mechanical materials with inelastic and complex microstructure.
arXiv Detail & Related papers (2021-08-30T11:50:38Z)
- Augmenting Physical Models with Deep Networks for Complex Dynamics Forecasting [34.61959169976758]
APHYNITY is a principled approach for augmenting incomplete physical dynamics described by differential equations with deep data-driven models.
It decomposes the dynamics into two components: a physical component accounting for the dynamics for which some prior knowledge is available, and a data-driven component accounting for the errors of the physical model.
arXiv Detail & Related papers (2020-10-09T09:31:03Z)
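The physical-plus-residual decomposition described in the APHYNITY entry above can be sketched in a few lines. This is a hedged illustration, not APHYNITY's API: the damped-oscillator prior, the untrained linear map standing in for the learned residual, and the Euler step are all assumptions made for the example.

```python
# Hedged sketch of augmenting known physics with a data-driven residual:
# dx/dt = f_physical(x) + f_data(x). Names and models are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(scale=0.05, size=(2, 2))  # stand-in for a trained NN residual

def f_physical(x):
    # Prior knowledge: damped linear oscillator, dx/dt = [x1, -x0 - 0.1*x1].
    return np.array([x[1], -x[0] - 0.1 * x[1]])

def f_data(x):
    # Data-driven correction for the physical model's error (untrained here).
    return A @ x

def step(x, dt=0.01):
    # One explicit Euler step of the combined dynamics.
    return x + dt * (f_physical(x) + f_data(x))

x = np.array([1.0, 0.0])
for _ in range(100):
    x = step(x)
print(x.shape)  # (2,)
```

In the actual method, the residual is trained jointly with the physical parameters under a norm penalty so the physics explains as much of the dynamics as possible; the sketch only shows the additive structure.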
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.