Predicting and explaining nonlinear material response using deep Physically Guided Neural Networks with Internal Variables
- URL: http://arxiv.org/abs/2308.03915v1
- Date: Mon, 7 Aug 2023 21:20:24 GMT
- Title: Predicting and explaining nonlinear material response using deep Physically Guided Neural Networks with Internal Variables
- Authors: Javier Orera-Echeverria, Jacobo Ayensa-Jiménez, Manuel Doblaré
- Abstract summary: We use the concept of Physically Guided Neural Networks with Internal Variables (PGNNIV) to discover constitutive laws.
PGNNIVs make particular use of the physics of the problem to enforce constraints on specific hidden layers.
We demonstrate that PGNNIVs are capable of predicting both internal and external variables under unseen load scenarios.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Nonlinear materials are often difficult to model with classical state model theory: their physical and mathematical description may be complex or inaccurate, or we may simply not know how to describe them in terms of relations between external and internal variables. In many disciplines, Neural Network methods have emerged as powerful tools for identifying highly complex and nonlinear correlations. In this work, we use the recently developed concept of Physically Guided Neural Networks with Internal Variables (PGNNIV) to discover constitutive laws with a model-free approach, training solely on measured force-displacement data. PGNNIVs make particular use of the physics of the problem to enforce constraints on specific hidden layers and are able to make predictions without internal variable data. We demonstrate that PGNNIVs are capable of predicting both internal and external variables under unseen load scenarios, regardless of the nature of the material considered (linear, nonlinear with hardening or softening behavior, or hyperelastic), while also unravelling the constitutive law of the material and thereby explaining its nature, which places the method within eXplainable Artificial Intelligence (XAI).
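To make the scheme concrete, below is a minimal PGNNIV-style sketch for a 1D bar. It is an illustrative reconstruction, not the authors' code: the network size, the node count, the equilibrium penalty weight lam, and the synthetic linear-elastic training data are all assumptions made for this example. What it shows is the essential mechanism: the stress layer is never supervised directly, but is pinned down jointly by the measured end forces and the equilibrium constraint imposed on it.

```python
# Hypothetical PGNNIV-style training loop for a 1D bar: the "explanatory"
# subnetwork H maps strain to stress, and equilibrium is imposed as a penalty
# on that hidden (stress) layer, so only force-displacement data is needed.
import torch
import torch.nn as nn

N, L, AREA = 32, 1.0, 1.0                       # nodes, length, cross-section
dx = L / (N - 1)

H = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),  # explanatory net: strain -> stress
                  nn.Linear(32, 32), nn.Tanh(),
                  nn.Linear(32, 1))
opt = torch.optim.Adam(H.parameters(), lr=1e-3)

def loss_fn(u, f_meas, lam=10.0):
    eps = (u[:, 1:] - u[:, :-1]) / dx           # strains from measured displacements
    sigma = H(eps.unsqueeze(-1)).squeeze(-1)    # hidden "internal variable" layer
    f_pred = AREA * sigma[:, -1]                # measurable output: end force
    data_loss = ((f_pred - f_meas) ** 2).mean()
    # Physics constraint on the hidden layer: equilibrium d(sigma)/dx = 0.
    eq = ((sigma[:, 1:] - sigma[:, :-1]) / dx) ** 2
    return data_loss + lam * eq.mean()

# Synthetic force-displacement training data (assumed linear-elastic, E = 2):
E_true = 2.0
f_meas = torch.rand(256) * 2 - 1                # applied end forces
x = torch.linspace(0, L, N)
u_meas = f_meas[:, None] / (E_true * AREA) * x  # exact bar solution u = F x / (E A)

for step in range(2000):
    opt.zero_grad()
    loss_fn(u_meas, f_meas).backward()
    opt.step()

# The stress layer was never supervised; querying H over a strain range now
# reads off the discovered constitutive law sigma(eps), the explanatory step.
```

Because only (u, F) pairs enter the loss, the constitutive curve recovered from H is a genuine discovery rather than a fit to internal-variable labels.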
Related papers
- Learning Neural Constitutive Laws From Motion Observations for Generalizable PDE Dynamics
Many NN approaches learn an end-to-end model that implicitly captures both the governing PDE and the material model.
We argue that the governing PDEs are often well-known and should be explicitly enforced rather than learned.
We introduce a new framework termed "Neural Constitutive Laws" (NCLaw), which uses a network architecture that strictly guarantees standard priors.
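Under simplified assumptions (a 1D chain discretization, a hypothetical stress_net, explicit Euler stepping), this separation of known dynamics from the learned constitutive map might be sketched as follows; this is an illustration of the idea, not the NCLaw implementation.

```python
# The momentum balance is hard-coded because the PDE is known; only the
# strain-to-stress map is a learnable network.
import torch
import torch.nn as nn

stress_net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

def step(u, v, dt=1e-3, dx=0.1, rho=1.0):
    """One explicit step of the *known* 1D momentum balance
    rho * dv/dt = d(sigma)/dx; only sigma(eps) is learned."""
    eps = (u[1:] - u[:-1]) / dx                       # element strains
    sigma = stress_net(eps.unsqueeze(-1)).squeeze(-1)
    f = torch.zeros_like(u)                           # nodal d(sigma)/dx
    f[:-1] += sigma / dx
    f[1:] -= sigma / dx
    v_new = v + dt * f / rho
    return u + dt * v_new, v_new

u1, v1 = step(torch.zeros(16), torch.zeros(16))       # one simulated step
# Training would unroll step() and match observed node trajectories; since the
# PDE is hard-coded, the network cannot absorb dynamics it should not learn.
```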
arXiv Detail & Related papers (2023-04-27T17:42:24Z)
- Learning solution of nonlinear constitutive material models using physics-informed neural networks: COMM-PINN
We apply physics-informed neural networks to solve the constitutive relations for nonlinear, path-dependent material behavior.
One advantage of this work is that it bypasses the repetitive Newton iterations needed to solve nonlinear equations in complex material models.
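A hedged sketch of that idea: train a network offline to zero out the residual of an implicit constitutive equation, then replace the per-point Newton solve with a single forward pass. The Ramberg-Osgood-like law and the constants E and K below are illustrative assumptions, not values from the paper.

```python
# Train sigma_hat(eps) so the residual of the implicit constitutive law
# vanishes at sampled strain collocation points; inference needs no Newton.
import torch
import torch.nn as nn

E, K = 200.0, 50.0                       # assumed modulus and hardening constant
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    eps = torch.rand(512, 1) * 0.05      # strain collocation points
    sigma = net(eps)
    # Residual of the implicit law eps = sigma/E + (sigma/K)^3; Newton's
    # method would normally solve this at every material point.
    residual = sigma / E + (sigma / K) ** 3 - eps
    loss = (residual ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# At run time, net(eps) is a single forward pass: no Newton iterations needed.
```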
arXiv Detail & Related papers (2023-04-10T19:58:49Z)
- Learning Physical Dynamics with Subequivariant Graph Neural Networks
Graph Neural Networks (GNNs) have become a prevailing tool for learning physical dynamics.
Physical laws abide by symmetry, a vital inductive bias for model generalization.
Our model achieves an average improvement of over 3% in contact prediction accuracy across 8 scenarios on Physion and 2x lower rollout MSE on RigidFall.
arXiv Detail & Related papers (2022-10-13T10:00:30Z)
- Evolution TANN and the discovery of the internal variables and evolution equations in solid mechanics
We propose a new approach which makes it possible, for the first time, to decouple the material representation from the incremental formulation.
Inspired by the Thermodynamics-based Artificial Neural Networks (TANN) and the theory of internal variables, the evolution TANN (eTANN) are continuous-time networks.
A key feature of the proposed approach is the discovery of the evolution equations of the internal variables in the form of ordinary differential equations.
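A minimal sketch of what such ODE-form evolution equations can look like in code, assuming a single scalar internal variable and an explicit Euler integrator (simplifications made for this sketch, not the eTANN formulation):

```python
# The evolution of an internal variable z is posed as dz/dt = f_theta(eps, z)
# with f_theta a network; the Euler rollout is differentiable end to end.
import torch
import torch.nn as nn

f_theta = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))

def rollout(eps_path, z0, dt=1e-2):
    """Integrate dz/dt = f_theta(eps, z) along a strain history."""
    z, zs = z0, []
    for eps in eps_path:                  # eps_path: (T, 1) strain sequence
        z = z + dt * f_theta(torch.cat([eps, z], dim=-1))
        zs.append(z)
    return torch.stack(zs)                # internal-variable trajectory, (T, 1)

zs = rollout(torch.linspace(0, 0.05, 100).unsqueeze(-1), torch.zeros(1))
# Trained end to end (e.g., against stress reconstructed from (eps, z)),
# f_theta itself is the discovered evolution equation, in ODE form.
```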
arXiv Detail & Related papers (2022-09-27T09:25:55Z)
- Equivariant Graph Mechanics Networks with Constraints
We propose the Graph Mechanics Network (GMN), which is efficient, equivariant, and constraint-aware.
GMN represents the forward kinematics information (positions and velocities) of a structural object through generalized coordinates.
Extensive experiments support the advantages of GMN compared to the state-of-the-art GNNs in terms of prediction accuracy, constraint satisfaction and data efficiency.
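A toy illustration of the generalized-coordinate idea (constructed for this listing, not taken from the paper): describing a rigid rod by a single hinge angle makes the length constraint hold by construction, while forward kinematics recovers Cartesian positions and velocities.

```python
# Two-particle rigid rod described by one angle q: the rod-length constraint
# can never be violated, and positions/velocities follow from kinematics.
import torch

LENGTH = 1.0   # assumed rod length

def forward_kinematics(q, q_dot):
    """Generalized coordinate (hinge angle) -> Cartesian position/velocity."""
    x = LENGTH * torch.stack([torch.cos(q), torch.sin(q)])
    # Chain rule: dx/dt = (dx/dq) * dq/dt.
    x_dot = LENGTH * torch.stack([-torch.sin(q), torch.cos(q)]) * q_dot
    return x, x_dot

x, x_dot = forward_kinematics(torch.tensor(0.3), torch.tensor(0.1))
assert torch.isclose(x.norm(), torch.tensor(LENGTH))  # constraint by construction
```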
arXiv Detail & Related papers (2022-03-12T14:22:14Z)
- EINNs: Epidemiologically-Informed Neural Networks
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
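One common way to realize such a hybrid, sketched here as an assumption rather than EINN's actual architecture, is to keep the mechanistic SIR structure and let a network supply a flexible, time-varying transmission rate:

```python
# Hypothetical mechanistic/NN hybrid: the SIR compartment structure is a
# known prior, while a network provides the transmission rate beta(t).
import torch
import torch.nn as nn

beta_net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1),
                         nn.Softplus())        # enforce beta(t) > 0
GAMMA = 0.1                                    # assumed recovery rate

def sir_step(s, i, r, t, dt=1.0):
    """One Euler step of the SIR ODEs with a learned transmission rate."""
    beta = beta_net(t.view(1, 1)).squeeze()
    ds = -beta * s * i
    di = beta * s * i - GAMMA * i
    dr = GAMMA * i
    return s + dt * ds, i + dt * di, r + dt * dr

s, i, r = torch.tensor(0.99), torch.tensor(0.01), torch.tensor(0.0)
s, i, r = sir_step(s, i, r, torch.tensor(0.0))
# Rolling sir_step forward and fitting the infected curve to case data would
# train beta_net while the epidemic mechanics stay hard-coded.
```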
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Neural NID Rules
We introduce Neural NID, a method that learns abstract object properties and relations between objects with a suitably regularized graph neural network.
We validate the greater generalization capability of Neural NID on simple benchmarks specifically designed to assess the transition dynamics learned by the model.
arXiv Detail & Related papers (2022-02-12T10:47:06Z)
- Meta-learning using privileged information for dynamics
We extend the Neural ODE Process model to use additional information within the Learning Using Privileged Information setting.
We validate our extension with experiments showing improved accuracy and calibration on simulated dynamics tasks.
arXiv Detail & Related papers (2021-04-29T12:18:02Z)
- On the application of Physically-Guided Neural Networks with Internal Variables to Continuum Problems
We present Physically-Guided Neural Networks with Internal Variables (PGNNIV), in which universal physical laws are used as constraints in the neural network, such that some neuron values can be interpreted as internal state variables of the system.
This endows the network with unravelling capacity, as well as better predictive properties such as faster convergence, lower data requirements, and additional noise filtering.
We extend this new methodology to continuum physical problems, showing again its predictive and explanatory capacities when only using measurable values in the training set.
arXiv Detail & Related papers (2020-11-23T13:06:52Z)
- Identification of state functions by physically-guided neural networks with physically-meaningful internal layers
We use the concept of physically-constrained neural networks (PCNN) to predict the input-output relation in a physical system.
We show that this approach, besides getting physically-based predictions, accelerates the training process.
arXiv Detail & Related papers (2020-11-17T11:26:37Z)
- Developing Constrained Neural Units Over Time
This paper focuses on an alternative way of defining Neural Networks that differs from the majority of existing approaches.
The structure of the neural architecture is defined by means of a special class of constraints, which are also extended to the interaction with data.
The proposed theory is cast into the time domain, in which data are presented to the network in an ordered manner.
arXiv Detail & Related papers (2020-09-01T09:07:25Z)