Automatically Polyconvex Strain Energy Functions using Neural Ordinary
Differential Equations
- URL: http://arxiv.org/abs/2110.03774v1
- Date: Sun, 3 Oct 2021 13:11:43 GMT
- Title: Automatically Polyconvex Strain Energy Functions using Neural Ordinary
Differential Equations
- Authors: Vahidullah Tac, Francisco S. Costabal, Adrian Buganza Tepole
- Abstract summary: Deep neural networks are able to learn complex material response without the constraints of closed-form approximations.
The N-ODE material model is able to capture synthetic data generated from closed-form material models.
The framework can be used to model a large class of materials.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Data-driven methods are becoming an essential part of computational mechanics
due to their unique advantages over traditional material modeling. Deep neural
networks are able to learn complex material response without the constraints of
closed-form approximations. However, imposing the physics-based mathematical
requirements that any material model must comply with is not straightforward
for data-driven approaches. In this study, we use a novel class of neural
networks, known as neural ordinary differential equations (N-ODEs), to develop
data-driven material models that automatically satisfy polyconvexity of the
strain energy function with respect to the deformation gradient, a condition
needed for the existence of minimizers for boundary value problems in
elasticity. We take advantage of the properties of ordinary differential
equations to create monotonic functions that approximate the derivatives of the
strain energy function with respect to the invariants of the right Cauchy-Green
deformation tensor. The monotonicity of the derivatives guarantees the
convexity of the energy. The N-ODE material model is able to capture synthetic
data generated from closed-form material models, and it outperforms
conventional models when tested against experimental data on skin, a highly
nonlinear and anisotropic material. We also showcase the use of the N-ODE
material model in finite element simulations. The framework is general and can
be used to model a large class of materials. Here we focus on hyperelasticity,
but polyconvex strain energies are a core building block for other problems in
elasticity such as viscous and plastic deformations. We therefore expect our
methodology to further enable data-driven methods in computational mechanics.
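The core construction described in the abstract, modeling each derivative of the strain energy with respect to an invariant as a monotonic function produced by a neural ODE, can be illustrated with a short sketch. The snippet below is not the authors' implementation; the network size, weights, invariant range, and explicit RK4 integrator are illustrative assumptions. It only demonstrates why the time-1 flow map of a scalar ODE is monotone in its initial condition, which is the property that makes the energy convex in that invariant.

```python
# Minimal sketch (not the paper's code): a scalar neural ODE whose flow map
# y0 -> y(1) is monotonically increasing, used as a stand-in for a derivative
# dPsi/dI1 of the strain energy with respect to the first invariant of the
# right Cauchy-Green tensor. Network size, weights, and the invariant range
# below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Tiny fully connected network defining the right-hand side dy/dt = g(y, t).
W1, b1 = 0.5 * rng.normal(size=(8, 2)), np.zeros(8)
W2, b2 = 0.5 * rng.normal(size=(1, 8)), np.zeros(1)

def rhs(y, t):
    """Neural-network right-hand side of the scalar ODE."""
    h = np.tanh(W1 @ np.array([y, t]) + b1)
    return (W2 @ h + b2).item()

def flow_map(y0, n_steps=200, t_final=1.0):
    """Integrate dy/dt = g(y, t) from y(0) = y0 with explicit RK4.
    Trajectories of a scalar ODE with a Lipschitz right-hand side cannot
    cross, so the map y0 -> y(t_final) is monotonically increasing."""
    y, t = float(y0), 0.0
    dt = t_final / n_steps
    for _ in range(n_steps):
        k1 = rhs(y, t)
        k2 = rhs(y + 0.5 * dt * k1, t + 0.5 * dt)
        k3 = rhs(y + 0.5 * dt * k2, t + 0.5 * dt)
        k4 = rhs(y + dt * k3, t + dt)
        y += dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
        t += dt
    return y

# Treat the flow map as a model of dPsi/dI1 and check that it is increasing:
# a monotone first derivative means Psi is convex in I1.
I1 = np.linspace(3.0, 6.0, 50)          # I1 = 3 in the undeformed state
dPsi_dI1 = np.array([flow_map(i) for i in I1])
assert np.all(np.diff(dPsi_dI1) > 0.0), "flow map should be increasing"
```

At the level of the abstract, the paper combines such monotone derivative models for each invariant with the known polyconvexity of the invariants themselves to obtain a polyconvex strain energy.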
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces a novel family of deep dynamical models designed to represent continuous-time sequence data.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experiments on oscillating systems, videos and real-world state sequences (MuJoCo) illustrate that ODEs with the learnable energy-based prior outperform existing counterparts.
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Discovering Interpretable Physical Models using Symbolic Regression and
Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - On the Trade-off Between Efficiency and Precision of Neural Abstraction [62.046646433536104]
Neural abstractions have been recently introduced as formal approximations of complex, nonlinear dynamical models.
We employ formal inductive synthesis procedures to generate neural abstractions that result in dynamical models with these semantics.
arXiv Detail & Related papers (2023-07-28T13:22:32Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Data-driven anisotropic finite viscoelasticity using neural ordinary
differential equations [0.0]
We develop a fully data-driven model of anisotropic finite viscoelasticity using neural ordinary differential equations as building blocks.
We replace the Helmholtz free energy function and the dissipation potential with data-driven functions that satisfy physics-based constraints.
We train the model using stress-strain data from biological and synthetic materials including human brain tissue, blood clots, natural rubber and human myocardium.
arXiv Detail & Related papers (2023-01-11T17:03:46Z) - Calibrating constitutive models with full-field data via physics
informed neural networks [0.0]
We propose a physics-informed deep-learning framework for the discovery of model parameterizations given full-field displacement data.
We work with the weak form of the governing equations rather than the strong form to impose physical constraints upon the neural network predictions.
We demonstrate that informed machine learning is an enabling technology and may shift the paradigm of how full-field experimental data is utilized to calibrate models.
arXiv Detail & Related papers (2022-03-30T18:07:44Z) - Learning Deep Implicit Fourier Neural Operators (IFNOs) with
Applications to Heterogeneous Material Modeling [3.9181541460605116]
We propose to use data-driven modeling to predict a material's response without using conventional models.
The material response is modeled by learning the implicit mappings between loading conditions and the resultant displacement and/or damage fields.
We demonstrate the performance of our proposed method for a number of examples, including hyperelastic, anisotropic and brittle materials.
arXiv Detail & Related papers (2022-03-15T19:08:13Z) - Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z) - Polyconvex anisotropic hyperelasticity with neural networks [1.7616042687330642]
Polyconvex machine learning based models for finite deformations are proposed.
The models are calibrated with highly challenging simulation data of cubic lattice metamaterials.
The data generation for the data-driven approach is based on mechanical considerations and does not require additional experimental or simulation capabilities.
arXiv Detail & Related papers (2021-06-20T15:33:31Z)