Sobolev training of thermodynamic-informed neural networks for smoothed
elasto-plasticity models with level set hardening
- URL: http://arxiv.org/abs/2010.11265v1
- Date: Thu, 15 Oct 2020 22:43:32 GMT
- Title: Sobolev training of thermodynamic-informed neural networks for smoothed
elasto-plasticity models with level set hardening
- Authors: Nikolaos N. Vlassis and WaiChing Sun
- Abstract summary: We introduce a deep learning framework designed to train smoothed elastoplasticity models with interpretable components.
By recasting the yield function as an evolving level set, we introduce a machine learning approach to predict the solutions of the Hamilton-Jacobi equation.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a deep learning framework designed to train smoothed
elastoplasticity models with interpretable components, such as a smoothed
stored elastic energy function, a yield surface, and a plastic flow that are
evolved based on a set of deep neural network predictions. By recasting the
yield function as an evolving level set, we introduce a machine learning
approach to predict the solutions of the Hamilton-Jacobi equation that governs
the hardening mechanism. This machine learning hardening law may recover
classical hardening models and discover new mechanisms that are otherwise very
difficult to anticipate and hand-craft. This treatment enables us to use
supervised machine learning to generate models that are thermodynamically
consistent, interpretable, but also exhibit excellent learning capacity. Using
a 3D FFT solver to create a polycrystal database, we conduct numerical
experiments and individually verify the implementation of each model
component. These experiments reveal that this new approach
provides more robust and accurate forward predictions of cyclic stress paths
than those obtained from black-box deep neural network models such as a
recurrent GRU neural network, a 1D convolutional neural network, and a
multi-step feedforward model.
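To make the training objective concrete, here is a minimal sketch of a Sobolev-style loss for a neural stored-energy function, in which the stress is recovered as the strain-gradient of the predicted energy and both the energy and the stress are fitted to data. The network, tensor shapes, and dataset below are illustrative assumptions, not the authors' implementation; the level-set hardening itself would be evolved separately, e.g. via a Hamilton-Jacobi equation of the generic form $\partial f/\partial t + v |\nabla f| = 0$.

import torch

# Hypothetical setup: strain and stress in 6-component Voigt notation.
energy_net = torch.nn.Sequential(
    torch.nn.Linear(6, 64), torch.nn.Softplus(),  # Softplus keeps psi smooth
    torch.nn.Linear(64, 64), torch.nn.Softplus(),
    torch.nn.Linear(64, 1),
)

def sobolev_loss(strain, energy_true, stress_true):
    strain = strain.clone().requires_grad_(True)
    psi = energy_net(strain).squeeze(-1)              # predicted stored energy
    # Stress as the strain-gradient of the energy (thermodynamic consistency).
    stress = torch.autograd.grad(psi.sum(), strain, create_graph=True)[0]
    value_term = torch.mean((psi - energy_true) ** 2)
    gradient_term = torch.mean((stress - stress_true) ** 2)
    return value_term + gradient_term

In the paper's setting a higher-order term would also constrain the tangent stiffness (the second derivative of the energy); only the first-order term is shown here for brevity.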
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Neural Residual Diffusion Models for Deep Scalable Vision Generation [17.931568104324985]
We propose a unified and massively scalable Neural Residual Diffusion Models framework (Neural-RDM).
The proposed neural residual models obtain state-of-the-art scores on image and video generative benchmarks.
arXiv Detail & Related papers (2024-06-19T04:57:18Z)
- Data-driven low-dimensional model of a sedimenting flexible fiber [0.0]
This work describes a data-driven technique to create high-fidelity low-dimensional models of flexible fiber dynamics using machine learning.
The approach uses an autoencoder neural network architecture to learn a low-dimensional latent representation of the filament shape.
We show that our data-driven model can accurately forecast the evolution of a fiber at both trained and untrained elasto-gravitational numbers.
arXiv Detail & Related papers (2024-05-16T21:07:09Z)
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Physics-Informed Neural Networks with Hard Linear Equality Constraints [9.101849365688905]
This work proposes a novel physics-informed neural network, KKT-hPINN, which rigorously guarantees hard linear equality constraints.
Experiments on Aspen models of a stirred-tank reactor unit, an extractive distillation subsystem, and a chemical plant demonstrate that this model can further enhance prediction accuracy (a generic sketch of such a hard-constraint projection appears after this list).
arXiv Detail & Related papers (2024-02-11T17:40:26Z)
- Extreme sparsification of physics-augmented neural networks for interpretable model discovery in mechanics [0.0]
We propose to train regularized physics-augmented neural network-based models utilizing a smoothed version of $L_0$-regularization.
We show that the method can reliably obtain interpretable and trustworthy models for compressible and incompressible hyperelasticity, yield functions, and hardening models for elastoplasticity (a generic smoothed-$L_0$ sketch appears after this list).
arXiv Detail & Related papers (2023-10-05T16:28:58Z)
- On the Trade-off Between Efficiency and Precision of Neural Abstraction [62.046646433536104]
Neural abstractions have been recently introduced as formal approximations of complex, nonlinear dynamical models.
We employ formal inductive synthesis procedures to generate neural abstractions that result in dynamical models with these semantics.
arXiv Detail & Related papers (2023-07-28T13:22:32Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real time to multi-dimensional scattering data (a generic sketch of this surrogate-inversion pattern appears after this list).
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce EINNs, a new class of physics-informed neural networks crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z)
- On Energy-Based Models with Overparametrized Shallow Neural Networks [44.74000986284978]
Energy-based models (EBMs) are a powerful framework for generative modeling.
In this work we focus on shallow neural networks.
We show that models trained in the so-called "active" regime provide a statistical advantage over their associated "lazy" or kernel regime.
arXiv Detail & Related papers (2021-04-15T15:34:58Z)
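As noted above for the hard-linear-equality-constraint entry, a generic way to guarantee $Ax = b$ exactly on network outputs is the closed-form orthogonal projection $x = z - A^T (A A^T)^{-1} (A z - b)$. The module below is a sketch of that projection only, not the KKT-hPINN authors' code; A and b are placeholder constraint data.

import torch

class AffineProjection(torch.nn.Module):
    # Projects network outputs z onto the affine set {x : A x = b}.
    def __init__(self, A, b):
        super().__init__()
        self.register_buffer("A", A)          # shape (m, n)
        self.register_buffer("b", b)          # shape (m,)
        self.register_buffer("pinv", A.T @ torch.linalg.inv(A @ A.T))

    def forward(self, z):
        # x = z - A^T (A A^T)^{-1} (A z - b) satisfies A x = b exactly.
        return z - (z @ self.A.T - self.b) @ self.pinv.T

Appending this layer to any network makes the constraint hold by construction, independently of how well the network is trained.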
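For the extreme-sparsification entry, one common way to smooth the $L_0$ pseudo-norm is a Gaussian-type surrogate $\|w\|_0 \approx \sum_i (1 - \exp(-w_i^2 / 2\beta^2))$; the function below implements that generic surrogate, which may differ from the specific smoothing used in that paper.

import torch

def smoothed_l0(params, beta=0.01):
    # Each term tends to 1 for |w| >> beta and to 0 as w -> 0, so the sum
    # differentiably approximates the number of nonzero parameters.
    return sum((1.0 - torch.exp(-p ** 2 / (2.0 * beta ** 2))).sum()
               for p in params)

# Hypothetical usage: loss = data_loss + 1e-4 * smoothed_l0(model.parameters())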
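Finally, for the implicit-neural-representation entry, the pattern of recovering unknown parameters by differentiating through a fixed, pre-trained surrogate can be sketched generically as below; the surrogate architecture, dimensions, and data are placeholders.

import torch

# Pretend this surrogate was pre-trained to map 2 physical parameters to a
# 100-point observable (e.g. a simulated spectrum).
surrogate = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 100))
for p in surrogate.parameters():
    p.requires_grad_(False)                    # keep the surrogate frozen

theta = torch.zeros(2, requires_grad=True)     # unknown physical parameters
optimizer = torch.optim.Adam([theta], lr=1e-2)
measured = torch.randn(100)                    # stand-in for experimental data

for _ in range(500):
    optimizer.zero_grad()
    loss = torch.mean((surrogate(theta) - measured) ** 2)
    loss.backward()                            # gradients flow to theta only
    optimizer.step()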