Recurrent neural networks and transfer learning for elasto-plasticity in woven composites
- URL: http://arxiv.org/abs/2311.13434v2
- Date: Thu, 7 Dec 2023 14:59:02 GMT
- Title: Recurrent neural networks and transfer learning for elasto-plasticity in woven composites
- Authors: Ehsan Ghane, Martin Fagerström, and Mohsen Mirkhalaf
- Abstract summary: This article presents Recurrent Neural Network (RNN) models as a surrogate for computationally intensive meso-scale simulations of woven composites.
A mean-field model generates a comprehensive data set representing elasto-plastic behavior.
In simulations, arbitrary six-dimensional strain histories are used to predict stresses, with random-walk loading as the source task and cyclic loading conditions as the target task.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This article presents Recurrent Neural Network (RNN) models as a
surrogate for computationally intensive meso-scale simulations of woven
composites. Transfer learning is leveraged to address the initialization
challenges and sparse-data issues inherent in cyclic shear strain loads.
A mean-field model generates a comprehensive data set representing
elasto-plastic behavior. In simulations, arbitrary six-dimensional strain
histories are used to predict stresses, with random-walk loading as the
source task and cyclic loading conditions as the target task. Incorporating
sub-scale material properties enhances the versatility of the RNN. To
achieve accurate predictions, a grid search is used to tune the network
architecture and hyper-parameter configuration. The results demonstrate
that transfer learning effectively adapts the RNN to varying strain
conditions, establishing its potential as a useful tool for modeling
path-dependent responses in woven composites.
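
The recipe described in the abstract, a recurrent stress surrogate pre-trained on random-walk strain paths and then fine-tuned on cyclic ones, can be sketched as follows. This is a minimal illustration assuming a GRU backbone in PyTorch; all names, shapes, hyper-parameters, and the weight-freezing transfer strategy are assumptions for illustration, not the authors' released code.

```python
import torch
import torch.nn as nn

class StressRNN(nn.Module):
    """GRU surrogate: strain history + sub-scale properties -> stress history."""
    def __init__(self, n_strain=6, n_props=4, hidden=64, n_stress=6):
        super().__init__()
        self.gru = nn.GRU(n_strain + n_props, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, n_stress)

    def forward(self, strain, props):
        # strain: (batch, steps, 6); props: (batch, n_props), repeated per step
        props = props.unsqueeze(1).expand(-1, strain.size(1), -1)
        h, _ = self.gru(torch.cat([strain, props], dim=-1))
        return self.head(h)  # (batch, steps, 6) stress history

def fit(model, batches, epochs, lr):
    opt = torch.optim.Adam(
        filter(lambda p: p.requires_grad, model.parameters()), lr=lr)
    for _ in range(epochs):
        for strain, props, stress in batches:
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(strain, props), stress)
            loss.backward()
            opt.step()
    return model

# Source task: plentiful random-walk strain paths (random stand-in data here).
source = [(torch.randn(8, 100, 6), torch.rand(8, 4), torch.randn(8, 100, 6))]
model = fit(StressRNN(), source, epochs=2, lr=1e-3)

# Target task: sparse cyclic-loading data. Freezing the recurrent layers and
# fine-tuning the output head at a lower rate is one common transfer choice;
# the paper may transfer the pre-trained weights differently.
for p in model.gru.parameters():
    p.requires_grad = False
target = [(torch.randn(2, 100, 6), torch.rand(2, 4), torch.randn(2, 100, 6))]
model = fit(model, target, epochs=2, lr=1e-4)
```

The grid search mentioned in the abstract can be layered on top of this; the search space below is purely illustrative, and in practice the score would come from held-out validation paths rather than the training batch.

```python
from itertools import product

best_val, best_cfg = float("inf"), None
for hidden, lr in product([32, 64, 128], [1e-3, 3e-4]):
    candidate = fit(StressRNN(hidden=hidden), source, epochs=2, lr=lr)
    with torch.no_grad():
        val = nn.functional.mse_loss(candidate(*source[0][:2]), source[0][2]).item()
    if val < best_val:
        best_val, best_cfg = val, (hidden, lr)
```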
Related papers
- Physically recurrent neural network for rate and path-dependent heterogeneous materials in a finite strain framework (arXiv, 2024-04-05)
  A hybrid physics-based, data-driven surrogate model for the microscale analysis of
  heterogeneous materials is investigated. The proposed model benefits from the
  physics-based knowledge of the full-order micromodel by embedding its constitutive
  models in a neural network.
- Iterative self-transfer learning: A general methodology for response time-history prediction based on small dataset (arXiv, 2023-06-14)
  An iterative self-transfer learning method for training neural networks on small
  datasets is proposed. The results show that the method can improve model
  performance by nearly an order of magnitude on small datasets.
- Thermodynamically Consistent Machine-Learned Internal State Variable Approach for Data-Driven Modeling of Path-Dependent Materials (arXiv, 2022-05-01)
  Data-driven machine learning models, such as deep neural networks and recurrent
  neural networks (RNNs), have become viable alternatives. This study proposes a
  machine-learned, data-driven modeling approach for path-dependent materials based
  on measurable material states.
- An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions (arXiv, 2022-04-18)
  We study the capability of artificial neural network models to emulate storm surge
  based on the storm track/size/intensity history. This study presents a neural
  network model that can predict storm surge, informed by a database of synthetic
  storm simulations.
- EINNs: Epidemiologically-Informed Neural Networks (arXiv, 2022-02-21)
  We introduce EINNs, a new class of physics-informed neural networks crafted for
  epidemic forecasting. We investigate how to leverage both the theoretical
  flexibility provided by mechanistic models and the data-driven expressibility
  afforded by AI models.
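
  The mechanistic/data-driven hybrid this summary alludes to is commonly realised as a physics-informed loss: fit observations while penalising disagreement with a mechanistic model. The sketch below uses a generic SIR residual with assumed rates; it illustrates the general pattern, not necessarily the paper's exact formulation.

  ```python
  import torch
  import torch.nn as nn

  # Network maps time to (S, I, R) compartment values; shape is an assumption.
  net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 3), nn.Softplus())
  beta, gamma = 0.3, 0.1  # assumed transmission / recovery rates

  def physics_informed_loss(t, observed_I):
      t = t.requires_grad_(True)
      S, I, R = net(t).unbind(dim=-1)  # R is implied by conservation, unused below
      # Time derivatives of the network outputs via autograd.
      dS = torch.autograd.grad(S.sum(), t, create_graph=True)[0].squeeze(-1)
      dI = torch.autograd.grad(I.sum(), t, create_graph=True)[0].squeeze(-1)
      # SIR residuals: dS/dt = -beta*S*I and dI/dt = beta*S*I - gamma*I
      physics = ((dS + beta * S * I) ** 2
                 + (dI - beta * S * I + gamma * I) ** 2).mean()
      data = ((I - observed_I) ** 2).mean()
      return data + physics

  t = torch.linspace(0, 1, 50).unsqueeze(-1)
  loss = physics_informed_loss(t, observed_I=torch.rand(50))
  loss.backward()
  ```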
- CRNNTL: convolutional recurrent neural network and transfer learning for QSAR modelling (arXiv, 2021-09-07)
  We propose the convolutional recurrent neural network and transfer learning
  (CRNNTL) method for QSAR modelling. Our strategy takes advantage of both
  convolutional and recurrent neural networks for feature extraction, as well as a
  data augmentation method.
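
  As a rough picture of conv-then-recurrent feature extraction, the sketch below runs a 1-D convolution over token embeddings of a molecular string and lets a GRU summarise the result for a property-prediction head. The vocabulary size, layer widths, and single regression target are assumptions, not the paper's architecture.

  ```python
  import torch
  import torch.nn as nn

  class ConvGRU(nn.Module):
      def __init__(self, vocab=40, embed=32, channels=64, hidden=64):
          super().__init__()
          self.embed = nn.Embedding(vocab, embed)
          self.conv = nn.Conv1d(embed, channels, kernel_size=3, padding=1)
          self.gru = nn.GRU(channels, hidden, batch_first=True)
          self.head = nn.Linear(hidden, 1)  # one regression target, e.g. activity

      def forward(self, tokens):                  # tokens: (batch, length)
          x = self.embed(tokens).transpose(1, 2)  # (batch, embed, length) for Conv1d
          x = torch.relu(self.conv(x)).transpose(1, 2)
          _, h = self.gru(x)                      # final hidden state as summary
          return self.head(h[-1])

  model = ConvGRU()
  pred = model(torch.randint(0, 40, (8, 120)))    # 8 molecules, 120 tokens -> (8, 1)
  ```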
- Regularized Sequential Latent Variable Models with Adversarial Neural Networks (arXiv, 2021-08-10)
  We present different ways of using high-level latent random variables in RNNs to
  model the variability in sequential data, and explore adversarial methods for
  training a variational RNN model.
- Deep Bayesian Active Learning for Accelerating Stochastic Simulation (arXiv, 2021-06-05)
  Interactive Neural Process (INP) is a deep active learning framework for
  stochastic simulations. For active learning, we propose a novel acquisition
  function, Latent Information Gain (LIG), calculated in the latent space of
  NP-based models. The results demonstrate that STNP outperforms the baselines in
  the learning setting and that LIG achieves the state of the art for active
  learning.
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning (arXiv, 2021-03-17)
  We present PredRNN, a new recurrent network for learning visual dynamics from
  historical context. We show that our approach obtains highly competitive results
  on three standard datasets.
- A Fully Tensorized Recurrent Neural Network (arXiv, 2020-10-08)
  We introduce a "fully tensorized" RNN architecture which jointly encodes the
  separate weight matrices within each recurrent cell. This approach reduces model
  size by several orders of magnitude, while still maintaining similar or better
  performance compared to standard RNNs.
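
  The compression idea, jointly encoding a cell's weight matrices, can be illustrated with a much simpler stand-in: a vanilla recurrent cell whose concatenated input-to-hidden and hidden-to-hidden weights are stored as two low-rank factors. The paper itself uses tensor-train cores; this two-factor version only captures the spirit.

  ```python
  import torch
  import torch.nn as nn

  class LowRankRNNCell(nn.Module):
      """Vanilla tanh RNN cell whose joint [W_x | W_h] matrix is stored as two
      low-rank factors, so parameter count scales with the rank, not the width."""
      def __init__(self, n_in, n_hidden, rank=8):
          super().__init__()
          d = n_in + n_hidden
          self.U = nn.Parameter(torch.randn(d, rank) / d ** 0.5)
          self.V = nn.Parameter(torch.randn(rank, n_hidden) / rank ** 0.5)
          self.bias = nn.Parameter(torch.zeros(n_hidden))

      def forward(self, x, h):
          # One joint low-rank product replaces separate input and recurrent matrices.
          return torch.tanh(torch.cat([x, h], dim=-1) @ self.U @ self.V + self.bias)

  cell = LowRankRNNCell(n_in=6, n_hidden=64)
  h = torch.zeros(8, 64)
  h = cell(torch.randn(8, 6), h)  # one recurrent step for a batch of 8
  ```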
- Flexible Transmitter Network (arXiv, 2020-04-08)
  Current neural networks are mostly built upon the MP (McCulloch-Pitts) model,
  which usually formulates the neuron as executing an activation function on the
  real-valued weighted aggregation of signals received from other neurons. We
  propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model
  with flexible synaptic plasticity, and present the Flexible Transmitter Network
  (FTNet), built on the most common fully-connected feed-forward architecture.