Predicting Grain Growth in Polycrystalline Materials Using Deep Learning Time Series Models
- URL: http://arxiv.org/abs/2511.11630v1
- Date: Fri, 07 Nov 2025 18:29:42 GMT
- Title: Predicting Grain Growth in Polycrystalline Materials Using Deep Learning Time Series Models
- Authors: Eliane Younes, Elie Hachem, Marc Bernacki
- Abstract summary: Grain growth strongly influences the mechanical behavior of materials, making its prediction a key objective in microstructural engineering. In this study, several deep learning approaches were evaluated, including recurrent neural networks (RNN), long short-term memory (LSTM), temporal convolutional networks (TCN), and transformers. The LSTM network achieved the highest accuracy (above 90%) and the most stable performance, maintaining physically consistent predictions over extended horizons.
- Score: 0.9558392439655014
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Grain growth strongly influences the mechanical behavior of materials, making its prediction a key objective in microstructural engineering. In this study, several deep learning approaches were evaluated, including recurrent neural networks (RNN), long short-term memory (LSTM), temporal convolutional networks (TCN), and transformers, to forecast grain size distributions during grain growth. Unlike full-field simulations, which are computationally demanding, the present work relies on mean-field statistical descriptors extracted from high-fidelity simulations. A dataset of 120 grain growth sequences was processed into normalized grain size distributions as a function of time. The models were trained to predict future distributions from a short temporal history using a recursive forecasting strategy. Among the tested models, the LSTM network achieved the highest accuracy (above 90%) and the most stable performance, maintaining physically consistent predictions over extended horizons while reducing computation time from about 20 minutes per sequence to only a few seconds, whereas the other architectures tended to diverge when forecasting further in time. These results highlight the potential of low-dimensional descriptors and LSTM-based forecasting for efficient and accurate microstructure prediction, with direct implications for digital twin development and process optimization.
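The recursive forecasting strategy described in the abstract can be sketched as follows. The `model` callable, the window shape, and the renormalization step are illustrative assumptions, not details taken from the paper: any one-step predictor that maps a short history of normalized grain size distributions to the next distribution can be rolled forward by feeding its own predictions back in.

```python
import numpy as np

def recursive_forecast(model, history, n_steps):
    """Roll a one-step model forward by feeding predictions back in.

    model   : callable mapping a (history_len, n_bins) array of normalized
              grain size distributions to the next (n_bins,) distribution
              (a hypothetical stand-in for a trained LSTM).
    history : list of (n_bins,) arrays, the short temporal seed.
    n_steps : number of future distributions to forecast.
    """
    window = list(history)
    preds = []
    for _ in range(n_steps):
        nxt = model(np.asarray(window))
        nxt = np.clip(nxt, 0.0, None)
        nxt = nxt / nxt.sum()        # renormalize so it stays a distribution
        preds.append(nxt)
        window = window[1:] + [nxt]  # slide the history window
    return np.asarray(preds)
```

Renormalizing after each step is one simple way to keep long-horizon rollouts physically consistent (non-negative, summing to one), which is the failure mode the paper reports for the diverging architectures.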
Related papers
- SEMPO: Lightweight Foundation Models for Time Series Forecasting [45.456949943052116]
SEMPO is a lightweight foundation model that requires pretraining on relatively small-scale data, yet exhibits strong general time series forecasting performance. SEMPO comprises two key modules, including an energy-aware SpEctral decomposition module that substantially improves the utilization of pre-training data. Experiments on two large-scale benchmarks covering 16 datasets demonstrate the superior performance of SEMPO in both zero-shot and few-shot forecasting scenarios.
arXiv Detail & Related papers (2025-10-22T15:58:44Z) - Short-Term Regional Electricity Demand Forecasting in Argentina Using LSTM Networks [0.0]
This study presents the development and optimization of a deep learning model to predict short-term hourly electricity demand in Córdoba, Argentina. The model achieved high predictive precision, with a mean absolute percentage error of 3.20% and a determination coefficient of 0.95.
arXiv Detail & Related papers (2025-09-19T19:20:49Z) - rETF-semiSL: Semi-Supervised Learning for Neural Collapse in Temporal Data [44.17657834678967]
We propose a novel semi-supervised pre-training strategy to enforce latent representations that satisfy the Neural Collapse phenomenon. We show that our method significantly outperforms previous pretext tasks when applied to LSTMs, transformers, and state-space models.
arXiv Detail & Related papers (2025-08-13T19:16:47Z) - High-fidelity Grain Growth Modeling: Leveraging Deep Learning for Fast Computations [0.0]
We introduce a machine learning framework that combines a Convolutional Long Short-Term Memory network with an Autoencoder to efficiently predict grain growth evolution. Results demonstrated that our machine learning approach accelerates grain growth prediction by a factor of up to 89.
arXiv Detail & Related papers (2025-05-08T15:43:40Z) - MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and conditional discriminator to optimize prediction results at the coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z) - FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
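The core idea of performing matrix multiplications in Fourier space can be illustrated with a minimal NumPy sketch. The `kernel` filter below is a hypothetical learned parameter, and this toy omits the hypervariate graph construction and temporal modeling of the actual FourierGNN; it only shows why the frequency-domain view is cheap: a dense circulant mixing matrix over n nodes becomes an elementwise product after an FFT along the node axis.

```python
import numpy as np

def fourier_mix(X, kernel):
    """Mix node features by elementwise multiplication in Fourier space.

    X      : (n_nodes, n_features) feature matrix.
    kernel : (n_nodes,) hypothetical learned frequency-domain filter.
    Replaces an O(n^2) circulant matmul with O(n) products per feature
    (plus the FFT cost).
    """
    Xf = np.fft.fft(X, axis=0)          # to frequency domain, node axis
    Yf = kernel[:, None] * Xf           # elementwise "matmul" per frequency
    return np.fft.ifft(Yf, axis=0).real # back to the node domain
```

With an all-ones kernel the operator is the identity, and a kernel that keeps only the DC component averages each feature over the nodes, which makes the filtering interpretation easy to check.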
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - $\clubsuit$ CLOVER $\clubsuit$: Probabilistic Forecasting with Coherent Learning Objective Reparameterization [42.215158938066054]
We augment an MQForecaster neural network architecture with a modified multivariate Gaussian factor model that achieves coherence by construction. We call our method the Coherent Learning Objective Reparameterization Neural Network (CLOVER). In comparison to state-of-the-art coherent forecasting methods, CLOVER achieves significant improvements in scaled CRPS forecast accuracy, with average gains of 15%.
arXiv Detail & Related papers (2023-07-19T07:31:37Z) - Discovering Predictable Latent Factors for Time Series Forecasting [39.08011991308137]
We develop a novel framework for inferring the intrinsic latent factors implied by the observable time series.
We introduce three characteristics, i.e., predictability, sufficiency, and identifiability, and model these characteristics via the powerful deep latent dynamics models.
Empirical results on multiple real datasets show the efficiency of our method for different kinds of time series forecasting.
arXiv Detail & Related papers (2023-03-18T14:37:37Z) - Probabilistic AutoRegressive Neural Networks for Accurate Long-range Forecasting [6.295157260756792]
We introduce the Probabilistic AutoRegressive Neural Networks (PARNN).
PARNN is capable of handling complex time series data exhibiting non-stationarity, nonlinearity, non-seasonality, long-range dependence, and chaotic patterns.
We evaluate the performance of PARNN against standard statistical, machine learning, and deep learning models, including Transformers, NBeats, and DeepAR.
arXiv Detail & Related papers (2022-04-01T17:57:36Z) - Deep learning-based multi-output quantile forecasting of PV generation [34.51430520593065]
This paper develops probabilistic PV forecasters by taking advantage of recent breakthroughs in deep learning.
A tailored forecasting tool, based on an encoder-decoder architecture, is implemented to compute intraday multi-output PV quantile forecasts.
The models are trained using quantile regression, a non-parametric approach.
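Quantile regression trains a model by minimizing the pinball (quantile) loss, which penalizes under- and over-prediction asymmetrically so the fitted output converges to the target quantile rather than the mean. A minimal NumPy version, with illustrative names:

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball loss for quantile level q in (0, 1).

    Under-prediction (y_true > y_pred) is weighted by q,
    over-prediction by (1 - q), so minimizing it drives y_pred
    toward the q-th conditional quantile of y_true.
    """
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))
```

For q = 0.9, under-predicting by one unit costs 0.9 while over-predicting by one unit costs only 0.1, which is why the fitted forecast sits near the 90th percentile; the loss is non-parametric in the sense that no distributional form is assumed for the PV output.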
arXiv Detail & Related papers (2021-06-02T16:28:10Z) - Dynamic Gaussian Mixture based Deep Generative Model For Robust Forecasting on Sparse Multivariate Time Series [43.86737761236125]
We propose a novel generative model, which tracks the transition of latent clusters, instead of isolated feature representations.
It is characterized by a newly designed dynamic Gaussian mixture distribution, which captures the dynamics of clustering structures.
A structured inference network is also designed for enabling inductive analysis.
arXiv Detail & Related papers (2021-03-03T04:10:07Z) - An autoencoder wavelet based deep neural network with attention mechanism for multistep prediction of plant growth [4.077787659104315]
This paper presents a novel approach for predicting plant growth in agriculture, focusing on prediction of plant Stem Diameter Variations (SDV).
Wavelet decomposition is applied to the original data to facilitate model fitting and reduce noise.
An encoder-decoder framework is developed using Long Short Term Memory (LSTM) and used for appropriate feature extraction from the data.
A recurrent neural network including LSTM and an attention mechanism is proposed for modelling long-term dependencies in the time series data.
arXiv Detail & Related papers (2020-12-07T20:30:39Z) - Industrial Forecasting with Exponentially Smoothed Recurrent Neural Networks [0.0]
We present a class of exponential smoothed recurrent neural networks (RNNs) which are well suited to modeling non-stationary dynamical systems arising in industrial applications.
Application of exponentially smoothed RNNs to forecasting electricity load, weather data, and stock prices highlight the efficacy of exponential smoothing of the hidden state for multi-step time series forecasting.
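Exponential smoothing of the hidden state can be sketched as an RNN cell whose carried state is an exponential moving average of successive raw hidden states. The plain tanh cell and the weight names below are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def smoothed_rnn_step(h_prev, x, Wx, Wh, b, alpha):
    """One step of an exponentially smoothed RNN cell.

    h_prev : (d,) previous smoothed hidden state.
    x      : (m,) current input.
    alpha  : smoothing factor in [0, 1]; alpha=1 recovers a plain RNN,
             smaller alpha low-pass filters the hidden dynamics, which
             helps with non-stationary multi-step forecasting.
    """
    h_raw = np.tanh(Wx @ x + Wh @ h_prev + b)   # standard RNN update
    return alpha * h_raw + (1.0 - alpha) * h_prev  # EMA of the hidden state
```

The smoothing acts as a learnable-strength low-pass filter on the state trajectory, damping the step-to-step jitter that otherwise compounds over multi-step rollouts.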
arXiv Detail & Related papers (2020-04-09T17:53:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.