Fully Convolutional Spatiotemporal Learning for Microstructure Evolution Prediction
- URL: http://arxiv.org/abs/2602.19915v1
- Date: Mon, 23 Feb 2026 14:55:28 GMT
- Title: Fully Convolutional Spatiotemporal Learning for Microstructure Evolution Prediction
- Authors: Michael Trimboli, Mohammed Alsubaie, Sirani M. Perera, Ke-Gang Wang, Xianqi Li,
- Abstract summary: Traditional simulation methods are expensive because they must solve complex partial differential equations at fine resolutions. We propose a deep learning framework that accelerates microstructural evolution predictions while maintaining high accuracy. Compared to recurrent neural architectures, our model achieves state-of-the-art predictive performance with significantly reduced computational cost in both training and inference.
- Score: 0.5437050212139087
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding and predicting microstructure evolution is fundamental to materials science, as it governs the resulting properties and performance of materials. Traditional simulation methods, such as phase-field models, offer high-fidelity results but are computationally expensive due to the need to solve complex partial differential equations at fine spatiotemporal resolutions. To address this challenge, we propose a deep learning-based framework that accelerates microstructure evolution predictions while maintaining high accuracy. Our approach utilizes a fully convolutional spatiotemporal model trained in a self-supervised manner using sequential images generated from simulations of microstructural processes, including grain growth and spinodal decomposition. The trained neural network effectively learns the underlying physical dynamics and can accurately capture both short-term local behaviors and long-term statistical properties of evolving microstructures, while also demonstrating generalization to unseen spatiotemporal domains and variations in configuration and material parameters. Compared to recurrent neural architectures, our model achieves state-of-the-art predictive performance with significantly reduced computational cost in both training and inference. This work establishes a robust baseline for spatiotemporal learning in materials science and offers a scalable, data-driven alternative for fast and reliable microstructure simulations.
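The link between phase-field solvers and convolutional networks can be made concrete with a toy example (our illustration, not the paper's implementation; all names and parameters are hypothetical): one explicit time step of an Allen-Cahn-type phase-field update, whose finite-difference Laplacian is a fixed 3x3 convolution stencil, which is why fully convolutional networks are a natural surrogate for such dynamics.

```python
# Toy sketch (not the paper's code): one explicit Euler step of an
# Allen-Cahn-type update, phi_t = kappa * lap(phi) + phi - phi**3,
# on a periodic 2D grid. The 5-point Laplacian is a fixed convolution.

def laplacian(phi):
    """Periodic 5-point finite-difference Laplacian (grid spacing = 1)."""
    n, m = len(phi), len(phi[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            out[i][j] = (phi[(i - 1) % n][j] + phi[(i + 1) % n][j]
                         + phi[i][(j - 1) % m] + phi[i][(j + 1) % m]
                         - 4.0 * phi[i][j])
    return out

def allen_cahn_step(phi, dt=0.1, kappa=1.0):
    """phi_new = phi + dt * (kappa * lap(phi) + phi - phi**3)."""
    lap = laplacian(phi)
    return [[p + dt * (kappa * l + p - p ** 3)
             for p, l in zip(row_p, row_l)]
            for row_p, row_l in zip(phi, lap)]

# A uniform field has zero Laplacian, so only the reaction term acts,
# driving the order parameter toward +1:
phi0 = [[0.5] * 4 for _ in range(4)]
phi1 = allen_cahn_step(phi0)
```

A learned fully convolutional model replaces this hand-coded stencil with trained kernels but keeps the same locality and translation invariance.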
Related papers
- Scalable Spatio-Temporal SE(3) Diffusion for Long-Horizon Protein Dynamics [51.85385061275941]
Molecular dynamics (MD) simulations remain the gold standard for studying protein dynamics. Recent generative models have shown promise in accelerating simulations, yet they struggle with long-horizon generation. We present STAR-MD, a scalable diffusion model that generates physically plausible protein trajectories over micro-scale timescales.
arXiv Detail & Related papers (2026-02-02T14:13:28Z)
- A Physics-Informed U-net-LSTM Network for Data-Driven Seismic Response Modeling of Structures [0.0]
Recent developments in deep learning have shown promise in reducing the computational cost of nonlinear seismic analysis of structures. We propose a novel Physics-Informed U-Net-LSTM framework that integrates physical laws with deep learning to enhance both accuracy and efficiency.
arXiv Detail & Related papers (2025-11-26T11:05:42Z)
- Deep Learning-Driven Prediction of Microstructure Evolution via Latent Space Interpolation [0.0]
Phase-field models accurately simulate microstructure evolution, but their dependence on solving complex differential equations makes them computationally expensive. This work achieves a significant acceleration via a novel deep learning-based framework that couples a Variational Autoencoder (CVAE) with Cubic Spline Interpolation and Spherical Linear Interpolation (SLERP). We demonstrate the method for binary spinodal decomposition by predicting microstructure evolution for intermediate alloy compositions from a limited set of training compositions.
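The latent-space interpolation mentioned here can be sketched with the standard SLERP formula (a generic illustration, not the paper's code; the function name and example vectors are ours):

```python
import math

def slerp(p0, p1, t):
    """Spherical linear interpolation between vectors p0 and p1, t in [0, 1]."""
    dot = sum(a * b for a, b in zip(p0, p1))
    n0 = math.sqrt(sum(a * a for a in p0))
    n1 = math.sqrt(sum(b * b for b in p1))
    # Angle between the two latent vectors (clamped for numerical safety).
    theta = math.acos(max(-1.0, min(1.0, dot / (n0 * n1))))
    if theta < 1e-8:  # nearly parallel: fall back to linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(p0, p1)]
    s = math.sin(theta)
    w0 = math.sin((1 - t) * theta) / s
    w1 = math.sin(t * theta) / s
    return [w0 * a + w1 * b for a, b in zip(p0, p1)]

# Midpoint between two orthogonal unit vectors stays on the unit circle:
mid = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
```

Unlike plain linear interpolation, SLERP preserves the norm of interpolated latent codes, which is one reason it is popular for traversing generative-model latent spaces.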
arXiv Detail & Related papers (2025-08-03T16:22:15Z)
- evoxels: A differentiable physics framework for voxel-based microstructure simulations [41.94295877935867]
The differentiable physics framework evoxels is built on a fully Pythonic, unified voxel-based approach that integrates segmented 3D microscopy data, physical simulations, inverse modeling, and machine learning.
arXiv Detail & Related papers (2025-07-29T12:29:15Z)
- Computational, Data-Driven, and Physics-Informed Machine Learning Approaches for Microstructure Modeling in Metal Additive Manufacturing [0.0]
Metal additive manufacturing enables unprecedented design freedom and the production of customized, complex components. The rapid melting and solidification dynamics inherent to metal AM processes generate heterogeneous, non-equilibrium microstructures. Predicting microstructure and its evolution across spatial and temporal scales remains a central challenge for process optimization and defect mitigation.
arXiv Detail & Related papers (2025-05-02T17:59:54Z)
- Teaching Artificial Intelligence to Perform Rapid, Resolution-Invariant Grain Growth Modeling via Fourier Neural Operator [0.0]
Microstructural evolution plays a critical role in shaping the physical, optical, and electronic properties of materials. Traditional phase-field modeling accurately simulates these phenomena but is computationally intensive. This study introduces a novel approach utilizing a Fourier Neural Operator (FNO) to achieve resolution-invariant modeling.
arXiv Detail & Related papers (2025-03-18T11:19:08Z)
- Generalized Factor Neural Network Model for High-dimensional Regression [50.554377879576066]
We tackle the challenges of modeling high-dimensional data sets with latent low-dimensional structures hidden within complex, non-linear, and noisy relationships. Our approach enables a seamless integration of concepts from non-parametric regression, factor models, and neural networks for high-dimensional regression.
arXiv Detail & Related papers (2025-02-16T23:13:55Z)
- A Realistic Simulation Framework for Analog/Digital Neuromorphic Architectures [73.65190161312555]
ARCANA is a software spiking neural network simulator designed to account for the properties of mixed-signal neuromorphic circuits. We show how the results obtained provide a reliable estimate of the behavior of a spiking neural network trained in software once it is deployed in hardware.
arXiv Detail & Related papers (2024-09-23T11:16:46Z)
- Extreme time extrapolation capabilities and thermodynamic consistency of physics-inspired Neural Networks for the 3D microstructure evolution of materials via Cahn-Hilliard flow [0.0]
A Convolutional Recurrent Neural Network (CRNN) is trained to reproduce the evolution of the spinodal decomposition process in three dimensions.
A specialized, physics-inspired architecture is shown to yield close agreement between the predicted evolutions and the ground-truth ones.
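The Cahn-Hilliard flow referenced above can be sketched in one dimension (our toy example, not the paper's CRNN; the composition profile and parameters are hypothetical): because the update is the Laplacian of a chemical potential, total composition is conserved, one of the thermodynamic-consistency properties the title refers to.

```python
import math

# Toy 1D Cahn-Hilliard step: c_t = M * lap(mu), mu = c**3 - c - kappa * lap(c).

def lap1d(u):
    """Periodic second difference (grid spacing = 1)."""
    n = len(u)
    return [u[(i - 1) % n] - 2.0 * u[i] + u[(i + 1) % n] for i in range(n)]

def cahn_hilliard_step(c, dt=1e-3, M=1.0, kappa=1.0):
    lap_c = lap1d(c)
    mu = [ci ** 3 - ci - kappa * li for ci, li in zip(c, lap_c)]
    lap_mu = lap1d(mu)
    return [ci + dt * M * li for ci, li in zip(c, lap_mu)]

# A small sinusoidal perturbation around the mean composition:
c = [0.5 + 0.1 * math.sin(2 * math.pi * i / 16) for i in range(16)]
c_next = cahn_hilliard_step(c)
```

A surrogate network that respects this conservative structure cannot drift in total composition, which helps explain the long-time extrapolation behavior the abstract highlights.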
arXiv Detail & Related papers (2024-07-29T15:55:52Z)
- Fast and Reliable Probabilistic Reflectometry Inversion with Prior-Amortized Neural Posterior Estimation [73.81105275628751]
Finding all structures compatible with reflectometry data is computationally prohibitive for standard algorithms.
We address this lack of reliability with a probabilistic deep learning method that identifies all realistic structures in seconds.
Our method, Prior-Amortized Neural Posterior Estimation (PANPE), combines simulation-based inference with novel adaptive priors.
arXiv Detail & Related papers (2024-07-26T10:29:16Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep active learning framework for accelerating stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP based models.
The results demonstrate that STNP outperforms the baselines in the learning setting and that LIG achieves state-of-the-art performance for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.