Predicting and Interpreting Energy Barriers of Metallic Glasses with Graph Neural Networks
- URL: http://arxiv.org/abs/2401.08627v3
- Date: Wed, 4 Sep 2024 03:53:59 GMT
- Title: Predicting and Interpreting Energy Barriers of Metallic Glasses with Graph Neural Networks
- Authors: Haoyu Li, Shichang Zhang, Longwen Tang, Mathieu Bauchy, Yizhou Sun
- Abstract summary: Metallic Glasses (MGs) are widely used materials that are stronger than steel while being shapeable as plastic.
We utilize Graph Neural Networks (GNNs) to model MGs and study EBs.
We contribute a new dataset for EB prediction and a novel Symmetrized GNN (SymGNN) model that is E(3)-invariant in expectation.
- Score: 40.33075223431487
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Metallic Glasses (MGs) are widely used materials that are stronger than steel while being shapeable as plastic. While understanding the structure-property relationship of MGs remains a challenge in materials science, studying their energy barriers (EBs) as an intermediary step shows promise. In this work, we utilize Graph Neural Networks (GNNs) to model MGs and study EBs. We contribute a new dataset for EB prediction and a novel Symmetrized GNN (SymGNN) model that is E(3)-invariant in expectation. SymGNN handles invariance by aggregating over orthogonal transformations of the graph structure. When applied to EB prediction, SymGNN is more accurate than molecular dynamics (MD) local-sampling methods and other machine-learning models. Compared to precise MD simulations, SymGNN reduces the inference time on new MGs from roughly 41 days to less than one second. We apply explanation algorithms to reveal the relationship between structures and EBs. The structures that we identify through explanations match the medium-range order (MRO) hypothesis and possess unique topological properties. Our work enables effective prediction and interpretation of MG EBs, bolstering materials science research.
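The abstract states that SymGNN achieves E(3)-invariance in expectation by aggregating over orthogonal transformations of the graph structure. The sketch below is a minimal, hedged reading of that idea, not the authors' implementation: node coordinates are centered, rotated by randomly sampled orthogonal matrices, scored by a placeholder distance-based model, and the predictions are averaged. The scorer, feature dimensions, and sample count are illustrative assumptions.

```python
import numpy as np

def random_orthogonal(dim=3, rng=None):
    """Sample a random orthogonal matrix via QR decomposition of a Gaussian matrix."""
    rng = rng or np.random.default_rng()
    q, r = np.linalg.qr(rng.standard_normal((dim, dim)))
    return q * np.sign(np.diag(r))  # sign fix for a uniform distribution over O(dim)

def toy_energy_barrier_model(coords, edges, weights):
    """Placeholder scorer standing in for a learned GNN: aggregates edge displacement
    features per node and reads out a single scalar (assumed, not SymGNN itself)."""
    src, dst = edges[:, 0], edges[:, 1]
    disp = coords[dst] - coords[src]
    feats = np.concatenate([disp, np.linalg.norm(disp, axis=1, keepdims=True)], axis=1)
    node_msg = np.zeros((coords.shape[0], feats.shape[1]))
    np.add.at(node_msg, dst, feats)               # sum incoming edge messages per node
    return float(node_msg.sum(axis=0) @ weights)  # scalar "energy barrier" readout

def symmetrized_prediction(coords, edges, weights, n_samples=32, rng=None):
    """Average predictions over sampled orthogonal transformations of the coordinates,
    so the output is E(3)-invariant in expectation (translation handled by centering)."""
    rng = rng or np.random.default_rng(0)
    centered = coords - coords.mean(axis=0)
    preds = []
    for _ in range(n_samples):
        Q = random_orthogonal(3, rng)
        preds.append(toy_energy_barrier_model(centered @ Q.T, edges, weights))
    return float(np.mean(preds))

# Tiny usage example on a random 5-atom configuration.
rng = np.random.default_rng(42)
coords = rng.standard_normal((5, 3))
edges = np.array([(i, j) for i in range(5) for j in range(5) if i != j])
weights = rng.standard_normal(4)
print(symmetrized_prediction(coords, edges, weights))
```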
Related papers
- Graph neural network framework for energy mapping of hybrid monte-carlo molecular dynamics simulations of Medium Entropy Alloys [0.0]
The present study proposes a graph-based representation for modeling medium-entropy alloys (MEAs).
Hybrid Monte-Carlo molecular dynamics (MC/MD) simulations are employed to achieve thermally stable structures across various annealing temperatures in an MEA.
These simulations generate dump files and potential energy labels, which are used to construct graph representations of the atomic configurations.
These graphs then serve as input to a Graph Convolutional Neural Network (GCNN)-based ML model that predicts the system's potential energy; a minimal graph-construction sketch follows.
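As a rough illustration of the graph-construction step described above, the sketch below builds a cutoff-radius neighbor graph from atomic coordinates of the kind found in an MD dump, yielding node and edge arrays that a GCNN-style model could consume. The cutoff value, lack of periodic-boundary handling, and one-hot node features are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def radius_graph(positions, species, cutoff=3.5):
    """Build a simple cutoff-radius graph from atomic positions.

    positions : (N, 3) Cartesian coordinates (no periodic boundaries here, assumed).
    species   : (N,) integer atom types, used as minimal node features.
    cutoff    : neighbor distance threshold in the same units as positions (assumed).
    Returns node features, an (E, 2) edge-index array, and per-edge distances.
    """
    diff = positions[:, None, :] - positions[None, :, :]   # pairwise displacements
    dist = np.linalg.norm(diff, axis=-1)                   # pairwise distances
    src, dst = np.nonzero((dist < cutoff) & (dist > 0.0))  # exclude self-loops
    edge_index = np.stack([src, dst], axis=1)
    node_feats = np.eye(int(species.max()) + 1)[species]   # one-hot atom type
    return node_feats, edge_index, dist[src, dst]

# Toy usage: a handful of atoms of two species.
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 6.0, size=(8, 3))
spec = rng.integers(0, 2, size=8)
x, edges, d = radius_graph(pos, spec, cutoff=3.0)
print(x.shape, edges.shape, d.shape)
```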
arXiv Detail & Related papers (2024-11-20T19:22:40Z)
- Do Graph Neural Networks Work for High Entropy Alloys? [12.002942104379986]
High-entropy alloys (HEAs) lack chemical long-range order, limiting the applicability of current graph representations.
We introduce the LESets machine learning model, an accurate, interpretable GNN for HEA property prediction.
We demonstrate the accuracy of LESets in modeling the mechanical properties of quaternary HEAs.
arXiv Detail & Related papers (2024-08-29T08:20:02Z)
- Band-gap regression with architecture-optimized message-passing neural networks [1.9590152885845324]
We train an MPNN on density functional theory data from the AFLOW database to first classify materials as metallic or semiconducting/insulating.
We then perform a neural-architecture search over the model architecture and hyperparameter space of MPNNs to predict the band gaps of the materials identified as non-metals.
The top-performing models from the search are pooled into an ensemble that significantly outperforms existing models from the literature.
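The two-stage workflow described here (classify metals vs. non-metals, then regress band gaps only for the non-metals, ensembling the best regressors) can be sketched with generic scikit-learn estimators standing in for the architecture-optimized MPNNs; the synthetic feature matrix, model choices, and ensemble size below are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingRegressor

rng = np.random.default_rng(0)

# Stand-in data: X plays the role of material representations, is_metal the
# DFT-derived label, and gap the band gap (zero for metals).
X = rng.standard_normal((500, 16))
is_metal = (X[:, 0] + 0.3 * rng.standard_normal(500) > 0).astype(int)
gap = np.where(is_metal == 1, 0.0, np.abs(X[:, 1]) + 0.1 * rng.standard_normal(500))

# Stage 1: classify metallic vs. semiconducting/insulating (MPNN classifier stand-in).
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, is_metal)

# Stage 2: regress band gaps on predicted non-metals only, with a small ensemble
# standing in for the top models found by the neural-architecture search.
nonmetal = clf.predict(X) == 0
ensemble = [
    GradientBoostingRegressor(random_state=seed).fit(X[nonmetal], gap[nonmetal])
    for seed in range(3)
]
pred_gap = np.mean([m.predict(X[nonmetal]) for m in ensemble], axis=0)
print(pred_gap[:5])
```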
arXiv Detail & Related papers (2023-09-12T16:13:10Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
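A minimal sketch of the diffusion-wavelet construction that geometric scattering modules commonly build on, assuming the usual lazy-random-walk formulation: wavelets are differences of dyadic powers of the diffusion operator, and a LEGS-style module would replace the fixed dyadic rule with trainable combination weights. The graph, scales, and signal here are illustrative, not taken from the paper.

```python
import numpy as np

def lazy_random_walk(adj):
    """P = 0.5 * (I + A D^{-1}), the lazy random-walk diffusion operator."""
    deg = adj.sum(axis=0)
    return 0.5 * (np.eye(adj.shape[0]) + adj / deg)

def dyadic_wavelets(adj, num_scales=4):
    """Fixed dyadic wavelets Psi_j = P^(2^(j-1)) - P^(2^j); LEGS would instead learn
    how to combine powers of P rather than using this fixed selection."""
    P = lazy_random_walk(adj)
    powers = [np.linalg.matrix_power(P, 2 ** j) for j in range(num_scales + 1)]
    return [powers[j - 1] - powers[j] for j in range(1, num_scales + 1)]

# Toy usage: band-pass filter a node signal on a small ring graph.
n = 8
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0
signal = np.sin(np.linspace(0, 2 * np.pi, n, endpoint=False))
for j, Psi in enumerate(dyadic_wavelets(adj, num_scales=3), start=1):
    print(f"scale {j}: filtered-signal norm = {np.linalg.norm(Psi @ signal):.3f}")
```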
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that works directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
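To make the end-to-end setup concrete, here is a minimal, framework-free sketch of one message-passing layer followed by a permutation-invariant sum readout for a scalar molecular property. Dimensions, the nonlinearity, and the random weights are illustrative stand-ins rather than the architectures used in the paper's examples.

```python
import numpy as np

def message_passing_layer(h, edge_index, W_msg, W_self):
    """One round of message passing: sum neighbor messages, combine with self features."""
    src, dst = edge_index[:, 0], edge_index[:, 1]
    messages = h[src] @ W_msg                 # transform sender features
    agg = np.zeros_like(h @ W_msg)
    np.add.at(agg, dst, messages)             # sum incoming messages per node
    return np.tanh(h @ W_self + agg)          # update node states

def readout(h, w_out):
    """Permutation-invariant sum readout to a scalar property."""
    return float(h.sum(axis=0) @ w_out)

# Toy molecule: 4 atoms with 3-dim features and bonds stored in both directions.
rng = np.random.default_rng(1)
h = rng.standard_normal((4, 3))
edges = np.array([[0, 1], [1, 0], [1, 2], [2, 1], [2, 3], [3, 2], [3, 0], [0, 3]])
W_msg, W_self = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
w_out = rng.standard_normal(3)
h = message_passing_layer(h, edges, W_msg, W_self)
print("predicted property:", readout(h, w_out))
```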
arXiv Detail & Related papers (2022-07-25T11:30:44Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have a powerful capability to embed the rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Formula graph self-attention network for representation-domain independent materials discovery [3.67735033631952]
We introduce a new concept of formula graph which unifies both stoichiometry-only and structure-based material descriptors.
We develop a self-attention integrated GNN that assimilates a formula graph and show that the proposed architecture produces material embeddings transferable between the two domains.
Our model substantially outperforms previous structure-based GNNs as well as structure-agnostic counterparts.
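As a rough illustration of the formula-graph idea (a hedged reading, not the paper's exact construction): a composition string alone can be turned into a fully connected graph whose nodes are the constituent elements weighted by stoichiometric fraction, so the same graph machinery can run whether or not a crystal structure is available.

```python
import re
import numpy as np

def formula_graph(formula):
    """Build a fully connected 'formula graph' from a composition string such as 'Fe2O3'.

    Nodes are the elements, node weights are stoichiometric fractions, and every pair
    of elements is connected (structure-agnostic by construction; illustrative only).
    """
    tokens = re.findall(r"([A-Z][a-z]?)(\d*\.?\d*)", formula)
    elements = [el for el, _ in tokens]
    counts = np.array([float(c) if c else 1.0 for _, c in tokens])
    fractions = counts / counts.sum()
    n = len(elements)
    edge_index = np.array([(i, j) for i in range(n) for j in range(n) if i != j])
    return elements, fractions, edge_index

# Usage: stoichiometry-only input, no crystal structure required.
els, frac, edges = formula_graph("Fe2O3")
print(els, frac, edges.shape)
```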
arXiv Detail & Related papers (2022-01-14T19:49:45Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
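A minimal sketch of that stochastic augmentation, under the assumption suggested by the title: random Gaussian node embeddings are propagated through the graph so that their inner products reflect multi-hop proximity, and the result can be concatenated with ordinary GNN features. The propagation rule, dimensions, and hop count below are illustrative.

```python
import numpy as np

def normalized_adjacency(adj):
    """Symmetrically normalized adjacency with self-loops, as used in many GNNs."""
    a = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def stochastic_positional_features(adj, dim=8, hops=2, rng=None):
    """Propagate random Gaussian node embeddings; inner products of the result tend to
    reflect multi-hop proximity and can complement learned, proximity-blind features."""
    rng = rng or np.random.default_rng(0)
    E = rng.standard_normal((adj.shape[0], dim)) / np.sqrt(dim)
    A_hat = normalized_adjacency(adj)
    for _ in range(hops):
        E = A_hat @ E
    return E

# Toy usage on a 6-node path graph: nearby nodes tend to get more similar embeddings.
n = 6
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
Z = stochastic_positional_features(adj, dim=8, hops=2)
print("sim(0,1) =", round(float(Z[0] @ Z[1]), 3), " sim(0,5) =", round(float(Z[0] @ Z[5]), 3))
```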
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
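The disagreement loss mentioned above can be sketched generically: two view-specific predictions (for example, atom-centered and bond-centered message passing) are each fit to the target, with an extra penalty on their divergence. The weighting and squared-error forms below are assumptions for illustration, not MV-GNN's exact objective.

```python
import numpy as np

def multiview_loss(pred_view1, pred_view2, target, lam=0.1):
    """Supervised loss for each view plus a disagreement penalty between the views."""
    task1 = np.mean((pred_view1 - target) ** 2)
    task2 = np.mean((pred_view2 - target) ** 2)
    disagreement = np.mean((pred_view1 - pred_view2) ** 2)
    return task1 + task2 + lam * disagreement

# Toy usage with two noisy "views" of the same property.
rng = np.random.default_rng(0)
target = rng.standard_normal(32)
view1 = target + 0.1 * rng.standard_normal(32)
view2 = target + 0.2 * rng.standard_normal(32)
print("loss:", round(multiview_loss(view1, view2, target), 4))
```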
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
- Global Attention based Graph Convolutional Neural Networks for Improved Materials Property Prediction [8.371766047183739]
We develop GATGNN, a novel graph neural network model for predicting inorganic material properties.
We show that our method is able to both outperform the previous models' predictions and provide insight into the crystallization of the material.
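A global-attention readout of the kind referenced here can be sketched as soft attention over node embeddings before pooling to a single crystal-level representation; the scoring function and dimensions are illustrative stand-ins, not GATGNN's exact formulation.

```python
import numpy as np

def global_attention_pool(node_embed, w_score):
    """Weight each atom's embedding by a softmax attention score, then sum to one
    crystal-level representation vector."""
    scores = node_embed @ w_score            # one scalar score per node
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over nodes
    return weights @ node_embed              # attention-weighted pooling

# Toy usage: 6 atoms with 4-dim embeddings pooled into one material vector.
rng = np.random.default_rng(3)
node_embed = rng.standard_normal((6, 4))
w_score = rng.standard_normal(4)
print(global_attention_pool(node_embed, w_score))
```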
arXiv Detail & Related papers (2020-03-11T07:43:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.