Substitutional Alloying Using Crystal Graph Neural Networks
- URL: http://arxiv.org/abs/2306.10766v1
- Date: Mon, 19 Jun 2023 08:18:17 GMT
- Title: Substitutional Alloying Using Crystal Graph Neural Networks
- Authors: Dario Massa, Daniel Cieśliński, Amirhossein Naghdi and Stefanos Papanikolaou
- Abstract summary: Graph Neural Networks (GNNs) allow for direct learning representations on graphs, such as the ones formed by crystals.
We use CGNNs to predict crystal properties with DFT-level accuracy, through graphs that encode the atomic (node/vertex), bond (edge), and global state attributes.
We perform DFT validation to assess the accuracy in the prediction of formation energies and structural features.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Materials discovery, especially for applications that require extreme
operating conditions, requires extensive testing that naturally limits the
ability to explore the wealth of possible compositions. Machine Learning (ML)
now has a well-established role in facilitating this effort in systematic
ways. The increasing amount of available, accurate DFT data provides a solid
basis on which new ML models can be trained and tested. While conventional
models rely on static descriptors, generally suitable only for a limited class
of systems, the flexibility of Graph Neural Networks (GNNs) allows
representations to be learned directly on graphs, such as those formed by
crystals. We use crystal graph neural networks (CGNNs) to predict crystal
properties with DFT-level accuracy, through graphs that encode the atomic
(node/vertex), bond (edge), and global state attributes. In this work, we test
the ability of the MEGNet CGNN framework to predict a number of properties of
systems unseen by the model, obtained by adding a substitutional defect to
bulk crystals included in the training set. We perform DFT validation to
assess the accuracy of the predicted formation energies and structural
features (such as elastic moduli). Using CGNNs, one may identify promising
paths in alloy discovery.
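As a concrete illustration of this workflow, the sketch below loads a pretrained MEGNet formation-energy model, builds a substitutional defect in a bulk crystal, and compares predictions. It assumes the publicly released megnet and pymatgen Python packages; the model name, input file, and substituted species are illustrative choices, not the paper's exact setup.

```python
# Minimal sketch (not the authors' code): predict the formation energy of a
# bulk crystal and of the same crystal with one substitutional defect, using
# a pretrained MEGNet model. File path and substituted species are assumptions.
from megnet.utils.models import load_model
from pymatgen.core import Structure

# Pretrained formation-energy model distributed with the megnet package.
model = load_model("Eform_MP_2019")

# Bulk crystal from a structure file (hypothetical path).
bulk = Structure.from_file("POSCAR_bulk")

# Substitutional defect: replace the species on site 0 (illustrative choice).
defected = bulk.copy()
defected.replace(0, "Al")

# CGNN predictions, in eV/atom, without running new DFT calculations.
e_bulk = float(model.predict_structure(bulk).ravel()[0])
e_def = float(model.predict_structure(defected).ravel()[0])
print(f"E_form bulk: {e_bulk:.3f} eV/atom, with defect: {e_def:.3f} eV/atom")
```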
Related papers
- chemtrain: Learning Deep Potential Models via Automatic Differentiation and Statistical Physics [0.0]
Neural Networks (NNs) are promising models for improving the accuracy of molecular dynamics simulations.
chemtrain is a framework for learning sophisticated NN potential models through customizable training routines and advanced training algorithms.
arXiv Detail & Related papers (2024-08-28T15:14:58Z)
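The summary above centers on learning potentials via automatic differentiation. The sketch below illustrates that core idea, force matching through autograd, in generic PyTorch; chemtrain itself is a JAX-based package, so none of these names are its API, and the toy pair potential and reference data are assumptions.

```python
# Generic force-matching sketch illustrating NN potentials trained via
# automatic differentiation. This is NOT chemtrain's API; the pair potential,
# data, and training loop are illustrative stand-ins.
import torch
import torch.nn as nn

class PairPotential(nn.Module):
    """Toy NN potential: sum of learned pair terms over interatomic distances."""
    def __init__(self):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

    def forward(self, pos):                        # pos: (n_atoms, 3)
        n = pos.shape[0]
        iu = torch.triu_indices(n, n, offset=1)    # unique atom pairs
        dist = (pos[iu[0]] - pos[iu[1]]).norm(dim=-1, keepdim=True)
        return self.mlp(dist).sum()                # scalar total energy

model = PairPotential()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Hypothetical reference configuration and forces (DFT/MD data in practice).
ref_pos, ref_forces = torch.rand(8, 3), torch.rand(8, 3)

pos = ref_pos.clone().requires_grad_(True)
energy = model(pos)
# Forces are the negative energy gradient -- obtained by autodiff.
forces = -torch.autograd.grad(energy, pos, create_graph=True)[0]
loss = ((forces - ref_forces) ** 2).mean()         # force-matching loss
loss.backward()
opt.step()
```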
- Kalman Filter for Online Classification of Non-Stationary Data [101.26838049872651]
In Online Continual Learning (OCL), a learning system receives a stream of data and sequentially performs prediction and training steps.
We introduce a probabilistic Bayesian online learning model by using a neural representation and a state space model over the linear predictor weights.
In multi-class classification experiments, we demonstrate the predictive ability of the model and its flexibility to capture non-stationarity.
arXiv Detail & Related papers (2023-06-14T11:41:42Z)
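A minimal sketch of the general mechanism follows: a Kalman filter maintains a Gaussian posterior over the weights of a linear predictor as data stream in. This is a regression-flavored stand-in, not the paper's Bayesian classification model, and all dimensions and noise scales are assumptions.

```python
# Sketch of online learning with a Kalman filter over linear predictor
# weights (regression stand-in for the paper's model; all constants assumed).
import numpy as np

d = 5                            # feature dimension (e.g. from a neural encoder)
mu, P = np.zeros(d), np.eye(d)   # Gaussian posterior over the weights
q, r = 1e-3, 0.1                 # process / observation noise variances

def kalman_step(mu, P, x, y):
    """One predict/update cycle for scalar observation y with features x."""
    P = P + q * np.eye(d)        # random-walk drift models non-stationarity
    s = x @ P @ x + r            # innovation variance
    k = P @ x / s                # Kalman gain
    mu = mu + k * (y - x @ mu)   # corrected posterior mean
    P = P - np.outer(k, x) @ P   # corrected posterior covariance
    return mu, P

rng = np.random.default_rng(0)
for t in range(100):             # stream: predict first, then train
    x = rng.normal(size=d)
    y = x @ np.ones(d) + rng.normal(scale=0.1)
    y_hat = x @ mu               # sequential prediction step
    mu, P = kalman_step(mu, P, x, y)
```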
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- CrysGNN: Distilling pre-trained knowledge to enhance property prediction for crystalline materials [25.622724168215097]
This paper presents CrysGNN, a new pre-trained GNN framework for crystalline materials.
It captures both node-level and graph-level structural information of crystal graphs using unlabelled material data.
We conduct extensive experiments showing that, with knowledge distilled from the pre-trained model, all of the SOTA algorithms outperform their vanilla versions by significant margins.
arXiv Detail & Related papers (2023-01-14T08:12:01Z)
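The sketch below shows the generic distillation pattern the summary describes: a downstream model is trained against both labels and the node embeddings of a frozen pre-trained network. The toy message-passing layer, shapes, and loss weights are assumptions, not CrysGNN's implementation.

```python
# Generic knowledge-distillation sketch: a student GNN is fit to a property
# label plus the node embeddings of a frozen pre-trained model. Toy shapes
# and losses, not CrysGNN's actual code.
import torch
import torch.nn as nn

class TinyGNNLayer(nn.Module):
    """Mean-aggregation message passing over a dense adjacency matrix."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, h, adj):                # h: (n, dim), adj: (n, n)
        msg = adj @ h / adj.sum(-1, keepdim=True).clamp(min=1)
        return torch.relu(self.lin(h + msg))

dim, n = 16, 10
teacher = TinyGNNLayer(dim)                   # stands in for pre-trained CrysGNN
student = TinyGNNLayer(dim)
head = nn.Linear(dim, 1)                      # property-prediction head
for p in teacher.parameters():                # teacher stays frozen
    p.requires_grad_(False)

h = torch.randn(n, dim)                       # node features of a crystal graph
adj = (torch.rand(n, n) > 0.5).float()
y = torch.randn(1)                            # hypothetical property label

hs = student(h, adj)
pred = head(hs.mean(0))                       # graph-level readout
with torch.no_grad():
    ht = teacher(h, adj)

# Supervised loss plus a node-embedding distillation term (weight assumed).
loss = (pred - y).pow(2).mean() + 0.1 * (hs - ht).pow(2).mean()
loss.backward()
```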
- Graph Contrastive Learning for Materials [6.667711415870472]
We introduce CrystalCLR, a framework for contrastive learning of representations with crystal graph neural networks.
With the addition of a novel loss function, our framework is able to learn representations competitive with engineered fingerprinting methods.
We also demonstrate that via model finetuning, contrastive pretraining can improve the performance of graph neural networks for prediction of material properties.
arXiv Detail & Related papers (2022-11-24T04:15:47Z)
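As background for the contrastive setup, the sketch below implements a standard NT-Xent loss over graph-level embeddings of two augmented views. CrystalCLR adds its own novel loss on top, which is not reproduced here; the shapes and temperature are assumptions.

```python
# Standard NT-Xent contrastive loss over crystal-graph embeddings of two
# augmented views (generic background for CrystalCLR; shapes and temperature
# are illustrative, and the paper's novel loss term is not reproduced here).
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.1):
    """z1, z2: (batch, dim) embeddings of two augmentations of the same crystals."""
    b = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2]), dim=1)       # (2B, dim), unit norm
    sim = z @ z.T / tau                               # scaled cosine similarities
    sim = sim.masked_fill(torch.eye(2 * b, dtype=torch.bool), float("-inf"))
    # Positive pair for row i is the other view of the same crystal.
    targets = torch.cat([torch.arange(b) + b, torch.arange(b)])
    return F.cross_entropy(sim, targets)

z1 = torch.randn(4, 16)    # embeddings from a crystal GNN, view 1
z2 = torch.randn(4, 16)    # view 2 (different augmentation of the same crystals)
loss = nt_xent(z1, z2)
```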
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose DFAD-GNN, the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data.
Specifically, DFAD-GNN employs a generative adversarial framework with three components: a pre-trained teacher model and a student model act as two discriminators, and a generator is used to derive training graphs that distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
arXiv Detail & Related papers (2022-05-08T08:19:40Z)
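A compact sketch of the adversarial loop described above: the student imitates the frozen teacher on generator-made graphs, while the generator seeks graphs on which they disagree. All architectures here are toy stand-ins for illustration, not DFAD-GNN's networks.

```python
# Toy sketch of data-free adversarial distillation on graphs: the student
# minimizes, and the generator maximizes, teacher-student disagreement on
# synthetic graphs. Architectures are illustrative stand-ins, not DFAD-GNN's.
import torch
import torch.nn as nn
import torch.nn.functional as F

n, dim, n_cls = 8, 16, 3          # nodes per graph, feature dim, classes

class Generator(nn.Module):
    """Maps noise to node features and a soft adjacency matrix."""
    def __init__(self):
        super().__init__()
        self.feat = nn.Linear(32, n * dim)
        self.adj = nn.Linear(32, n * n)

    def forward(self, z):
        x = self.feat(z).view(n, dim)
        a = torch.sigmoid(self.adj(z)).view(n, n)
        return x, a

def classify(lin, x, a):
    """One message-passing step plus mean readout (shared toy architecture)."""
    return lin(torch.relu(a @ x).mean(0))

teacher = nn.Linear(dim, n_cls)   # stands in for the pre-trained teacher
student = nn.Linear(dim, n_cls)
for p in teacher.parameters():
    p.requires_grad_(False)       # teacher is frozen throughout
gen = Generator()
opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)

for step in range(100):
    # Student step: imitate the teacher on generated graphs.
    x, a = gen(torch.randn(32))
    x, a = x.detach(), a.detach()
    loss_s = F.l1_loss(classify(student, x, a), classify(teacher, x, a))
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()

    # Generator step: produce graphs where student and teacher disagree.
    x, a = gen(torch.randn(32))
    loss_g = -F.l1_loss(classify(student, x, a), classify(teacher, x, a))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
```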
- Crystal Twins: Self-supervised Learning for Crystalline Material Property Prediction [8.048439531116367]
We introduce Crystal Twins (CT): an SSL method for crystalline materials property prediction.
We pre-train a Graph Neural Network (GNN) by applying the redundancy reduction principle to the graph latent embeddings of augmented instances.
By sharing the pre-trained weights when fine-tuning the GNN for regression tasks, we significantly improve performance on 7 challenging material property prediction benchmarks.
arXiv Detail & Related papers (2022-05-04T05:08:46Z)
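A short sketch of the redundancy-reduction principle mentioned above, in the style of the Barlow Twins objective applied to graph latent embeddings; dimensions and the off-diagonal weight are assumptions, not Crystal Twins' exact settings.

```python
# Redundancy-reduction (Barlow Twins-style) objective over GNN embeddings of
# two augmented views of the same crystals. Constants are illustrative.
import torch

def redundancy_reduction_loss(z1, z2, lam=5e-3):
    """z1, z2: (batch, dim) graph latent embeddings of two augmentations."""
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)      # standardize per dimension
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    c = z1.T @ z2 / z1.shape[0]                      # cross-correlation matrix
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()   # pull diagonal toward 1
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # decorrelate
    return on_diag + lam * off_diag

z1 = torch.randn(32, 64)     # GNN embeddings of augmented view 1
z2 = torch.randn(32, 64)     # view 2
loss = redundancy_reduction_loss(z1, z2)
```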
- Pretraining Graph Neural Networks for few-shot Analog Circuit Modeling and Design [68.1682448368636]
We present a supervised pretraining approach to learn circuit representations that can be adapted to new unseen topologies or unseen prediction tasks.
To cope with the variable topological structure of different circuits, we describe each circuit as a graph and use graph neural networks (GNNs) to learn node embeddings.
We show that pretraining GNNs to predict output node voltages encourages representations that can be adapted to new unseen topologies or to the prediction of new circuit-level properties.
arXiv Detail & Related papers (2022-03-29T21:18:47Z)
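The recipe the summary describes, pretraining a GNN encoder on node-voltage prediction and then reusing it for circuit-level tasks, is sketched below with toy stand-ins; the encoder, heads, and target property are assumptions, not the paper's models.

```python
# Sketch of pretrain-then-adapt for circuit graphs: a shared GNN encoder is
# pretrained on per-node voltage prediction, then reused with a new head for
# a circuit-level property. Everything here is an illustrative stand-in.
import torch
import torch.nn as nn

dim = 16

class Encoder(nn.Module):
    """One round of message passing over a circuit graph's adjacency matrix."""
    def __init__(self):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):             # x: (n_nodes, dim), adj: (n, n)
        return torch.relu(self.lin(adj @ x + x))

enc = Encoder()
volt_head = nn.Linear(dim, 1)              # pretraining target: node voltages
opt = torch.optim.Adam(list(enc.parameters()) + list(volt_head.parameters()))

# Pretraining step on a hypothetical circuit with simulated node voltages.
x, adj = torch.randn(6, dim), (torch.rand(6, 6) > 0.5).float()
v_ref = torch.randn(6, 1)
loss = (volt_head(enc(x, adj)) - v_ref).pow(2).mean()
opt.zero_grad(); loss.backward(); opt.step()

# Adaptation: reuse the pretrained encoder with a new circuit-level head.
prop_head = nn.Linear(dim, 1)              # e.g. gain or bandwidth (assumed)
pred = prop_head(enc(x, adj).mean(0))      # graph-level readout over nodes
```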
- Meta-learning using privileged information for dynamics [66.32254395574994]
We extend the Neural ODE Process model to use additional information within the Learning Using Privileged Information setting.
We validate our extension with experiments showing improved accuracy and calibration on simulated dynamics tasks.
arXiv Detail & Related papers (2021-04-29T12:18:02Z)
- ForceNet: A Graph Neural Network for Large-Scale Quantum Calculations [86.41674945012369]
We develop ForceNet, a scalable and expressive Graph Neural Network model for approximating atomic forces.
ForceNet predicts atomic forces more accurately than state-of-the-art physics-based GNNs.
arXiv Detail & Related papers (2021-03-02T03:09:06Z)
- A deep learning framework for solution and discovery in solid mechanics [1.4699455652461721]
We present the application of a class of deep learning models, known as Physics-Informed Neural Networks (PINNs), to learning and discovery in solid mechanics.
We explain how to incorporate the momentum balance and elasticity relations into PINNs, and explore in detail the application to linear elasticity.
arXiv Detail & Related papers (2020-02-14T08:24:53Z)
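To make the idea concrete, the sketch below embeds the 2D momentum balance of linear elasticity into a PINN-style loss term. The material constants, network, and collocation points are illustrative, and boundary-condition terms are omitted.

```python
# PINN-style residual for linear elasticity (2D, zero body force): the
# network maps coordinates to displacements, and the momentum balance
# div(sigma) = 0 is enforced as a loss. All constants are illustrative.
import torch
import torch.nn as nn

lam, mu = 1.0, 0.5                         # Lame constants (assumed values)
net = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))  # x -> u(x)

xy = torch.rand(64, 2, requires_grad=True) # collocation points in the domain
u = net(xy)                                # displacement field (u_x, u_y)

def grad(f, x):
    return torch.autograd.grad(f.sum(), x, create_graph=True)[0]

# Strains from displacement gradients: eps_ij = (du_i/dx_j + du_j/dx_i) / 2.
du_x = grad(u[:, 0], xy)                   # (64, 2): du_x/dx, du_x/dy
du_y = grad(u[:, 1], xy)
eps_xx, eps_yy = du_x[:, 0], du_y[:, 1]
eps_xy = 0.5 * (du_x[:, 1] + du_y[:, 0])

# Isotropic linear elasticity: sigma = lam * tr(eps) * I + 2 * mu * eps.
tr = eps_xx + eps_yy
sig_xx = lam * tr + 2 * mu * eps_xx
sig_yy = lam * tr + 2 * mu * eps_yy
sig_xy = 2 * mu * eps_xy

# Momentum balance residual div(sigma) = 0, enforced as a PINN loss term.
r_x = grad(sig_xx, xy)[:, 0] + grad(sig_xy, xy)[:, 1]
r_y = grad(sig_xy, xy)[:, 0] + grad(sig_yy, xy)[:, 1]
pde_loss = (r_x ** 2 + r_y ** 2).mean()    # add BC losses in a full model
```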