Graph Contrastive Learning for Materials
- URL: http://arxiv.org/abs/2211.13408v1
- Date: Thu, 24 Nov 2022 04:15:47 GMT
- Title: Graph Contrastive Learning for Materials
- Authors: Teddy Koker, Keegan Quigley, Will Spaeth, Nathan C. Frey, Lin Li
- Abstract summary: We introduce CrystalCLR, a framework for contrastive learning of representations with crystal graph neural networks.
With the addition of a novel loss function, our framework is able to learn representations competitive with engineered fingerprinting methods.
We also demonstrate that via model finetuning, contrastive pretraining can improve the performance of graph neural networks for prediction of material properties.
- Score: 6.667711415870472
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent work has shown the potential of graph neural networks to efficiently
predict material properties, enabling high-throughput screening of materials.
Training these models, however, often requires large quantities of labelled
data, obtained via costly methods such as ab initio calculations or
experimental evaluation. By leveraging a series of material-specific
transformations, we introduce CrystalCLR, a framework for contrastive learning
of representations with crystal graph neural networks. With the addition of a
novel loss function, our framework is able to learn representations competitive
with engineered fingerprinting methods. We also demonstrate that via model
finetuning, contrastive pretraining can improve the performance of graph neural
networks for prediction of material properties and significantly outperform
traditional ML models that use engineered fingerprints. Lastly, we observe that
CrystalCLR produces material representations that form clusters by compound
class.
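The abstract describes contrastive pretraining over material-specific augmentations of crystal graphs but does not spell out the loss itself. As a hedged sketch of the standard contrastive (NT-Xent) objective such a setup typically uses, not the paper's novel loss, the snippet below computes the loss from embeddings of two augmented views of each crystal; the encoder call and all names are illustrative placeholders.

```python
# Illustrative sketch: standard NT-Xent contrastive loss over embeddings of
# two augmented views of each crystal graph (not CrystalCLR's exact objective).
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """z1, z2: (batch, dim) embeddings of two augmented views of the same crystals."""
    batch = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2B, D), unit-norm rows
    sim = z @ z.t() / temperature                        # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                    # exclude self-similarity
    # For index i, the positive is the other view of the same crystal.
    targets = torch.cat([torch.arange(batch) + batch, torch.arange(batch)])
    return F.cross_entropy(sim, targets.to(z.device))

# Example with random stand-in embeddings:
# z1, z2 = encoder(view1), encoder(view2)  # hypothetical CGNN encoder
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
loss = nt_xent_loss(z1, z2)
```

In this formulation the two views of the same crystal are each other's positives, and every other embedding in the batch acts as a negative.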
Related papers
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Substitutional Alloying Using Crystal Graph Neural Networks [0.0]
Graph Neural Networks (GNNs) allow representations to be learned directly on graphs, such as those formed by crystals.
We use CGNNs to predict crystal properties with DFT-level accuracy, using graphs that encode atomic (node/vertex), bond (edge), and global state attributes.
We perform DFT validation to assess the accuracy in the prediction of formation energies and structural features.
arXiv Detail & Related papers (2023-06-19T08:18:17Z)
- Robust Graph Representation Learning via Predictive Coding [46.22695915912123]
Predictive coding is a message-passing framework initially developed to model information processing in the brain.
In this work, we build models that rely on the message-passing rule of predictive coding.
We show that the proposed models are comparable to standard ones in terms of performance in both inductive and transductive tasks.
arXiv Detail & Related papers (2022-12-09T03:58:22Z)
- NCTV: Neural Clamping Toolkit and Visualization for Neural Network Calibration [66.22668336495175]
Neural networks deployed without consideration of calibration will not gain trust from humans.
We introduce the Neural Clamping Toolkit, the first open-source framework designed to help developers employ state-of-the-art model-agnostic calibrated models.
arXiv Detail & Related papers (2022-11-29T15:03:05Z)
- An Adversarial Active Sampling-based Data Augmentation Framework for Manufacturable Chip Design [55.62660894625669]
Lithography modeling is a crucial problem in chip design to ensure a chip design mask is manufacturable.
Recent developments in machine learning have provided alternative solutions that replace time-consuming lithography simulations with deep neural networks.
We propose a litho-aware data augmentation framework to resolve the dilemma of limited data and improve the machine learning model performance.
arXiv Detail & Related papers (2022-10-27T20:53:39Z)
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose the first end-to-end framework for data-free adversarial knowledge distillation on graph structured data (DFAD-GNN).
Specifically, DFAD-GNN employs a generative adversarial network with three components: a pre-trained teacher model and a student model act as two discriminators, while a generator derives training graphs used to distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
arXiv Detail & Related papers (2022-05-08T08:19:40Z)
- Crystal Twins: Self-supervised Learning for Crystalline Material Property Prediction [8.048439531116367]
We introduce Crystal Twins (CT): an SSL method for crystalline materials property prediction.
We pre-train a Graph Neural Network (GNN) by applying the redundancy reduction principle to the graph latent embeddings of augmented instances (a sketch of this objective appears after this list).
By sharing the pre-trained weights when fine-tuning the GNN for regression tasks, we significantly improve the performance for 7 challenging material property prediction benchmarks.
arXiv Detail & Related papers (2022-05-04T05:08:46Z)
- Scalable deeper graph neural networks for high-performance materials property prediction [1.9129213267332026]
We propose DeeperGATGNN, a novel graph attention neural network model with differentiable group normalization and skip connections.
Our work shows that dealing with the high complexity of mapping crystal material structures to their properties requires large-scale, very deep graph neural networks to achieve robust performance.
arXiv Detail & Related papers (2021-09-25T05:58:04Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Global Attention based Graph Convolutional Neural Networks for Improved Materials Property Prediction [8.371766047183739]
We develop a novel model, GATGNN, for predicting inorganic material properties based on graph neural networks.
We show that our method is able to both outperform the previous models' predictions and provide insight into the crystallization of the material.
arXiv Detail & Related papers (2020-03-11T07:43:14Z)
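The Crystal Twins entry above applies the redundancy reduction principle to the graph latent embeddings of augmented instances. The sketch below shows a Barlow Twins-style version of that objective over batch-standardized GNN embeddings; the hyperparameters and normalization details are assumptions, not that paper's exact implementation.

```python
# Illustrative sketch of a redundancy-reduction (Barlow Twins-style) loss on
# GNN embeddings of two augmented views; not Crystal Twins' exact code.
import torch

def redundancy_reduction_loss(z1: torch.Tensor, z2: torch.Tensor, lambd: float = 5e-3) -> torch.Tensor:
    """z1, z2: (batch, dim) graph-level embeddings of two augmented views."""
    batch, dim = z1.shape
    # Standardize each embedding dimension across the batch.
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    c = (z1.t() @ z2) / batch                                    # (dim, dim) cross-correlation
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()               # push diagonal toward 1
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()  # decorrelate the rest
    return on_diag + lambd * off_diag

z1, z2 = torch.randn(8, 64), torch.randn(8, 64)
loss = redundancy_reduction_loss(z1, z2)
```

The diagonal term pulls the two views of each crystal together dimension by dimension, while the off-diagonal term decorrelates embedding dimensions, which avoids the need for explicit negative pairs.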