Self-supervised learning for crystal property prediction via denoising
- URL: http://arxiv.org/abs/2408.17255v1
- Date: Fri, 30 Aug 2024 12:53:40 GMT
- Title: Self-supervised learning for crystal property prediction via denoising
- Authors: Alexander New, Nam Q. Le, Michael J. Pekala, Christopher D. Stiles
- Abstract summary: We propose a novel self-supervised learning (SSL) strategy for material property prediction.
Our approach, crystal denoising self-supervised learning (CDSSL), pretrains predictive models with a pretext task based on recovering valid material structures.
We demonstrate that CDSSL models outperform models trained without SSL, across material types, properties, and dataset sizes.
- Score: 43.148818844265236
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Accurate prediction of the properties of crystalline materials is crucial for targeted discovery, and this prediction is increasingly done with data-driven models. However, for many properties of interest, the number of materials for which a specific property has been determined is much smaller than the number of known materials. To overcome this disparity, we propose a novel self-supervised learning (SSL) strategy for material property prediction. Our approach, crystal denoising self-supervised learning (CDSSL), pretrains predictive models (e.g., graph networks) with a pretext task based on recovering valid material structures when given perturbed versions of these structures. We demonstrate that CDSSL models outperform models trained without SSL, across material types, properties, and dataset sizes.
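The abstract does not spell out the exact perturbation scheme, but the described pretext task (recover a valid structure from a perturbed version) can be sketched generically. In the minimal sketch below, the function name, the Gaussian noise model, and the choice to predict the noise vector are illustrative assumptions, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_denoising_pair(coords, sigma=0.05):
    """Build a (perturbed, target) pair for a denoising pretext task.

    coords: (N, 3) array of atomic coordinates for one structure.
    Returns the noised coordinates and the noise vector; a model f would be
    pretrained to minimize ||f(noised) - noise||^2, i.e., to recover the
    perturbation and hence the original, valid structure.
    """
    noise = rng.normal(0.0, sigma, size=coords.shape)
    return coords + noise, noise

# Toy "structure": 4 atoms in a unit cell.
coords = rng.random((4, 3))
noised, target = make_denoising_pair(coords)

# Subtracting the (here, known) noise recovers the original structure,
# which is exactly what the pretrained model learns to do implicitly.
assert np.allclose(noised - target, coords)
```

After pretraining on such pairs, the encoder weights would be reused and fine-tuned on the (much smaller) labeled property-prediction dataset.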
Related papers
- Efficient Training of Self-Supervised Speech Foundation Models on a Compute Budget [57.807614181024114]
This paper investigates how to efficiently train speech foundation models with self-supervised learning (SSL) under a limited compute budget.
We examine critical factors in SSL that impact the budget, including model architecture, model size, and data size.
arXiv Detail & Related papers (2024-09-09T10:36:42Z)
- Out-of-distribution materials property prediction using adversarial learning based fine-tuning [0.0]
We propose an adversarial learning-based targeted fine-tuning approach to adapt the model to a particular dataset.
Our experiments demonstrate that the proposed CAL algorithm is highly effective for machine learning with limited samples.
arXiv Detail & Related papers (2024-08-17T21:22:21Z)
- Fine-Tuned Language Models Generate Stable Inorganic Materials as Text [57.01994216693825]
Fine-tuning large language models on text-encoded atomistic data is simple to implement yet reliable.
We show that our strongest model can generate materials predicted to be metastable at about twice the rate of CDVAE.
Because of text prompting's inherent flexibility, our models can also be used for unconditional generation of stable materials.
arXiv Detail & Related papers (2024-02-06T20:35:28Z)
- Materials Informatics Transformer: A Language Model for Interpretable Materials Properties Prediction [6.349503549199403]
We introduce our model Materials Informatics Transformer (MatInFormer) for material property prediction.
Specifically, we introduce a novel approach that involves learning the grammar of crystallography through the tokenization of pertinent space group information.
arXiv Detail & Related papers (2023-08-30T18:34:55Z)
- Self-Supervision for Tackling Unsupervised Anomaly Detection: Pitfalls and Opportunities [50.231837687221685]
Self-supervised learning (SSL) has transformed machine learning and its many real world applications.
Unsupervised anomaly detection (AD) has also capitalized on SSL, by self-generating pseudo-anomalies.
arXiv Detail & Related papers (2023-08-28T07:55:01Z)
- CrysGNN: Distilling pre-trained knowledge to enhance property prediction for crystalline materials [25.622724168215097]
This paper presents CrysGNN, a new pre-trained GNN framework for crystalline materials.
It captures both node and graph level structural information of crystal graphs using unlabelled material data.
We conduct extensive experiments to show that, with knowledge distilled from the pre-trained model, all the SOTA algorithms outperform their vanilla versions by good margins.
arXiv Detail & Related papers (2023-01-14T08:12:01Z)
- Graph Contrastive Learning for Materials [6.667711415870472]
We introduce CrystalCLR, a framework for contrastive learning of representations with crystal graph neural networks.
With the addition of a novel loss function, our framework is able to learn representations competitive with engineered fingerprinting methods.
We also demonstrate that via model finetuning, contrastive pretraining can improve the performance of graph neural networks for prediction of material properties.
arXiv Detail & Related papers (2022-11-24T04:15:47Z)
- The Geometry of Self-supervised Learning Models and its Impact on Transfer Learning [62.601681746034956]
Self-supervised learning (SSL) has emerged as a desirable paradigm in computer vision.
We propose a data-driven geometric strategy to analyze different SSL models using local neighborhoods in the feature space induced by each.
arXiv Detail & Related papers (2022-09-18T18:15:38Z)
- Pre-training via Denoising for Molecular Property Prediction [53.409242538744444]
We describe a pre-training technique that utilizes large datasets of 3D molecular structures at equilibrium.
Inspired by recent advances in noise regularization, our pre-training objective is based on denoising.
arXiv Detail & Related papers (2022-05-31T22:28:34Z)
- Crystal Twins: Self-supervised Learning for Crystalline Material Property Prediction [8.048439531116367]
We introduce Crystal Twins (CT): an SSL method for crystalline materials property prediction.
We pre-train a Graph Neural Network (GNN) by applying the redundancy reduction principle to the graph latent embeddings of augmented instances.
By sharing the pre-trained weights when fine-tuning the GNN for regression tasks, we significantly improve the performance for 7 challenging material property prediction benchmarks.
arXiv Detail & Related papers (2022-05-04T05:08:46Z)
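The "redundancy reduction principle" applied to latent embeddings of augmented views refers to a Barlow-Twins-style objective: the cross-correlation matrix between the two views' embeddings is pushed toward the identity. A hedged NumPy sketch follows; the function name, the normalization details, and the weight `lam` are illustrative assumptions, not the paper's exact loss:

```python
import numpy as np

def redundancy_reduction_loss(z_a, z_b, lam=5e-3):
    """Barlow-Twins-style loss on embeddings of two augmented views.

    z_a, z_b: (batch, dim) latent embeddings. Matching dimensions are pushed
    to correlate (invariance) while distinct dimensions are decorrelated
    (redundancy reduction).
    """
    # Standardize each embedding dimension across the batch.
    z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-8)
    z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-8)
    n = z_a.shape[0]
    c = (z_a.T @ z_b) / n                        # (dim, dim) cross-correlation
    on_diag = ((np.diag(c) - 1.0) ** 2).sum()    # invariance term
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()  # redundancy term
    return on_diag + lam * off_diag

# Identical views yield a near-diagonal correlation matrix and a small loss;
# unrelated views are penalized much more heavily.
rng = np.random.default_rng(0)
z = rng.normal(size=(256, 8))
assert redundancy_reduction_loss(z, z) < redundancy_reduction_loss(z, rng.normal(size=(256, 8)))
```

In the pretraining setup described above, `z_a` and `z_b` would be GNN embeddings of two augmented instances of the same crystal graph, and the pre-trained weights would then be shared when fine-tuning on regression tasks.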
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.