Predicting Oxide Glass Properties with Low Complexity Neural Network and
Physical and Chemical Descriptors
- URL: http://arxiv.org/abs/2210.10507v1
- Date: Wed, 19 Oct 2022 12:23:30 GMT
- Authors: Suresh Bishnoi, Skyler Badge, Jayadeva, and N. M. Anoop Krishnan
- Abstract summary: We present a low complexity neural network (LCNN) that provides improved performance in predicting the properties of oxide glasses.
By training on a large dataset (50000) of glass components, we show the LCNN outperforms state-of-the-art algorithms such as XGBoost.
We demonstrate the universality of the LCNN models by predicting the properties for glasses with new components that were not present in the original training set.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Due to their disordered structure, glasses present a unique challenge in
predicting the composition-property relationships. Recently, several attempts
have been made to predict the glass properties using machine learning
techniques. However, these techniques have two limitations: (i)
predictions are limited to the components present in the original
dataset, and (ii) predictions towards the extreme values of the properties,
which are important regions for new materials discovery, are not very reliable
due to the sparse datapoints in this region. To address these challenges, here we present
a low complexity neural network (LCNN) that provides improved performance in
predicting the properties of oxide glasses. In addition, we combine the LCNN
with physical and chemical descriptors that allow the development of universal
models that can provide predictions for components beyond the training set. By
training on a large dataset (~50000) of glass components, we show the LCNN
outperforms state-of-the-art algorithms such as XGBoost. In addition, we
interpret the LCNN models using Shapley additive explanations (SHAP) to gain insights
into the role played by the descriptors in governing the property. Finally, we
demonstrate the universality of the LCNN models by predicting the properties
for glasses with new components that were not present in the original training
set. Altogether, the present approach provides a promising direction towards
accelerated discovery of novel glass compositions.
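The abstract's key idea, combining per-component physical and chemical descriptors with a deliberately small neural network, can be sketched as follows. This is an illustrative toy, not the paper's model: the descriptor values, oxide set, and network size below are invented for demonstration, and the untrained weights are random.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-oxide descriptors (values illustrative, NOT from the paper):
# [cation electronegativity, cation radius (Angstrom), bond strength (kJ/mol)]
DESCRIPTORS = {
    "SiO2":  np.array([1.90, 0.40, 443.0]),
    "Na2O":  np.array([0.93, 1.02,  84.0]),
    "CaO":   np.array([1.00, 1.00, 134.0]),
    "Al2O3": np.array([1.61, 0.54, 423.0]),
}
OXIDES = list(DESCRIPTORS)

def featurize(composition):
    """Concatenate mole fractions with composition-weighted descriptor
    averages -- one way descriptors let a model generalize to oxides it
    never saw, since a new oxide still maps to known descriptor space."""
    x = np.array([composition.get(o, 0.0) for o in OXIDES])
    desc = sum(x[i] * DESCRIPTORS[o] for i, o in enumerate(OXIDES))
    return np.concatenate([x, desc])

class LowComplexityNN:
    """A deliberately small one-hidden-layer MLP: few parameters keep the
    model in the low-complexity regime the paper argues for."""
    def __init__(self, n_in, n_hidden=4):
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, n_hidden)
        self.b2 = 0.0

    def predict(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        return float(self.W2 @ h + self.b2)

# A soda-lime composition expressed as mole fractions.
comp = {"SiO2": 0.70, "Na2O": 0.15, "CaO": 0.15}
x = featurize(comp)
model = LowComplexityNN(n_in=x.size)
y = model.predict(x)  # untrained, so the value itself is meaningless
```

In the paper the trained model is additionally interpreted with SHAP; a library such as `shap` could be pointed at `model.predict` to attribute the prediction back to the fraction and descriptor features.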
Related papers
- Cross-Modal Learning for Chemistry Property Prediction: Large Language Models Meet Graph Machine Learning [0.0]
We introduce a Multi-Modal Fusion (MMF) framework that harnesses the analytical prowess of Graph Neural Networks (GNNs) and the linguistic generative and predictive abilities of Large Language Models (LLMs).
Our framework combines the effectiveness of GNNs in modeling graph-structured data with the zero-shot and few-shot learning capabilities of LLMs, enabling improved predictions while reducing the risk of overfitting.
arXiv Detail & Related papers (2024-08-27T11:10:39Z)
- Visual Prompting Upgrades Neural Network Sparsification: A Data-Model Perspective [64.04617968947697]
We introduce a novel data-model co-design perspective to promote superior weight sparsity.
Specifically, customized visual prompts are mounted to upgrade neural network sparsification in our proposed VPNs framework.
arXiv Detail & Related papers (2023-12-03T13:50:24Z)
- Substitutional Alloying Using Crystal Graph Neural Networks [0.0]
Graph Neural Networks (GNNs) allow for direct learning representations on graphs, such as the ones formed by crystals.
We use CGNNs to predict crystal properties with DFT level accuracy, through graphs with encoding of the atomic (node/vertex), bond (edge), and global state attributes.
We perform DFT validation to assess the accuracy in the prediction of formation energies and structural features.
arXiv Detail & Related papers (2023-06-19T08:18:17Z)
- Disentangling Structured Components: Towards Adaptive, Interpretable and
Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z)
- Graph Contrastive Learning for Materials [6.667711415870472]
We introduce CrystalCLR, a framework for contrastive learning of representations with crystal graph neural networks.
With the addition of a novel loss function, our framework is able to learn representations competitive with engineered fingerprinting methods.
We also demonstrate that via model finetuning, contrastive pretraining can improve the performance of graph neural networks for prediction of material properties.
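What contrastive pretraining optimizes can be illustrated with a minimal InfoNCE-style loss: two augmented "views" of the same crystal should embed close together, and all other crystals in the batch act as negatives. This is a generic sketch, not CrystalCLR's actual (novel) loss or its graph encoder; the embeddings below are random stand-ins.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Minimal InfoNCE: row i of z1 and row i of z2 are two views of the
    same item; every other row in the batch is a negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature                 # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))        # positives sit on the diagonal

rng = np.random.default_rng(1)
z = rng.normal(size=(8, 16))                         # batch of 8, 16-dim embeddings
# Aligned views (small perturbation) should give a much lower loss
# than unrelated random pairs.
aligned = info_nce_loss(z, z + 0.01 * rng.normal(size=z.shape))
random_pairs = info_nce_loss(z, rng.normal(size=z.shape))
```

Pretraining the encoder to minimize such a loss, then finetuning on labels, is what the summary above refers to as improving property prediction.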
arXiv Detail & Related papers (2022-11-24T04:15:47Z)
- Multi-Task Mixture Density Graph Neural Networks for Predicting Cu-based
Single-Atom Alloy Catalysts for CO2 Reduction Reaction [61.9212585617803]
Graph neural networks (GNNs) have drawn more and more attention from material scientists.
We develop a multi-task (MT) architecture based on DimeNet++ and mixture density networks to improve performance on this task.
arXiv Detail & Related papers (2022-09-15T13:52:15Z)
- On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential to handle time-dependent and event-driven data.
There has been a dearth of comprehensive studies examining the impact of intrinsic structures within spiking computations.
This work delves deep into the intrinsic structures of SNNs, by elucidating their influence on the expressivity of SNNs.
arXiv Detail & Related papers (2022-06-21T09:42:30Z)
- Prediction of the electron density of states for crystalline compounds
with Atomistic Line Graph Neural Networks (ALIGNN) [0.0]
We present an extension of the recently developed Atomistic Line Graph Neural Network (ALIGNN) to accurately predict DOS of a large set of material unit cell structures.
We evaluate two methods of representation of the target quantity - a direct discretized spectrum, and a compressed low-dimensional representation obtained using an autoencoder.
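The two target representations compared above can be contrasted with a small sketch: a spectrum discretized on an energy grid versus a compressed low-dimensional code. The paper uses a learned autoencoder; here PCA serves as an illustrative linear stand-in, and the "DOS" spectra are synthetic Gaussian-peak mixtures rather than real materials data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "DOS" spectra: sums of Gaussian peaks on a 128-point energy grid.
energies = np.linspace(-5.0, 5.0, 128)

def synth_dos():
    centers = rng.uniform(-4.0, 4.0, size=3)
    return sum(np.exp(-(energies - c) ** 2) for c in centers)

spectra = np.stack([synth_dos() for _ in range(200)])  # direct discretized targets

# Compressed low-dimensional targets via PCA -- a linear stand-in for the
# autoencoder used in the paper.
mean = spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
k = 8
codes = (spectra - mean) @ Vt[:k].T    # 128-dim spectrum -> 8-dim code
recon = codes @ Vt[:k] + mean          # decode back onto the energy grid

# How much of the spectra the 8-dim codes preserve:
rel_err = np.linalg.norm(recon - spectra) / np.linalg.norm(spectra)
```

A property-prediction model can then regress the 8 code values instead of all 128 grid points, at the cost of whatever detail the compression discards.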
arXiv Detail & Related papers (2022-01-20T18:28:22Z)
- BScNets: Block Simplicial Complex Neural Networks [79.81654213581977]
Simplicial neural networks (SNN) have recently emerged as the newest direction in graph learning.
We present Block Simplicial Complex Neural Networks (BScNets) model for link prediction.
BScNets outperforms state-of-the-art models by a significant margin while maintaining low costs.
arXiv Detail & Related papers (2021-12-13T17:35:54Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive
Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Orbital Graph Convolutional Neural Network for Material Property
Prediction [0.0]
We propose the Orbital Graph Convolutional Neural Network (OGCNN), a crystal graph convolutional neural network framework.
OGCNN includes atomic orbital interaction features that learn material properties in a robust way.
We examined the performance of this model on a broad range of crystalline material data to predict different properties.
arXiv Detail & Related papers (2020-08-14T15:22:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.